You need supercomputers and super fast disks. I'm stuck with a deduped disk: it takes hours to retrieve certain files, and it sometimes freezes.
After reading your comment, I went through all 4000+ files and accessed each one, and every one was instant, as if it weren't deduped at all. You must have an 8086 or something if a supercomputer is needed.
Sounds like a dying hard drive or data corruption, maybe. I use dedup on my games SSD, of all things. Almost no measurable loading-time hit or stuttering.

Code:
PS C:\Windows\system32> Get-DedupStatus

FreeSpace SavedSpace OptimizedFiles InPolicyFiles Volume
--------- ---------- -------------- ------------- ------
30.68 GB  49.35 GB   33494          33494         A:

Only a 25% savings, but more than enough to make me happy. I had to unoptimize my storage drive moving from 8.1 to 10, so once I'm confident in dedup on 10, that drive will be optimized again.
Use FolderSizes, run it as Admin, and look at System Volume Information; you will see what happens with deduped files. Past a certain limit (I'm not sure why or when in my case), the problem arises.
Sorry for my ignorance... I've never used or seen this before. Is there any way to install/adapt this feature on a pt-BR Windows 10 Pro? What's required? Thanks for posting, moderate, and for any answers.
Thanks abbodi1406 for the answer! Buuuut, the ignorance goes on and on... I installed it (like a charm, btw), but then, what's next? Is there a command I should run to get the job done? How do I actually save the space? I tried the one I saw in a post, "Get-DedupStatus", and it returned... nothing. Is there a user manual?
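Not the OP, but for anyone else stuck at the same point: here's a minimal sketch of the usual cmdlet sequence, assuming the dedup package installed correctly and that D: is the data volume you want to optimize (Get-DedupStatus returns nothing until dedup is enabled on at least one volume). Run from an elevated PowerShell:

```powershell
PS> Enable-DedupVolume -Volume D:                  # enable dedup on the volume
PS> Start-DedupJob -Volume D: -Type Optimization   # kick off an optimization job now
PS> Get-DedupJob                                   # watch the job's progress
PS> Get-DedupStatus                                # savings show up here once it finishes
```

By default, only files older than a few days are in policy, so a freshly filled volume may show little savings until the files age in or you adjust the policy with Set-DedupVolume.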
Thank you EFA! Just two more questions... This beauty can't be used on the system drive? I tried running it there and it said it couldn't run, something about NTFS, blah blah... then I ran it on my secondary HDD, which worked fine. I'm just worried about something I read in a previous post... if MS releases a new rev. of Windows 10, will my data be unavailable until a new deduplication package is released? :O
Yes, so far that is how it's been. Data will be unusable until MDL releases an update for that, and that depends on the Server 2016 stuff being around to do it, since that is where the files come from. Truly, IMO the best method would be running Server 2012 R2, at the very least as a VM, to attach to your deduped volume. To safeguard, I keep an image of a current working dedup setup just in case an update breaks it. Then I can revert and take care of my data.
This may be useful for some here. SDFS 3.0.1 was released yesterday and is pretty usable now on Windows. It's inline (in contrast to Windows' built-in one, which is offline/post-process), and it managed to get 153 GB of ISOs (almost all Windows 10 RTM & 1511, Dutch & English) down to 28.1 GB. I find it more usable than Windows dedup because SDFS is mounted as a separate volume, and its chunkstore/hashdb is stored in a normal directory on a hard disk of your choice. This enables easy backups, as you just need to back up that directory, and if you use a directory sync tool like me, it will only back up the changed and new chunks. This is way easier than the Windows way of storing things in System Volume Information, IMO. Also, you can just take that directory plus an XML config file and mount the volume on any machine with SDFS installed.
Guys, can you also extract the failover clustering components? Specifically, I need the svhdxflt filter driver (part of Cluster Shared Volumes) to be able to use Shared VHD on Windows 10. Thanks!
If you use it, make sure to pass --hash-type=VARIABLE_MURMUR3 when creating a volume if you want variable-size blocks (you usually want that unless you're deduping VMs or backups); otherwise you'll be stuck with 4K fixed blocks and get a very bad dedup ratio. This is supposed to be the default setting, but if you don't pass the parameter explicitly and let the default do its thing, you end up with wrong values in the config.
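For reference, a hedged example of what that looks like at volume-creation time. The volume name and 256GB capacity here are just placeholders; the only flag taken from the post above is --hash-type=VARIABLE_MURMUR3:

```shell
mkfs.sdfs --volume-name=pool0 --volume-capacity=256GB --hash-type=VARIABLE_MURMUR3
```

Check the generated XML config afterwards to confirm the hash type actually stuck, given the default-value issue described above.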
Code:
PS C:\Temp\Dedup10586> DDPEVAL.EXE D:
Data Deduplication Savings Evaluation Tool
Copyright (c) 2013 Microsoft Corporation. All Rights Reserved.

Evaluated Target OS: Windows 10.0
Evaluated folder: D:
Evaluated folder size: 1.84 TB
Files in evaluated folder: 75694
Processed files: 48485
Processed files size: 1.84 TB
Optimized files size: 1.43 TB
Space savings: 423.47 GB
Space savings percent: 22
Optimized files size (no compression): 1.46 TB
Space savings (no compression): 386.71 GB
Space savings percent (no compression): 20
Files excluded by policy: 27209
    Small files (<32KB): 27209

- Is there a way to do compression & dedup?
- Why is DDPEVAL.EXE saying "no compression"? I don't see an option to test with compression.

Thanks
Dedup does compression itself. Compression on top of compression would just add overhead. AFAIK you cannot use NTFS compression ("compress this drive") and dedup at the same time; they do not work together.
How come we can access deduplicated volumes normally from Windows 7 Ultimate? That's why I like it a lot: amazing savings. BTW, this guide should definitely get a sticky!
Unless I'm missing some novelties, you can access the deduplicated volumes, but not the deduplicated files, from Win7/Server 2008 R2.