If all files in a folder have dupes in another folder, the output gets very verbose and the relationship between the folders isn't clear at a glance. It would be very helpful if fddf could summarize that as folder dupes (or folder subsets).
The primary use case for me is figuring out which files I can/should delete. If I could decide at the level of folders, that would reduce the time it takes to sort through all the dupes. A minimal sketch of the idea follows below.
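For illustration, here's a minimal sketch (not fddf's actual code) of how such a summary could be derived from the duplicate groups fddf already computes, assuming each group is a list of paths with identical content and a per-directory file count is available from the scan. The names `folder_dupe_summary` and `file_counts` are hypothetical:

```rust
use std::collections::{HashMap, HashSet};
use std::path::PathBuf;

/// For each ordered directory pair (a, b), collect the files in `a` that
/// have a duplicate in `b`; if that covers every file in `a`, then `a` is
/// entirely duplicated in `b` and can be reported as a single folder dupe.
fn folder_dupe_summary(groups: &[Vec<PathBuf>], file_counts: &HashMap<PathBuf, usize>) {
    let mut pair_hits: HashMap<(PathBuf, PathBuf), HashSet<PathBuf>> = HashMap::new();
    for group in groups {
        for f in group {
            for g in group {
                if f == g {
                    continue;
                }
                if let (Some(da), Some(db)) = (f.parent(), g.parent()) {
                    if da != db {
                        pair_hits
                            .entry((da.to_path_buf(), db.to_path_buf()))
                            .or_default()
                            .insert(f.clone());
                    }
                }
            }
        }
    }
    for ((a, b), dup_files) in &pair_hits {
        if let Some(&total) = file_counts.get(a) {
            if dup_files.len() == total {
                println!("{} is entirely duplicated in {}", a.display(), b.display());
            }
        }
    }
}

fn main() {
    // Hypothetical example: both files in dir_a also exist in dir_b.
    let groups = vec![
        vec![PathBuf::from("dir_a/x.txt"), PathBuf::from("dir_b/x.txt")],
        vec![PathBuf::from("dir_a/y.txt"), PathBuf::from("dir_b/y.txt")],
    ];
    let mut file_counts = HashMap::new();
    file_counts.insert(PathBuf::from("dir_a"), 2usize);
    file_counts.insert(PathBuf::from("dir_b"), 2usize);
    folder_dupe_summary(&groups, &file_counts);
}
```

Detecting proper subsets (a folder whose files are all duplicated in a larger folder) would just relax the equality check; the aggregation step stays the same.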
Btw, here's a result I got: it took 12 minutes and consumed 70 MB of RAM on Win 8.1 64-bit. Most files in that folder are small (<100 KB, and the larger ones aren't much larger):
Overall results:
16963 groups of duplicate files
32744 files are duplicates
1.2 GiB of space taken by duplicates