I have a folder containing roughly 800 files that are duplicates of files in an older folder, but with newer timestamps. Both folders also contain many other music files that I must not delete.
Folder X: complete music collection
Folder Y: old music (file dates newer than the copies in X) + new music
The newer dates confuse the typical fdupes routine, and I'd rather not perform 800+ manual deletions in a GUI to fix it.
Is there any way to automatically remove the duplicate files from folder Y?
I’ve been using rdfind (in the AUR) to delete duplicate files (matched by content, not by name) for years. It is very fast, especially with large files, because it eliminates candidates in stages before doing any checksum analysis: first it discards every file whose size is unique, then it drops same-sized files whose first or last bytes differ, and only then does it checksum the remaining files that have identical sizes and identical start/end bytes.
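To see why the first stage alone removes most of the work, here is a sketch of the size pre-filter using plain coreutils. This is not rdfind itself, just an illustration on a throwaway directory with made-up file names; only files whose size occurs more than once would ever need checksumming:

```shell
demo=$(mktemp -d)
printf 'aaaa' > "$demo/song1.mp3"   # 4 bytes
printf 'bbbb' > "$demo/song2.mp3"   # 4 bytes: same size, stays a candidate
printf 'cc'   > "$demo/song3.mp3"   # 2 bytes: unique size, eliminated at once

# List sizes that occur more than once, then the files having those sizes;
# only these survivors would advance to the byte/checksum stages.
candidates=$(find "$demo" -type f -printf '%s\n' | sort -n | uniq -d |
  while read -r s; do find "$demo" -type f -size "${s}c"; done)
echo "$candidates"
```

Here `song3.mp3` is dropped without ever being read, which is where the speed comes from when most files in two music folders are not duplicates of each other.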
A command to delete the duplicate files from the newer folder would look something like this:
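rdfind ranks files by the order their directories appear on the command line: copies found under the first directory are kept, and later copies are treated as the duplicates. So list X (keep) before Y (clean). The demo below runs it on throwaway directories with made-up file names so you can see the behavior safely before pointing it at your real folders; it assumes rdfind is installed (the guard just skips the demo if it is not):

```shell
# Skip cleanly if rdfind is not installed on this machine.
command -v rdfind >/dev/null 2>&1 || { echo 'rdfind not installed'; exit 0; }

X=$(mktemp -d)   # stands in for folder X, the collection to keep
Y=$(mktemp -d)   # stands in for folder Y, the newer folder to clean
echo 'same bytes' > "$X/track.mp3"
echo 'same bytes' > "$Y/track.mp3"   # duplicate of the copy in X
echo 'brand new'  > "$Y/new.mp3"     # unique file, must survive

# X is listed first, so its copies are kept; Y's duplicates are deleted.
rdfind -deleteduplicates true "$X" "$Y" >/dev/null
ls "$Y"   # only new.mp3 is left
```

On your real folders it is just `rdfind -deleteduplicates true /path/to/X /path/to/Y`. You can run it with `-dryrun true` first: nothing is deleted, and rdfind writes a `results.txt` report listing exactly which files it considers duplicates.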