Need to find duplicate file names without scanning files

The Lurker

I have a Seafile server. As a result of a few operations it has created duplicate files, and the only difference is that a (1) was added to each duplicate's name. I need to search for these duplicate files quickly. I can have the Seafile library connected to the computer in two ways: the Windows cloud drive API or a mapped network drive. However, I need to find the duplicates by file name only, without scanning the file content. If the software scans the content, it automatically downloads the files to the desktop, and that would be 100 GB of unnecessary transfer.

That is why I need a way to look at filenames only. I cannot find a piece of software that will do that. They all want to do some content analysis.

I'm hoping you guys have some suggestions, because right now I'm almost thinking of extracting a file list, identifying the dupes in Excel, and then using that list to do a delete.
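
For the file-list idea, here is a rough sketch (assuming the library is mounted as a mapped network drive; Z: below is just a placeholder drive letter) that dumps folder and file name columns to a CSV you could sort in Excel. It only reads directory listings, so nothing should be downloaded:

```
# Sketch only: dump every folder/filename on the mapped drive to a CSV
# for sorting in Excel. os.walk reads directory entries, not file
# content, so nothing should be pulled down on a mapped network drive.
import csv
import os
import sys

root = sys.argv[1] if len(sys.argv) > 1 else "Z:\\"   # placeholder drive letter

with open("filelist.csv", "w", newline="", encoding="utf-8") as out:
    writer = csv.writer(out)
    writer.writerow(["folder", "filename"])
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            writer.writerow([dirpath, name])
```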
 
Digital Volcano has a product with lots of options for detecting duplicates. I use the paid version but they also have a free version.
 
If it's not comparing the actual files, how does it determine they're actually duplicates? Purely based on the name? Are the duplicates always in the same directory?

I would think you could pretty easily write a little Python script to handle this, if you can mount the file server as a volume.
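
Something along these lines would be a minimal sketch of that script (untested; it assumes the duplicates look like "name (1).ext" sitting next to "name.ext" in the same folder, and that the library is mapped at a placeholder drive letter Z:). It compares names only and prints candidates rather than deleting anything:

```
# Sketch: walk the mapped drive and report files named "something (1).ext"
# that have a matching "something.ext" in the same folder. Only directory
# listings are read, so no file content is transferred on a mapped drive.
import os
import re
import sys

root = sys.argv[1] if len(sys.argv) > 1 else "Z:\\"   # placeholder drive letter

# "report (1).pdf" -> base "report", ext ".pdf"; the extension is optional
dupe_pattern = re.compile(r"^(?P<base>.*) \(1\)(?P<ext>\.[^.]*)?$")

for dirpath, _, filenames in os.walk(root):
    names = set(filenames)
    for name in filenames:
        m = dupe_pattern.match(name)
        if not m:
            continue
        original = m.group("base") + (m.group("ext") or "")
        if original in names:
            print(os.path.join(dirpath, name))
```

Redirecting the output to a text file gives a list to review before deleting anything.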
 
Most of the apps have options to check only by file name, and then you can set a threshold for exact or close.
 
Duplicates are all in the same directory. I can't compare actual files because that starts downloading them. So far I have tried DupeGuru and Auslogics, and both download the files to compare them.

I can mount the drive as a Windows mapped drive, but I don't know Python.

Windows search? Just put (1) in the search bar at the top level directory
This is what I did, and I found a substantial number, but I am also trying to avoid false positives. That's why I wanted to do a more robust search of the filenames. Spot checking by hand, just pulling out all the files with "(1)" appears to be accurate, but you never really know for sure.
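
One way to tighten that up without opening the files, sketched below under the same assumptions (mapped drive at a placeholder Z:, duplicates named "name (1).ext"): check that each "(1)" file actually has an original with the same base name in the same folder, and that the two sizes from the directory metadata match. Anything that fails either check is worth a manual look:

```
# Sketch: flag likely false positives among "(1)" files by checking that an
# original with the same base name exists and that the sizes match.
# Sizes come from directory metadata (os.path.getsize), not file content.
import os
import re
import sys

root = sys.argv[1] if len(sys.argv) > 1 else "Z:\\"   # placeholder drive letter
dupe_pattern = re.compile(r"^(.*) \(1\)(\.[^.]*)?$")

for dirpath, _, filenames in os.walk(root):
    for name in filenames:
        m = dupe_pattern.match(name)
        if not m:
            continue
        original = m.group(1) + (m.group(2) or "")
        dupe_path = os.path.join(dirpath, name)
        orig_path = os.path.join(dirpath, original)
        if not os.path.exists(orig_path):
            print("no original:", dupe_path)        # probable false positive
        elif os.path.getsize(orig_path) != os.path.getsize(dupe_path):
            print("size differs:", dupe_path)       # review before deleting
```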

Thank you for the link with all those tools. I found and tried Auslogics and DupeGuru, incidentally, and those two do not have a filename-only option. They just start downloading the files immediately as they begin the compare. I will, however, try AllDup, since that one, right in the screenshot, has a filename option.
 
AllDup seems to be exactly what I need: it works on file names only, and it doesn't sync the files to the computer. I just need to figure out how to configure it correctly now.
 