md5hash
Re: md5hash
Good update!
About the hash comparison: isn't a feature like this already included?
About the warning message: you have to decide based on the variable that actually increases processing time. If file size is more relevant, you can add a report limit on it. If instead the number of files is more relevant, report that. Or finally, a combination of the two.
Re: md5hash
I have made a small test:
570 jpg files (0.7 GB) - 1m 45s
286 mp3 files (1.57 GB) - 1m 25s
2 avi files (1.37 GB) - 0m 45s
As you can see, the number of files matters more than the file size. So you could implement warning messages like these:
1. "You are trying to check more than 600 files at once. This could take a long time. Do you want to proceed?"
2. "You are trying to check more than 3 GB of files at once. This could take a long time. Do you want to proceed?"
You could also add an option to disable these warning messages and allow the listview to accept more than 100 files.
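The suggested pre-hash check can be sketched as follows. This is a hypothetical illustration, not md5hash's actual AutoIt code; the thresholds come from the two example messages above, and the function name is invented.

```python
MAX_FILES = 600
MAX_BYTES = 3 * 1024**3  # 3 GB

def needs_warning(file_sizes):
    """Given a list of file sizes (bytes), return a warning message if the
    job exceeds either threshold, or None if it is safe to proceed."""
    if len(file_sizes) > MAX_FILES:
        return (f"You are trying to check more than {MAX_FILES} files at once. "
                "This could take a long time. Do you want to proceed?")
    if sum(file_sizes) > MAX_BYTES:
        return ("You are trying to check more than 3 GB of files at once. "
                "This could take a long time. Do you want to proceed?")
    return None
```

Checking the file count first matches the test result above: count, not size, dominates the hashing time.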
Re: md5hash
Lupo73 wrote: About the hash comparison: isn't a feature like this already included?
Yes, but only if you drop two files at the same time. If you have to grab two files from different directories (two operations), the comparison is not performed.
Re: md5hash
I did some tests, and the search and file size calculation is negligible. This was a run for my Program Files directory:
The first time I ran it, the search took about 12 seconds; subsequent runs take about 3. That aside, the hit from these calculations for a normal group of files is insignificant, so it's reasonable to check both the number of files and the total data size.
Code:
search time: 3.47256889418846 s
file count: 59976
size time: 6.47635427900614 s
total size: 7976.30958080292 MB
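A rough Python equivalent of the timing test above: walk a directory tree, count the files, and total their sizes, timing each phase separately. The original test used md5hash's own AutoIt search function, so this is only an approximation for illustration.

```python
import os
import time

def timed_scan(root):
    """Time the file search and the size calculation for a directory tree,
    printing results in the same shape as the test output above."""
    t0 = time.perf_counter()
    paths = [os.path.join(dirpath, name)
             for dirpath, _, names in os.walk(root)
             for name in names]
    search_time = time.perf_counter() - t0

    t0 = time.perf_counter()
    total = 0
    for p in paths:
        try:
            total += os.path.getsize(p)
        except OSError:
            pass  # skip files that vanish or are inaccessible
    size_time = time.perf_counter() - t0

    print(f"search time: {search_time:.2f} s")
    print(f"file count: {len(paths)}")
    print(f"size time: {size_time:.2f} s")
    print(f"total size: {total / 1024**2:.2f} MB")
    return len(paths), total
```

Note the first-run vs. second-run difference reported in the thread is the OS filesystem cache warming up, which a script like this will reproduce.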
Lupo73 wrote: You could also add an option to disable these warning messages and enable listview to receive more than 100 files.
You can already do this; it just isn't available from the GUI at present, it is an INI-only setting (see the readme). I will add it. I'll also increase the default to 500, since the update with folders can dramatically increase the number of files hashed at once.
Re: md5hash
Search and calc for C:\Windows:
First run:
Code:
search time: 12.5311912383062 s
file count: 51316
size time: 6.72805157386643 s
total size: 11449.3711023331 MB
Second run:

Code:
search time: 5.07242797390758 s
file count: 51316
size time: 6.82758760862348 s
total size: 11449.3711023331 MB
Re: md5hash
wraithdu wrote: You can already do this, it is just not available from the GUI at present, it is an INI only setting (see the readme). I will add it.
You could also consider creating an options window.
What did you use for the tests? Since the second search & calc results are similar, the difference could be linked to memory allocation, array creation, or something like that.
Anyway, I think the best solution is to add both checks.
Re: md5hash
I tested with a small script that uses the same search function as md5hash; no hashing was involved, just the search and the file size calculation. I'm going to check for both >500 files and >1 GB of data.
Nope, no new options GUI. It's all living in the menu.
Re: md5hash
v1.0.3.5 is now available. Changes:
- Fixed open handles in hash.dll
- Added recursion into folders, configurable depth setting
- Added optional warning for large hashing jobs (>500 files or >1 GB of data)
- Added command to manually compare last two hashed files
Could someone please update the posting? Thanks!
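Two of the new features can be sketched in Python for illustration: depth-limited recursion into folders (where, per a later post, -1 disables the limit) and comparing the last two hashed files. All names here are invented; this is not md5hash's actual AutoIt implementation.

```python
import hashlib
import os

def collect_files(root, max_depth=-1, _depth=0):
    """Recursively list files under root, descending at most max_depth
    levels into subfolders; max_depth = -1 means no limit."""
    files = []
    for entry in os.scandir(root):
        if entry.is_file():
            files.append(entry.path)
        elif entry.is_dir() and (max_depth < 0 or _depth < max_depth):
            files += collect_files(entry.path, max_depth, _depth + 1)
    return files

def md5_of(path):
    """MD5 a file in chunks so large files don't need to fit in memory."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def compare_last_two(hashed):
    """hashed: list of (path, digest) pairs in hashing order.
    Returns True/False for the last two, or None if fewer than two."""
    if len(hashed) < 2:
        return None
    (_, a), (_, b) = hashed[-2:]
    return a == b
```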
Re: md5hash
The database is updated to v1.0.3.5.
In the What's New section, I did not delete the previous changelog. *Just add the new changelog above the old one* and then add the date, so that users can see how actively the program is being developed.
I hope everyone else can follow this format when updating md5hash.
EDIT:
*request by wraithdu
Re: md5hash
Very good! Is it possible to set no limit on Folder Search Depth? For example by setting -1 (otherwise you could add something similar, or add a checkbox in its window). [update: I saw that I can set a big limit, like 1000, so it isn't so important]
You could possibly add a "Check for Duplicates" option, with the limitation that it only runs for 100-200 files, and a message reporting that when the user activates the feature. And you could add a light-red background to duplicate listview items (to find them easily). In this case I think the best solution is to report results ordered by hash, so duplicates end up next to each other.
Thanks for everything! Now I really like it
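The duplicate-check idea suggested here amounts to grouping files by digest and flagging any digest shared by more than one file. A minimal sketch (purely illustrative of the suggestion; the author declines this feature in the next post, and md5hash does not implement it):

```python
from collections import defaultdict

def find_duplicates(items):
    """items: iterable of (path, md5_digest) pairs.
    Returns {digest: [paths]} for any digest seen more than once."""
    groups = defaultdict(list)
    for path, digest in items:
        groups[digest].append(path)
    return {d: paths for d, paths in groups.items() if len(paths) > 1}
```

Sorting the listview by hash, as suggested, achieves the same grouping visually without any extra bookkeeping.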
Re: md5hash
Actually -1 for a folder depth will disable the limit, I just forgot to document it. Oops. I'll update the input dialog.
Sorry though, I'm not doing duplicate checking of any sort. That's not in the scope of this app. As I mentioned before, there are plenty of duplicate file finders out there if that's what a user is interested in.
Last edited by wraithdu on Fri Jan 15, 2010 12:19 pm, edited 1 time in total.
Re: md5hash
Ok, the package has been updated (no version change).
Re: md5hash
hi
the link is gone, the website is no longer working
is there a working link?
thanks
Re: md5hash
giulia wrote: ↑ the link is gone, the website is no longer working
is there a working link?
You could try https://web.archive.org/web/20141120044 ... .php?id=12
Re: md5hash
The link at the top of the portablefreeware site (first post) is working.