Duplicate Files Search Options

By default, DupScout searches for duplicate files using generic settings, which should be appropriate for most users. In addition, advanced users are provided with a number of configuration options allowing one to customize the duplicate detection process for their specific needs.

In order to customize the duplicate files search process, open the duplicate files search options dialog and select the 'General' tab. The 'General' options tab allows one to control the default report title, the type of hash signature used to detect duplicate files and the maximum number of duplicate file sets to report, and provides the ability to enable processing and display of duplicate file user names.

  • Report Title - this parameter sets the default report title for HTML, PDF, Excel CSV, text and XML reports. This option is especially useful for automatically generated reports, which are saved without user intervention.
  • Signature Type - this parameter sets the type of the hash signature algorithm used to detect duplicate files and can be set to one of the following values: MD5, SHA1 or SHA256. The SHA256 algorithm is the most reliable and is used by default. The MD5 and SHA1 algorithms are significantly faster, but less reliable.
  • Max Dup File Sets - this parameter controls the maximum number of duplicate file sets displayed in the duplicate files results list. After finishing the search process, DupScout will sort all the detected duplicate file sets by the amount of duplicate disk space and display the top X duplicate file sets as specified by this parameter (default is 10,000).
  • Process User Names - this option enables processing and display of duplicate file user names. By default, processing and display of user names is disabled due to performance considerations. Select this option to enable categorization and filtering of detected duplicate files by user name, and to display the amount of duplicate disk space and the number of duplicate files per user. To mitigate potential performance degradation when searching for duplicate files over the network with this option enabled, it is highly recommended to configure the duplicate files search operation to use at least 4 parallel processing threads.
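
The hash-based detection and top-X ranking described above can be illustrated with a short sketch. Note that this is not DupScout's actual implementation (which is not public); it is a minimal Python example of the same general technique, assuming files with identical hash signatures are duplicates. The function names and the size-times-copies measure of duplicate disk space are illustrative assumptions.

```python
import hashlib
import os
from collections import defaultdict

def file_hash(path, algorithm="sha256", chunk_size=65536):
    # Compute the hash signature of a file, reading it in chunks
    # so that large files do not need to fit in memory.
    # 'algorithm' may be "md5", "sha1" or "sha256", mirroring the
    # Signature Type option described above.
    h = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def find_duplicate_sets(root, algorithm="sha256", max_sets=10000):
    # Group all files under 'root' by hash signature, then keep only
    # groups with more than one file (duplicate sets), ranked by the
    # amount of duplicate disk space they waste, and return the top
    # 'max_sets' of them (mirroring the Max Dup File Sets option).
    by_signature = defaultdict(list)
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                by_signature[file_hash(path, algorithm)].append(path)
            except OSError:
                continue  # skip unreadable files

    sets = []
    for signature, paths in by_signature.items():
        if len(paths) > 1:
            size = os.path.getsize(paths[0])
            wasted = size * (len(paths) - 1)  # reclaimable space
            sets.append((wasted, signature, paths))

    sets.sort(key=lambda s: s[0], reverse=True)
    return sets[:max_sets]
```

A real tool would typically pre-group files by size before hashing, since files of different sizes cannot be duplicates; the sketch omits that optimization for brevity.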