File: options.conf

.strict_options
.include main_opt.h

EXCLUDE_PATH=O:x,exclude-path:PATH::ignore duplicates under this path
PATH=O:p,path:PATH:opt_add_path:path where scanning will start
CUT=O:c,cut:PATHSEG::remove 'PATHSEG' from report paths
HLUQ=O:I,hardlink-is-unique:::ignore hard links as duplicates
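The named definitions above appear to be macros: each `$$$NAME$$$` placeholder in the sections below seems to expand to the shared option line defined here, so common options need only be written once. A minimal illustrative sketch of that expansion (the function and dict names are assumptions for clarity, not part of dupd's actual generator):

```python
# Shared option definitions, copied verbatim from this file's header block.
MACROS = {
    "PATH": "O:p,path:PATH:opt_add_path:path where scanning will start",
    "CUT": "O:c,cut:PATHSEG::remove 'PATHSEG' from report paths",
}

def expand(line, macros=MACROS):
    """Replace a $$$NAME$$$ placeholder with its shared option line."""
    if line.startswith("$$$") and line.endswith("$$$"):
        return macros[line.strip("$")]
    return line

print(expand("$$$PATH$$$"))
# -> O:p,path:PATH:opt_add_path:path where scanning will start
```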

[scan] scan starting from the given path
$$$PATH$$$
H:,nodb:::deprecated
O:,stats-file:FILE::save stats to this file
O:m,minsize:SIZE::min size of files to scan
O:,hidden:::include hidden files and dirs in scan
H:D,hdd:::deprecated
H:S,ssd:::deprecated
O:,buflimit:NAME::read buffer size cap
O:X,one-file-system:::for each path, stay in that filesystem
$$$HLUQ$$$
H:,file-count:NUM::max estimated number of files to scan
H:,no-thread-scan:::do scan phase in a single thread
H:,pathsep:CHAR::change internal path separator to CHAR
H:,firstblocks:N::max blocks to read in first hash pass
H:,firstblocksize:N::size of firstblocks to read
H:,intblocks:N::blocks to read in intermediate hash
H:,blocksize:N::size of regular blocks to read
H:,fileblocksize:N::size of blocks to read in file compare
H:,skip-two:::deprecated
H:,skip-three:::deprecated
H:,cmp-two:::force direct comparison of two files
H:,cmp-three:::deprecated
H:,uniques:::deprecated
H:,avg-size:::deprecated
H:,no-thread-hash:::obsoleted
H:,sort-by:NAME::testing
H:,x-nofie:::testing

[refresh] remove deleted files from the database

[report] show duplicate report from last scan
$$$CUT$$$
O:m,minsize:SIZE::min size of total duplicated space to report
O:,format:NAME::report output format (text, csv, json)

[file] based on report, check for duplicates of one file
O:f,file:PATH::check this file
$$$CUT$$$
$$$EXCLUDE_PATH$$$
$$$HLUQ$$$

[uniques] based on report, look for unique files
$$$PATH$$$
$$$CUT$$$
$$$EXCLUDE_PATH$$$
$$$HLUQ$$$

[dups] based on report, look for duplicate files
$$$PATH$$$
$$$CUT$$$
$$$EXCLUDE_PATH$$$
$$$HLUQ$$$

[ls] based on report, list info about every file seen
$$$PATH$$$
$$$CUT$$$
$$$EXCLUDE_PATH$$$
$$$HLUQ$$$

[rmsh] create shell script to delete all duplicates
O:L,link:::create symlinks for deleted files
O:H,hardlink:::create hard links for deleted files

[validate] revalidate all duplicates in db

[help] show brief usage info

[usage] show more extensive documentation

[man] show more extensive documentation

[license] show license info

[version] show version and exit

[H:testing] testing only, ignore

[GLOBAL]
O:F,hash:NAME::specify alternate hash function
O:v,verbose:::increase verbosity (may be repeated for more)
O:V,verbose-level:N::set verbosity level to N
O:q,quiet:::quiet, suppress all output except fatal errors
O:d,db:PATH::path to dupd database file
O:h,help:::show brief usage info
H:,no-unique:::ignore unique table even if present, for testing
H:,x-small-buffers:::for testing only, not useful otherwise
H:,x-testing:::for testing only, not useful otherwise
H:,log-only:::log only messages at chosen level
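Judging from the lines above, each option entry follows a colon-separated layout: a visibility marker (`O` for ordinary options, `H` apparently for hidden/testing ones), a `short,long` flag pair, an argument placeholder, an optional callback name, and the help text. A hedged sketch of parsing one such line (field meanings are inferred from this file, not taken from dupd's actual option generator):

```python
from dataclasses import dataclass

@dataclass
class Option:
    kind: str      # 'O' (ordinary) or 'H' (hidden) -- meaning inferred
    short: str     # single-letter flag; empty for long-only options
    long: str      # long flag name
    arg: str       # argument placeholder (e.g. PATH, SIZE); empty for flags
    callback: str  # handler name; empty when default handling applies
    desc: str      # help text

def parse_option(line):
    """Split a line like 'O:m,minsize:SIZE::min size of files to scan'."""
    # maxsplit=4 keeps any colons inside the description intact.
    kind, flags, arg, callback, desc = line.split(":", 4)
    short, _, long = flags.partition(",")
    return Option(kind, short, long, arg, callback, desc)

opt = parse_option("O:m,minsize:SIZE::min size of files to scan")
print(opt.long, opt.arg)  # -> minsize SIZE
```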