File: Handling_a_large_number_of_files.mdwn

I have noticed performance getting really slow when adding files (`git annex add .`) to a directory that already contains several hundred thousand files. When using git annex, is it recommended to split large numbers of files into multiple directories, each containing fewer files? Is there a particular recommended way of handling large numbers of files (say, getting into the millions) in git annex?
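
For example, would handing the files to git annex in smaller batches, roughly like the sketch below, be any better than one huge `git annex add .` over the whole tree? (The batch size of 1000 here is just an arbitrary guess on my part, not something I found recommended anywhere.)

    # Add files in chunks of 1000 rather than all at once,
    # skipping anything under .git.
    find . -type f -not -path './.git/*' -print0 \
      | xargs -0 -n 1000 git annex add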

Thanks