File: Handling_a_large_number_of_files.mdwn

I have noticed performance getting really slow when adding files (`git annex add .`) to a directory that already contains several hundred thousand files. When using git annex, is it recommended to split a large number of files into multiple directories, each containing fewer files? Is there a particular recommended way of handling large numbers of files (say, into the millions) in git annex? For concreteness, a sketch of the kind of splitting I have in mind follows below.
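This is only a rough sketch of the approach; the 10,000-files-per-directory limit and the `bucket_N` names are arbitrary placeholders. It moves a flat directory of files into numbered subdirectories, so that each `git annex add` run only touches a bounded number of files:

    # Move files into subdirectories of at most 10,000 files each.
    # Assumes all files sit flat in the current directory; the bucket
    # size and names are arbitrary.
    i=0
    bucket=0
    mkdir -p "bucket_$bucket"
    for f in *; do
        [ -f "$f" ] || continue        # skip the bucket directories themselves
        mv -- "$f" "bucket_$bucket/"
        i=$((i + 1))
        if [ "$i" -ge 10000 ]; then
            i=0
            bucket=$((bucket + 1))
            mkdir -p "bucket_$bucket"
        fi
    done

    # Then annex and commit one bucket at a time:
    # git annex add bucket_0 && git commit -m 'add bucket_0'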

Thanks