File: Handling_a_large_number_of_files.mdwn

I have noticed performance getting really slow when adding files (`git annex add .`) to a directory that already contains several hundred thousand files. When using git annex, is it recommended to split large numbers of files across multiple directories, each containing fewer files? Is there a particular recommended way of handling large numbers of files (say, into the millions) in git annex?
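If splitting is the way to go, would something along these lines be reasonable? This is just a rough sketch I came up with (the two-character MD5 bucketing scheme is arbitrary, not anything I've seen recommended): bucket the files into up to 256 subdirectories, then add each bucket separately instead of one huge `git annex add .`:

    # Rough sketch: bucket loose files into subdirectories named after the
    # first two hex characters of the MD5 of each filename, then annex
    # each bucket on its own. The bucketing scheme is arbitrary.
    for f in *; do
        [ -f "$f" ] || continue              # skip directories
        d=$(printf '%s' "$f" | md5sum | cut -c1-2)
        mkdir -p "$d"
        mv "$f" "$d/"
    done
    # Add one bucket at a time rather than the whole tree at once.
    for d in ??; do
        [ -d "$d" ] && git annex add "$d"
    done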

Thanks