File: s3_bandwidth_limitations_and_next_release.mdwn

Is there a way to set bandwidth limits for [[special_remotes/s3]]?

From what I can see in the [[todo/credentials-less_access_to_s3]] patch, the `downloadUrl` function is used. Does that mean that `annex.web-download-command` is honoured? If so, that's great, because it means we can use wget's `--limit-rate` parameter to limit transfers.
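
If it is honoured, here's a minimal sketch of what I have in mind, assuming `annex.web-download-command` is still consulted in current git-annex and still does the `%url`/`%file` substitution documented for older releases:

    # assumption: annex.web-download-command is still used for (credentials-less)
    # S3 downloads and supports %url / %file substitution
    git config annex.web-download-command 'wget --limit-rate=1m -O %file %url'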

But what about uploads? Is there a way to limit those as well?

I'll also abuse this forum to ask if/when we might see a shiny new release shipping that amazing new feature. There seems to be enough piled up in the unreleased changelog to warrant a release, no? :) --[[anarcat]]