File: s3_bandwidth_limitations_and_next_release.mdwn

Is there a way to set bandwidth limits for [[special_remotes/s3]]?

From what I can see in the [[todo/credentials-less_access_to_s3]] patch, the `downloadUrl` function is used. Does that mean the `annex.web-download-command` setting is honored? If so, that's great, because it means we can use `wget`'s `--limit-rate` option to throttle transfers.
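
For example, assuming `annex.web-download-command` is indeed consulted for those downloads, something like this is what I have in mind (the 500k rate is just an example value):

    # download urls with wget, capped at roughly 500 KB/s;
    # %url and %file are substituted by git-annex
    git config annex.web-download-command 'wget --limit-rate=500k -O %file %url'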

But what about uploads? Is there a way to limit bandwidth on those as well?

I'll also abuse this forum to ask if/when we might see a shiny new release shipping that amazing new feature. There seems to be enough piled up in the unreleased changelog to warrant a release, no? :) --[[anarcat]]