---
c: Copyright (C) Samuel Henrique <samueloph@debian.org>, Sergio Durigan Junior <sergiodj@debian.org> and many contributors, see the AUTHORS file.
SPDX-License-Identifier: curl
Title: wcurl
Section: 1
Source: wcurl
See-also:
- curl (1)
- trurl (1)
Added-in: n/a
---
# NAME
**wcurl** - a simple wrapper around curl to easily download files.
# SYNOPSIS
**wcurl \<URL\>...**
**wcurl [--curl-options \<CURL_OPTIONS\>]... [--dry-run] [--no-decode-filename] [-o|-O|--output \<PATH\>] [--] \<URL\>...**
**wcurl [--curl-options=\<CURL_OPTIONS\>]... [--dry-run] [--no-decode-filename] [--output=\<PATH\>] [--] \<URL\>...**
**wcurl -V|--version**
**wcurl -h|--help**
# DESCRIPTION
**wcurl** is a simple curl wrapper which lets you use curl to download files
without having to remember any parameters.
Call **wcurl** with a list of URLs you want to download and **wcurl**
picks sane defaults.
If you need anything more complex, you can provide any of curl's supported
parameters via the **--curl-options** option. Note that if your use case is not
covered by these options, you should likely use curl directly.
By default, **wcurl** does:
## * Percent-encode whitespace in URLs;
## * Download multiple URLs in parallel
if the installed curl's version is \>= 7.66.0 (--parallel);
## * Limit parallel connections to at most 5 per target (protocol + hostname + port number)
if the installed curl's version is \>= 8.16.0 (--parallel-max-host);
## * Follow redirects;
## * Automatically choose a filename as output;
## * Avoid overwriting files
if the installed curl's version is \>= 7.83.0 (--no-clobber);
## * Perform retries;
## * Set the downloaded file timestamp
to the value provided by the server, if available;
## * Default to https
if the URL does not contain any scheme;
## * Disable curl's URL globbing parser
so {} and [] characters in URLs are not treated specially;
## * Percent-decode the resulting filename;
## * Use 'index.html' as the default filename
if there is none in the URL.
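As a rough sketch, the defaults above correspond approximately to the curl
invocation below. This is illustrative only, not a verbatim copy of **wcurl**'s
internals: the retry count of 5 is an assumption, and version-gated flags
(--parallel, --no-clobber) are shown unconditionally here.

```shell
# Illustrative sketch of the curl command wcurl assembles for one URL.
# Assumptions: the retry count (5) and this exact flag set are examples;
# wcurl only adds --parallel and --no-clobber when the installed curl
# version supports them.
url="https://example.com/filename.txt"
cmd="curl --location --globoff --remote-time --retry 5 --no-clobber --parallel --remote-name-all $url"
echo "$cmd"
```

All of the flags shown (--location, --globoff, --remote-time, --retry,
--no-clobber, --parallel, --remote-name-all) are standard curl options; see
curl(1) for their exact semantics.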
# OPTIONS
## --curl-options, --curl-options=\<CURL_OPTIONS\>...
Specify extra options to be passed when invoking curl. May be specified more
than once.
## -o, -O, --output, --output=\<PATH\>
Use the provided output path instead of getting it from the URL. If multiple
URLs are provided, resulting files share the same name with a number appended to
the end (curl \>= 7.83.0). If this option is provided multiple times, only the
last value is considered.
## --no-decode-filename
Do not percent-decode the output filename, even if the percent-encoding in the
URL was done by **wcurl** (e.g., because the URL contained whitespace).
## --dry-run
Do not actually invoke curl; instead, print the command that would be executed.
## -V, \--version
Print version information.
## -h, \--help
Print help message.
# CURL_OPTIONS
Any option supported by curl can be set here. This is not used by **wcurl**; it
is instead forwarded to the curl invocation.
# URL
URL to be downloaded. Anything that is not a parameter is considered
a URL. Whitespace is percent-encoded and the URL is passed to curl, which
then performs the parsing. May be specified more than once.
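The whitespace percent-encoding step can be sketched in shell. This is an
approximation for illustration: it encodes only space characters, while
**wcurl** handles whitespace in URLs more generally.

```shell
# Approximate wcurl's whitespace handling: replace each space with %20
# before handing the URL to curl.
url="https://example.com/my file.txt"
encoded=$(printf '%s' "$url" | sed 's/ /%20/g')
echo "$encoded"
```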
# EXAMPLES
Download a single file:
**wcurl example.com/filename.txt**
Download two files in parallel:
**wcurl example.com/filename1.txt example.com/filename2.txt**
Download a file passing the **--progress-bar** and **--http2** flags to curl:
**wcurl --curl-options="--progress-bar --http2" example.com/filename.txt**
Resume from an interrupted download. The options necessary to resume the download (`--clobber --continue-at -`) must be the **last** options specified in `--curl-options`. Note that the only way to resume interrupted downloads is to allow wcurl to overwrite the destination file:
**wcurl --curl-options="--clobber --continue-at -" example.com/filename.txt**
Download multiple files without a limit of concurrent connections per host (the default limit is 5):
**wcurl --curl-options="--parallel-max-host 0" example.com/filename1.txt example.com/filename2.txt**
# AUTHORS
Samuel Henrique \<samueloph@debian.org\>
Sergio Durigan Junior \<sergiodj@debian.org\>
and many contributors, see the AUTHORS file.
# REPORTING BUGS
If you experience any problems with **wcurl** that you do not experience with
curl, submit an issue on GitHub: https://github.com/curl/wcurl
# COPYRIGHT
**wcurl** is licensed under the curl license.