curl

copy a URL

                    _   _ ____  _
Project         ___| | | |  _ \| |
               / __| | | | |_) | |
              | (__| |_| |  _ <| |___
               \___|\___/|_| \_\_____|
curl [options] [URL...]

copy data from or to a server, using one of the supported protocols: DICT, FILE, FTP, FTPS, GOPHER, HTTP, HTTPS, IMAP, IMAPS, LDAP, LDAPS, POP3, POP3S, RTMP, RTSP, SCP, SFTP, SMTP, SMTPS, TELNET and TFTP. Works without user interaction.

Offers features like proxy support, user authentication, FTP upload, HTTP post, SSL connections, cookies, file transfer resume and more.

As with all the documentation on this site, this is a tersified version.
The complete documentation is at curl.haxx.se

URL

The URL syntax is protocol-dependent.
Specify multiple URLs or parts of URLs by writing part sets within braces as in:

http://site.{one,two,three}.com or get sequences of alphanumeric series by using [] as in:

ftp://ftp.numericals.com/file[1-100].txt
ftp://me:secret@ftp.myserver.com/file[001-100].txt
(with leading zeros)
ftp://ftp.letters.com/file[a-z].txt

Nested sequences are not supported; use several next to each other:

http://any.org/archive[1996-1999]/vol[1-4]/part{a,b,c}.html

Multiple URLs will be fetched in a sequential manner in the specified order.
Specify a step counter for the ranges to get every Nth number or letter:

http://www.numericals.com/file[1-100:10].txt
http://www.letters.com/file[a-z:2].txt

Without a protocol:// prefix, curl guesses the protocol, defaulting to HTTP but trying other protocols based on often-used host name prefixes. For example, host names starting with "ftp." use ftp://.

curl attempts to re-use connections when transferring multiple files specified on a single command line, so that getting many files from the same server does not require multiple connects / handshakes.
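For example, both of these documents are fetched over a single connection (hypothetical host):
curl http://www.example.com/docs/one.html http://www.example.com/docs/two.html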


Options

Boolean options are enabled with --option and disabled with --no-option.
--url URL Specify a URL to fetch. Useful for specifying URL(s) in a config file. To control where this URL is written, use --output or -O, --remote-name.
May be used any number of times.
-#
--progress-bar
Display progress as a simple progress bar instead of the default meter.
-s
--silent
Silent mode. Don't show progress meter or error messages.
-S
--show-error
When used with -s, show an error message if curl fails.
--connect-timeout seconds Maximum time in seconds to allow the connection to the server to take. Useful in case the server is unavailable.
See --max-time
If used several times, the last is used.
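For example, to give up on a slow-to-answer server after ten seconds (hypothetical host):
curl --connect-timeout 10 http://www.example.com/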
--stderr file Redirect all writes to stderr to file . If the file name is '-', it is written to stdout.
If this option is used several times, the last one will be used.
-m
--max-time seconds
Maximum time for entire operation. Prevents hanging due to slow networks or links going down.
See also the --connect-timeout .
If used several times, the last one will be used.
-f
--fail
(HTTP) Fail silently (no output at all) on server errors. Mostly done to enable scripts to better deal with failed attempts.
Returns exit code 22.
Not fail-safe and there are occasions where nonsuccessful response codes will slip through, especially when authentication is involved (response codes 401 and 407).
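For example, a script can test the exit code to detect a failed fetch (hypothetical host):
curl -f -s -o page.html http://www.example.com/ || echo "fetch failed"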
-g
--globoff
Turn off URL globbing: {} and [] are not treated as meta characters in URLs, but must still be URL-encoded.
HTTP specific options:
-A
--user-agent agent string
User-Agent string to send to the HTTP server.
Some CGIs fail if this field isn't set to "Mozilla/4.0".
If the string contains blanks, surround it with quotes.
Can also be set with -H, --header.
If set more than once, the last one will be used.
--anyauth Let curl figure out the authentication method and use the most secure one the server supports, instead of setting a specific method with --basic, --digest, --ntlm or --negotiate.
Not recommended for uploads from stdin, since the data may need to be sent twice.
--basic Use HTTP Basic authentication. This is the default; the option is useful when a previous option has set a different authentication method.
-b
--cookie file|name=value[;…]
If no = symbol is present, it is treated as a filename from which to read previously stored cookie lines, which will be used in this session if they match. This activates the "cookie parser", which makes curl record incoming cookies.
Useful with --location .
The format of the file should be plain HTTP headers or the Netscape/Mozilla cookie file format.

The file is only used as input. No cookies will be stored in the file.
To store cookies, use --cookie-jar or --dump-header
If set more than once, the last one will be the one that's used.
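For example, to send a stored cookie file, or a single ad-hoc cookie (hypothetical host and names):
curl -b cookies.txt http://www.example.com/
curl -b "session=abc123" http://www.example.com/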

-c
--cookie-jar file
Write all cookies to file after a completed operation, using the Netscape cookie file format. This includes all cookies read from a specified input file as well as all cookies received from the server.
If no cookies are known, no file will be written. If you set the file name to a single dash, "-", the cookies will be written to stdout.

Activates the cookie engine that makes curl record and use cookies. See --cookie
If the cookie jar can't be created or written to, the operation doesn't fail or even report an error; use -v to get a warning.
If used several times, the last specified file name will be used.
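For example, to record cookies from a login page and replay them later (hypothetical host):
curl -c cookies.txt http://www.example.com/login
curl -b cookies.txt -c cookies.txt http://www.example.com/account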

-j
--junk-session-cookies
Discard all "session cookies" read from file, as if a new session is started.
-D
--dump-header file
Write the protocol headers to file. Example output:
HTTP/1.1 200 OK
Date: Mon, 10 Aug 2020 20:08:22 GMT
Server: Apache
Last-Modified: Sun, 12 Jul 2020 18:58:37 GMT
Accept-Ranges: bytes
Content-Length: 2975
Content-Type: text/html
Cookies from the headers could then be read in a second invocation using --cookie; however, --cookie-jar is a better way to store cookies.

FTP: server response lines are considered "headers"
If used several times, the last one will be used.

-i
--include
Include HTTP-header in the output.
-I
--head
Fetch the HTTP-header only.
--compressed Request a compressed response and save the uncompressed document.
--tr-encoding Request a compressed Transfer-Encoding response and uncompress the data while receiving it.
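For example, to request a compressed response but store the page uncompressed (hypothetical host):
curl --compressed -o page.html http://www.example.com/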
-F
--form name=content | name=@file | name=<file
Emulate a filled-in form submission by issuing a POST using the Content-Type multipart/form-data.
Enables uploading of binary files etc.
@ attaches a file in the post as a file upload (ex: datafile=@input.dat).
< gets the content part from a file, which makes a text field.
To read content from stdin use - as the filename.
Specify the Content-Type with type=, for example:
curl -F "web=@index.html;type=text/html" url.com
or
curl -F "name=daniel;type=text/foo" url.com

Change the name field of a file upload part by setting filename=:
curl -F "file=@localfile;filename=nameinpost" url.com
Can be used multiple times.

--form-string name=string Similar to --form except that leading @ and < characters, and the ;type= string, in the value have no special meaning.
-d
--data @file|-|data
--data-ascii
Sends data in a POST request, in the same way that a browser does when a form is submitted, using the content-type application/x-www-form-urlencoded
To URLencode the value of a form field use --data-urlencode.
If used more than once, data will be merged with a separating &.
For example: using -d name=daniel -d skill=lousy generate a post 'name=daniel&skill=lousy'.
Use @file to read the data from a file, or @- to read the data from stdin.

The contents of the file must be URL-encoded.
Multiple files can also be specified.

--data-urlencode data Posts data, similar to --data, but also performs URL-encoding.
To be CGI-compliant, data should begin with a name followed by a separator and a content specification.
The data part can be passed to curl in the following formats:
content URL-encode the content and pass that on.
=content URL-encode the content and pass that on. The preceding = symbol is not included in the data.
name=content URL-encode the content part and pass it on. The name part is expected to be URL-encoded already.
@filename load data from the given file (including any newlines), URL-encode that data and pass it on in the POST.
name@file load data from file (including any newlines), URL-encode that data and pass it on in the POST. The name part gets an equal sign appended, resulting in name=urlencoded-file-content. Note that the name is expected to be URL-encoded already.
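For example, a field containing spaces and an ampersand can be posted safely (hypothetical host and field):
curl --data-urlencode "comment=hello world & goodbye" http://www.example.com/form.cgi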
--data-binary data Posts data exactly as specified, with no extra processing.
When @ prefixes a filename, data is posted in a similar manner as --data-ascii does, except that newlines are preserved and conversions are never done.
If used several times, the pieces are appended as described under --data.
-G
--get
Causes data specified with --data or --data-binary to be used in an HTTP GET request instead of a POST; the data is appended to the URL with a ? separator.
With -I, the POST data is instead appended to the URL with a HEAD request.
Only the first occurrence is honored.
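For example, this sends GET /search?q=curl&lang=en rather than a POST (hypothetical host):
curl -G -d "q=curl" -d "lang=en" http://www.example.com/search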
-H
--header header
Extra header to include in the request.
Specifying a header that curl would otherwise send internally replaces the internal one.
Remove an internal header by giving a replacement without content, example: -H "Host:".
To send a header with no-value, terminate it with a semicolon, example: -H "X-Custom-Header;" to send "X-Custom-Header:".
See also --user-agent and --referer.
Can be used multiple times to add/replace/remove multiple headers.
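For example, to add a custom header and suppress the internal Host: header in one request (hypothetical host):
curl -H "X-Custom-Header: value" -H "Host:" http://www.example.com/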
-J
--remote-header-name
(HTTP) Used with -O, uses the server-specified Content-Disposition filename instead of extracting a filename from the URL.
-e
--referer URL
Sends the "Referer Page" information to the HTTP server. Can also be set with --header.
With --location append ;auto to the --referer URL to set the previous URL when it follows a Location: header.
;auto can be used alone, without an initial --referer.
If used several times, the last one will be used.
-L
--location
If the server reports that the requested page has moved to a different location (indicated with a Location: header and a 3XX response code), reissue the request on the new location.
With --include or --head, headers from all requested pages will be shown.
When authentication is used, curl only sends its credentials to the initial host; if a redirect takes it to a different host, the user+password will not be sent there. See also --location-trusted
Limit the amount of redirects to follow by using --max-redirs
If the HTTP response was a 301, 302 or 303 redirect and the request was not a GET (i.e. POST or PUT), curl executes the next request with a GET.
If the response code was any other 3xx code, curl will re-send the following request using the same method.
--post301 Do not convert POST requests to GETs when following a 301 redirect. Only works with --location.
--post302 Do not convert POST requests to GETs when following a 302 redirect. Only works with --location.
--location-trusted Like --location, but also sends the name + password to any hosts the site redirects to. This may be a security breach if the site redirects you to a site to which you'll send your authentication info (which is plaintext in the case of HTTP Basic authentication).
--max-redirs num Maximum number of redirection-followings allowed with --location.
Default: 50. Set to -1 to make it limitless.
If used several times, the last one will be used.
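For example, to follow at most five redirects (hypothetical host):
curl -L --max-redirs 5 http://www.example.com/old-page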
--ignore-content-length Ignore the Content-Length header. Useful with Apache 1.x, which reports an incorrect Content-Length for files larger than 2 gigabytes.
-E
--cert certificate[:password]
(SSL) Use the specified client certificate file when getting a file with HTTPS, FTPS or another SSL-based protocol. The certificate is assumed to be in PEM format.
Assumes a "certificate" file that is the private key and the private certificate concatenated! See --cert and --key to specify them independently.
If curl is built against the NSS SSL library this tells the nickname of the certificate to use within the NSS database defined by the environment variable SSL_DIR (or by default /etc/pki/nssdb). If the NSS PEM PKCS#11 module (libnsspem.so) is available then PEM files may be loaded.
To use a file from the current directory, precede it with ./ (avoiding confusion with an NSS nickname).
If used several times, the last one will be used.
--crlfile file (HTTPS/FTPS) Provide a Certificate Revocation List in a file, using PEM format, listing certificates that are to be considered revoked.
--delegation LEVEL Set LEVEL to tell the server what it is allowed to delegate when it comes to user credentials. Used with GSS/kerberos.
none Don't allow any delegation.
policy Delegates if and only if the OK-AS-DELEGATE flag is set in the Kerberos service ticket, which is a matter of realm policy.
always Unconditionally allow the server to delegate.
--digest (HTTP) Enables HTTP Digest authentication. Prevents the password from being sent in clear text.
Use in combination with --user to set user name and password.
See --ntlm, --negotiate and --anyauth.
Only the first occurrence is used.
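For example (hypothetical host and credentials):
curl --digest -u name:password http://www.example.com/protected/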
--ntlm (HTTP) Enables NTLM authentication.

To enable NTLM for proxy authentication use --proxy-ntlm.
Requires a library built with SSL support; use --version to check.
If used several times, the following occurrences make no difference.

--ciphers list (SSL) Specifies which ciphers to use in the connection; see your SSL library's documentation on valid cipher names (for NSS, the NSSCipherSuite entry).
If used several times, the last one will override the others.
--engine name Select the OpenSSL crypto engine to use . Use list to output a list of build-time supported engines.
Build-time engines:
  <none>
--egd-file file (SSL) Specify the path name to the Entropy Gathering Daemon socket used to seed the random engine
See --random-file
--cert-type type (SSL) Type of the provided certificate: PEM, DER or ENG. Default: PEM.
If used several times, the last one will be used.
--cacert file (SSL) Use the specified certificate file to verify the peer. The file may contain multiple CA certificates, in PEM format. Overrides the default file and $CURL_CA_BUNDLE, the environment variable holding the path to a CA cert bundle. The Windows version looks for curl-ca-bundle.crt in the same directory as curl.exe, in the current working directory, or in any folder along PATH.
If curl is built against the NSS SSL library then this option tells curl the nickname of the CA certificate to use within the NSS database defined by the environment variable SSL_DIR (or by default /etc/pki/nssdb). If the NSS PEM PKCS#11 module (libnsspem.so) is available then PEM files may be loaded.
If specified several times, the last one will be used.
--capath directory (SSL) Use the specified certificate directory to verify the peer. Multiple paths can be provided, separated by ":" (e.g. "path1:path2:path3"). Certificates are expected in PEM format; if curl is built against OpenSSL, the directory must have been processed using the c_rehash utility supplied with OpenSSL. Using --capath can allow OpenSSL-powered curl to make SSL connections much more efficiently than --cacert if the --cacert file contains many CA certificates.
If it is specified several times, the last one will be used.
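For example, to verify a server against a private CA bundle (hypothetical file and host names):
curl --cacert my-ca.pem https://internal.example.com/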

FTP

-l
--list-only
When listing a directory, show names only (uses the NLST command).
-I
--head
(FTP) Displays the file size and last modification time only.
-T
--upload-file [file]
Transfers the local file to the remote URL. If there is no file part in the URL, curl appends the local file name.
Use a trailing / on the last directory to indicate that the URL has no file name part.

Use one -T for each URL on the command line.

Each -T + URL pair specifies what to upload where.
Supports "globbing" of the file argument, permitting the upload of multiple files to a single URL by using the same globbing style supported in URLs. For example:
curl -T "{file1,file2}" http://www.uploadtothissite.com
or
curl -T "img[1-1000].png" ftp://ftp.picturemania.com/upload/

Use - (a single dash) to use stdin.
Use . (dot) to use stdin in non-blocking mode to allow reading server output while stdin is being uploaded.

If used with HTTP(S), PUT will be used.

-R
--remote-time
Make the local file get the timestamp of the remote file.
-C
--continue-at offset
Continue/Resume a previous file transfer at the given offset.
With uploads, the FTP server command SIZE will not be used.
Use "-C -" to have curl automatically find out where/how to resume the transfer; it then uses the given output/input files to figure that out.
If used several times, the last one will be used.
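For example, to resume an interrupted download where it left off (hypothetical host):
curl -C - -O http://www.example.com/bigfile.zip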
-a
--append
With an upload, append to the target file instead of overwriting it; the file is created if it does not exist.
--crlf Convert LF to CRLF in upload.
-B
--use-ascii
Enable ASCII transfer when using FTP or LDAP.
Can also be invoked by appending ;type=A to the URL.
causes data sent to stdout to be in text mode for win32 systems.
--disable-eprt
--no-eprt
Disable the EPRT and LPRT address-and-port-specifying commands.
Normally curl first attempts EPRT (extended), then LPRT (long), and finally PORT; --eprt re-enables EPRT.

Disabling EPRT only changes the active behavior. To switch to passive mode, do not use --ftp-port, or force passive with --ftp-pasv.

--disable-epsv
--no-epsv
Disable EPSV when doing passive transfers; use --epsv to re-enable it.
Disabling EPSV only changes the passive behavior. To switch to active mode, use --ftp-port.
-n
--netrc
Use .netrc (_netrc on Windows) in $HOME for login name and password.
Does not support macros.
--netrc-optional similar to --netrc, makes the .netrc usage optional
--netrc-file file Similar to --netrc, except you provide the path to the netrc file; only one file can be specified per invocation.
If several are provided, the last one will be used.

Overrides any use of --netrc and --netrc-optional

--ftp-account [data] (FTP) When the server asks for "account data" after user name and password have been provided, this data is sent using the ACCT command.
If used twice, the second is used.
--ftp-alternative-to-user command If authenticating with the USER and PASS commands fails, send this command instead.
--ftp-create-dirs create missing directories.
--ftp-method [method] method to reach a file in a subdirectory.
multicwd a CWD command for each path part.
singlecwd one CWD with the full target directory then operates on the file
nocwd no CWD at all; operate on the file using its full path.
--ftp-pasv Use passive mode. This is the default; it overrides a previous --ftp-port.
Undoing an enforced passive mode really isn't doable; instead enforce the correct --ftp-port.
Tries EPSV first and then PASV, unless --disable-epsv is used.
If used several times, the first is used.
--ftp-skip-pasv-ip Ignore the IP address the server suggests in its response to the PASV command when connecting the data connection.
Instead, curl re-uses the same IP address it already uses for the control connection.
This option has no effect if PORT, EPRT or EPSV is used instead of PASV.
--ftp-pret Send a PRET command before PASV (and EPSV).
Some servers (e.g. drftpd) require this command for directory listings as well as uploads and downloads in PASV mode.
--krb level Enable Kerberos.
The level must be clear, safe, confidential, or private.
Requires library built with kerberos4 or GSSAPI (GSS-Negotiate) support.
Use --version to see if curl supports it.
If used several times, the last one will be used.
--ftp-ssl-ccc Use CCC (Clear Command Channel) Shuts down the SSL/TLS layer after authenticating.
The rest of the control channel communication will be unencrypted.
This allows NAT routers to follow the FTP transaction.
The default mode is passive.
--ftp-ssl-ccc-mode [active|passive] Use CCC (Clear Command Channel) Sets the CCC mode.
The passive mode will not initiate the shutdown, but waits for the server to do it, and will not reply to the shutdown from the server.
The active mode initiates the shutdown and waits for a reply from the server.
--ftp-ssl-control Require SSL/TLS for the FTP login, clear for transfer. Allows secure authentication, but non-encrypted data transfers for efficiency. Fails the transfer if the server doesn't support SSL/TLS.
--hostpubmd5 md5 (SFTP/SCP) Pass a string of 32 hexadecimal digits: the 128-bit MD5 checksum of the remote host's public key. The connection is refused unless the checksums match.
--tftp-blksize bytes Set the TFTP BLKSIZE option (must be >512). Default: 512. If used several times, the last one will be used.
-k
--insecure
(SSL) Explicitly allow "insecure" SSL connections and transfers.
All SSL connections are attempted to be made secure by using the CA certificate bundle installed by default; connections considered "insecure" fail unless --insecure is used.
See this online resource for further details: http://curl.haxx.se/docs/sslcerts.html
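For example, to talk to a server with a self-signed certificate anyway (hypothetical host):
curl -k https://self-signed.example.com/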
-K
--config config file
Specify which config file to read curl arguments from. Default: .curlrc (_curlrc on Windows).
Options and their parameters must be specified on the same line, separated by whitespace, colon, equals sign or any combination (equals preferred).
If the parameter contains whitespace, enclose it in quotes.
Escape sequences are : \\, \", \t, \n, \r and \v. A backslash preceding any other letter is ignored.
A line whose first character is '#' is treated as a comment.
One option per line
Specify the filename as '-' to read from stdin.
To specify a URL in the config file use --url :
     url = "http://curl.haxx.se/docs/"

Long option names can be given without the double dashes.
Unless -q is used, curl checks for a default config file:

1) "Home dir": it first checks $CURL_HOME and then $HOME.
It uses getpwuid() on UNIX-like systems (which returns the home dir for the current user).
On Windows, it checks %APPDATA%, then %USERPROFILE%\Application Data.

2) On Windows, if there is no _curlrc file in the home dir, it checks for one in the same dir as the curl executable.
On UNIX-like systems, it will try to load .curlrc from the determined home dir.

 # --- Example file ---
       # this is a comment
       url = "curl.haxx.se"
       output = "curlhere.html"
       user-agent = "superagent/1.0"

       # and fetch another URL too
       url = "curl.haxx.se/docs/manpage.html"
       -O
       referer = "http://nowhereatall.com/" 
Can be used multiple times to load multiple config files.
-q If used as the first parameter on the command line, the curlrc config file will not be read. See --config for details on the default config file search path.
--mail-from address (SMTP) single address that the mail should get sent from.
--mail-rcpt address (SMTP) single address that the mail should get sent to.
Use multiple times to specify many recipients.
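For example, to send a mail with the message body read from a file (hypothetical addresses and server):
curl --mail-from sender@example.com --mail-rcpt receiver@example.com -T message.txt smtp://mail.example.com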
--key key (SSL/SSH) Private key file name. If is used several times, the last one will be used.
--key-type type (SSL) Private key file type: tells curl which type the --key provided private key is. DER, PEM or ENG. If not specified, PEM is assumed.
If used several times, the last one will be used.
--keepalive-time seconds Sets the time a connection needs to remain idle before sending keepalive probes, and the time between individual keepalive probes.
If used multiple times, the last occurrence sets the amount.
--limit-rate speed Maximum transfer rate in bytes/second, averaged over the entire transfer. Append k or K for kilobytes, m or M for megabytes, g or G for gigabytes. Transfer speeds may be higher in short bursts.
--speed-limit takes precedence and may suppress the rate limiting, to help keep the speed-limit logic working.
If used several times, the last one will be used.
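For example, to keep the average download rate at or below 100 kilobytes/second (hypothetical host):
curl --limit-rate 100K -O http://www.example.com/big.iso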
-y
--speed-time time
If a download is slower than speed-limit bytes per second during a speed-time period, it gets aborted.
If speed-time is used, the default speed-limit is 1.
If used several times, the last one will be used.
-Y
--speed-limit speed
If a download is slower than speed bytes per second for speed-time seconds, it gets aborted.
speed-time is set with -y and is 30 if not set.

If used several times, the last one will be used.

--local-port num[-num] Local port number(s) to use. Port numbers are a scarce resource and are often busy, so setting this range too narrow might cause unnecessary connection setup failures.
--max-filesize bytes Specify the maximum size (in bytes) of a file to download. If the file requested is larger than this value, the transfer will not start and curl will return with exit code 63.
FTP and HTTP transfers.
-N
--no-buffer
Disables buffering of the output stream; use --buffer to enforce buffering.
--no-keepalive Disables the use of keepalive messages on the TCP connection (enabled by default); use --keepalive to enforce keepalive.
--tcp-nodelay Turn on the TCP_NODELAY option. See the curl_easy_setopt(3) man page for details about this option.
--interface name Perform the operation using the specified interface: an interface name, IP address or host name.
Example: curl --interface eth0:1 http://www.netscape.com/
If used several times, the last one will be used.
--no-sessionid (SSL) Disable use of SSL session-ID caching. By default all transfers are done using the cache. use --sessionid to enforce session-ID caching.
--noproxy host[,host…] Comma-separated list of hosts which do not use a proxy.
The only wildcard is a single * character, which matches all hosts and effectively disables the proxy.
Each host in this list is matched as either a domain which contains the hostname, or the hostname itself. For example, local.com would match local.com, local.com:80, and www.local.com, but not www.notlocal.com.
-o
--output file
Write output to file instead of stdout.
With {} or [] used to fetch multiple documents, use # followed by a number in the file specifier; it will be replaced with the current string for the URL being fetched. For example:
curl http://{one,two}.site.com -o "file_#1.txt"
or use several variables :
curl http://{site,host}.host[1-5].com -o "#1_#2"

Use as many times as the number of URLs
See --create-dirs to create the local directories dynamically.

Specifying - (a single dash) forces the output to stdout.

--create-dirs With -o, creates the local directory hierarchy as needed.
Creates only the dirs mentioned with the -o option, nothing else.
If -o names no dir, or if the dirs it names already exist, no dir will be created.
To create remote directories when using FTP or SFTP, use --ftp-create-dirs.
If used several times, the last one is used
-O
--remote-name
Write output to a local file named with the remote file name in the current working directory.
To save the file in a different directory change current working directory first.
Use as many times as the number of URLs
-P
--ftp-port address
(FTP) Use active (not passive) mode: tell the server to connect back to the client's specified address and port.
(Passive mode asks the server to set up an IP address and port for the client to connect to.)
address can be one of:
interface Ex: "eth0" to specify which interface's IP address to use (Unix only)
IP address Ex: "192.168.10.1" to specify the exact IP address
host name Ex: "my.host.domain" to specify the machine
- pick the same IP address that is already used for the control connection
If used several times, the last one will be used.
Disable the use of PORT with --ftp-pasv.
Disable the attempt to use the EPRT command instead of PORT by using --disable-eprt. EPRT is really PORT++.
Append :[start][-end] to the address, to specify the port range. A single number will fail if that port is not available.
--proto protocols Use only the listed protocols for the initial retrieval. Protocols are evaluated left to right, are comma separated, and are each a protocol name or 'all', optionally prefixed by zero or more modifiers. Available modifiers are:
+ Permit this protocol in addition to protocols already permitted (this is the default if no modifier is used).
- Deny this protocol, removing it from the list of protocols already permitted.
= Permit only this protocol (ignoring the list already permitted), though subject to later modification by subsequent entries in the comma separated list.
For example:
--proto -ftps uses the default protocols, but disables ftps
--proto -all,https,+http only enables http and https
--proto =http,https also only enables http and https

This allows scripts to safely rely on being able to disable potentially dangerous protocols, without causing an error when support for a protocol isn't built into curl.
Can be used multiple times; the protocol lists are concatenated as if given in one instance.

--proto-redir protocols use the listed protocols after a redirect. See --proto for how protocols are represented.
--proxy-anyauth pick a suitable authentication method when communicating with the given proxy. This might cause an extra request/response round-trip.
--proxy-basic use HTTP Basic authentication when communicating with the given proxy. Use --basic for enabling HTTP Basic with a remote host. Basic is the default authentication method curl uses with proxies.
--proxy-digest use HTTP Digest authentication when communicating with the given proxy. Use --digest for enabling HTTP Digest with a remote host.
--proxy-negotiate use HTTP Negotiate authentication when communicating with the given proxy. Use --negotiate for enabling HTTP Negotiate with a remote host.
--proxy-ntlm use HTTP NTLM authentication when communicating with the given proxy. Use --ntlm for enabling NTLM with a remote host.
--proxy1.0 proxyhost[:port] Use the specified HTTP 1.0 proxy. If the port number is not specified, it is assumed at port 1080. The only difference between this and the HTTP proxy --proxy is that attempts to use CONNECT through the proxy will specify an HTTP 1.0 protocol instead of the default HTTP 1.1.
--pass phrase (SSL/SSH) Passphrase for the private key
If is used several times, the last one will be used.
--pubkey key (SSH) Public key file name. Allows you to provide your public key in file.
If used several times, the last one will be used.
-Q
--quote command
(FTP/SFTP) Send an arbitrary command to the remote FTP or SFTP server, BEFORE the transfer takes place.
To make commands take place after a successful transfer, prefix them with a dash '-' (an example follows the SFTP command list below).
To make commands be sent after libcurl has changed the working directory, just before the transfer command(s), prefix the command with a '+' (this is only supported for FTP).
You may specify any number of commands. If the server returns failure for one of the commands, the entire operation will be aborted. You must send syntactically correct FTP commands as RFC 959 defines to FTP servers, or one of the commands listed below to SFTP servers.
Can be used multiple times. When speaking to a FTP server, prefix the command with an asterisk (*) to make libcurl continue even if the command fails as by default curl will stop at first failure.

SFTP is a binary protocol. Unlike for FTP, libcurl interprets SFTP quote commands before sending them to the server.
File names may be quoted shell-style to embed spaces or special characters. Supported SFTP quote commands:

chgrp group file sets the group ID of the file named by the file operand to the group ID specified by the group operand. The group operand is a decimal integer group ID.
chmod mmm file modifies the file mode bits mmm is an octal integer mode number.
chown user file sets the owner to the user ID specified by the user operand.
ln source_file target_file ln and symlink create a symbolic link at the target_file location pointing to the source_file location.
mkdir directory_name creates the directory
pwd returns the absolute pathname of the current working directory.
rename source target renames the file or directory
rm file removes the file
rmdir directory removes the directory if empty.
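For example, to delete an obsolete remote file after a successful upload (hypothetical host and file names):
curl -u user:passwd -T localfile -Q "-DELE obsolete.txt" ftp://ftp.example.com/upload/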
-r
--range range
(HTTP/FTP/SFTP/FILE) Retrieve a byte range (i.e. a partial document):
       0-499   first 500 bytes 
       500-999  second 500 bytes 
       -500   last 500 bytes 
       9500-   bytes from offset 9500 and forward  
Specifying several ranges requests HTTP to reply with a multipart response:
       0-0,-1            first and last byte only
       500-700,600-799   300 bytes from offset 500
       100-199,500-599   two separate 100-byte ranges
FTP and SFTP range downloads only support the start-stop syntax (optionally with one of the numbers omitted). FTP use depends on the extended FTP command SIZE.
If used several times, the last one will be used.
--random-file file (SSL) Specify the path name to file containing what will be considered as random data. The data is used to seed the random engine for SSL connections. See also the --egd-file
--raw Disables all internal HTTP decoding of content or transfer encodings; the data is passed on unaltered, raw.
--remote-name-all Changes the default action for all given URLs to be dealt with as if --remote-name were used for each one.
To disable that for a specific URL after --remote-name-all has been used, use --no-remote-name.
--resolve host:port:address Provide a custom address for a specific host and port pair, instead of the DNS-resolved one.
An alternative to /etc/hosts.
The port is the one used for the specific protocol.
Several entries can provide addresses for the same host on different ports.
Use multiple times to add host names.
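For example, to test a virtual host on a specific server before DNS points there (hypothetical host and address):
curl --resolve www.example.com:443:203.0.113.1 https://www.example.com/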
--retry num Retry num times on a transient error.
Default: 0, no retries.
Transient error: a timeout, an FTP 4xx response code or an HTTP 5xx response code.
When retrying, curl first waits one second and then doubles the waiting time for each new attempt, up to 10 minutes.
--retry-delay disables this backoff algorithm. See --retry-max-time to limit the total time allowed for retries.
If used multiple times, the last occurrence is used.
--retry-delay seconds Sleep this long before each retry when a transfer has failed with a transient error (changes the default backoff time algorithm between retries).
Only takes effect if --retry is also used. Setting it to zero uses the default backoff time.
If used multiple times, the last occurrence is used.
--retry-max-time seconds The retry timer is reset before the first transfer attempt. Retries will be done as usual as long as the timer hasn't reached this limit.
if the timer hasn't reached the limit, the request will be made and while performing, it may take longer than this given time period.
To limit a single request's maximum time, use --max-time.
Set to zero to not timeout retries.
If used multiple times, the last occurrence is used.
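For example, to retry up to five times, waiting two seconds between attempts, for at most one minute overall (hypothetical host):
curl --retry 5 --retry-delay 2 --retry-max-time 60 -O http://www.example.com/file.tar.gz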
--ssl (FTP, POP3, IMAP, SMTP) Try to use SSL/TLS for the connection. Reverts to a non-secure connection if the server doesn't support SSL/TLS. See also --ftp-ssl-control and --ssl-reqd for different levels of encryption required. (Formerly --ftp-ssl.)
-1
--tlsv1
(SSL) use TLS version 1 when negotiating with a remote TLS server.
-2
--sslv2
(SSL) use SSL version 2
-3
--sslv3
(SSL) use SSL version 3
--ssl-reqd (FTP, POP3, IMAP, SMTP) Require SSL/TLS for the connection. Terminates the connection if the server doesn't support SSL/TLS.
formerly --ftp-ssl-reqd
--negotiate (HTTP) Enables GSS-Negotiate authentication, mainly for use with Microsoft web applications. It is primarily meant as support for Kerberos5 authentication but may also be used along with another authentication method. See the IETF draft draft-brezak-spnego-http-04.txt.

to enable Negotiate for your proxy authentication, use --proxy-negotiate.
This option requires a library built with GSSAPI support, which is not very common. Use --version to see if GSS-Negotiate is supported.

When using this option, provide a fake --user to activate the authentication code properly.
If used several times, the first is used.

--socks5-gssapi-service servicename Change the service name for a SOCKS server; default is rcmd/server-fqdn. Examples:
--socks5 proxy-name --socks5-gssapi-service sockd would use sockd/proxy-name
--socks5 proxy-name --socks5-gssapi-service sockd/real-name would use sockd/real-name, for cases where the proxy-name does not match the principal name.
--socks5-gssapi-nec As part of the gssapi negotiation a protection mode is negotiated. RFC 1961 says in section 4.3/4.4 it should be protected, but the NEC reference implementation does not. --socks5-gssapi-nec allows the unprotected exchange of the protection mode negotiation.
-t
--telnet-option OPT=val
Pass options to the telnet protocol:
TTYPE=term  terminal type.
XDISPLOC=X-display  X display location.
NEW_ENV=var,val  environment variable.
--tlsauthtype authtype Set TLS authentication type. Currently, the only supported option is "SRP", for TLS-SRP (RFC 5054). If --tlsuser and --tlspassword are specified but --tlsauthtype is not, then this option defaults to "SRP". (Added in 7.21.4)
--tlsuser user Set username for use with the TLS authentication method specified with --tlsauthtype. Requires that --tlspassword also be set.
--tlspassword password Set password for use with the TLS authentication method specified with --tlsauthtype. Requires that --tlsuser also be set.
--environment (RISC OS ONLY) Sets a range of environment variables, using the names -w supports, to allow easier extraction of useful information after having run curl.
--trace file Enables a full trace dump of all incoming and outgoing data, including descriptive information, to the given output file. Use "-" as filename to have the output sent to stdout. This option overrides previous uses of --verbose or --trace-ascii. If used several times, the last one will be used.
--trace-ascii file Enables a full trace dump of all incoming and outgoing data, including descriptive information, to the given output file. Use "-" as filename to have the output sent to stdout. similar to --trace, leaves out the hex part and only shows the ASCII part of the dump. It makes smaller output that might be easier to read for untrained humans. overrides previous uses of --verbose or --trace.
If used several times, the last one will be used.
--trace-time Prepends a time stamp to each trace or verbose line
-u
--user user:password
Specify the user name and password to use for server authentication.
Overrides --netrc and --netrc-optional.
To use a blank user name and password, specify a single colon: --user :.
If used several times, the last one will be used.
-U
--proxy-user user:password
Specify the user name and password to use for proxy authentication. If used several times, the last one will be used.
-v
--verbose
A line starting with '>' means "header data" sent by curl, '<' means "header data" received by curl that is hidden in normal cases, and a line starting with '*' means additional info provided by curl.
For only HTTP headers, use --include.
For more details, use --trace or --trace-ascii.
--verbose overrides previous uses of --trace-ascii or --trace.
Use --silent to make curl quiet.
Example verbose output:
* TCP_NODELAY set
* Connected to real-world-systems.com (209.95.59.175) port 443 (#0)
* ALPN, offering h2
* ALPN, offering http/1.1
* successfully set certificate verify locations:
*   CAfile: /etc/ssl/cert.pem
  CApath: none
* TLSv1.2 (OUT), TLS handshake, Client hello (1):
} [236 bytes data]
* TLSv1.2 (IN), TLS handshake, Server hello (2):
{ [108 bytes data]
* TLSv1.2 (IN), TLS handshake, Certificate (11):
{ [4489 bytes data]
* TLSv1.2 (IN), TLS handshake, Server key exchange (12):
{ [300 bytes data]
* TLSv1.2 (IN), TLS handshake, Server finished (14):
{ [4 bytes data]
* TLSv1.2 (OUT), TLS handshake, Client key exchange (16):
} [37 bytes data]
* TLSv1.2 (OUT), TLS change cipher, Change cipher spec (1):
} [1 bytes data]
* TLSv1.2 (OUT), TLS handshake, Finished (20):
} [16 bytes data]
* TLSv1.2 (IN), TLS change cipher, Change cipher spec (1):
{ [1 bytes data]
* TLSv1.2 (IN), TLS handshake, Finished (20):
{ [16 bytes data]
* SSL connection using TLSv1.2 / ECDHE-RSA-AES256-GCM-SHA384
* ALPN, server accepted to use http/1.1
* Server certificate:
*  subject: CN=real-world-systems.com
*  start date: Apr  3 00:00:00 2020 GMT
*  expire date: Apr 10 23:59:59 2021 GMT
*  subjectAltName: host "real-world-systems.com" matched cert's "real-world-systems.com"
*  issuer: C=GB; ST=Greater Manchester; L=Salford; O=Sectigo Limited; CN=Sectigo RSA Domain Validation Secure Server CA
*  SSL certificate verify ok.
> GET / HTTP/1.1
> Host: real-world-systems.com
> User-Agent: curl/7.64.1
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Mon, 10 Aug 2020 20:39:07 GMT
< Server: Apache
< Last-Modified: Sun, 12 Jul 2020 18:58:37 GMT
< Accept-Ranges: bytes
< Content-Length: 2975
< Content-Type: text/html
< 
{ [2975 bytes data]
100  2975  100  2975    0     0   9596      0 --:--:-- --:--:-- --:--:--  9596
* Connection #0 to host real-world-systems.com left intact
* Closing connection 0
-w
--write-out format
Defines what to display on stdout after a completed and successful operation:
a string that may contain plain text mixed with any number of variables.
The string can be specified as "string"; to have it read from a particular file, specify "@filename"; to read the format from stdin, use "@-".

The variables present in the output format will be substituted by the value or text that curl thinks fit, as described below. All variables are specified as %{variable_name} and to output a normal % you just write them as %%. You can output a newline by using \n, a carriage return with \r and a tab space with \t. The %-symbol is a special symbol in the win32-environment, where all occurrences of % must be doubled when using this option.

url_effective URL that was fetched last. meaningful if curl is following location: headers.
http_code numerical response code in the last retrieved HTTP(S) or FTP(s) transfer.
http_connect numerical code in the last response (from a proxy) to a curl CONNECT request.
time_total seconds, the full operation lasted. with millisecond resolution.
time_namelookup seconds, until the name resolving was completed.
time_connect seconds, until the TCP connect to the remote host (or proxy) was completed.
time_appconnect seconds, until the SSL/SSH/etc connect/handshake to the remote host was completed.
time_pretransfer seconds, from the start until the file transfer was just about to begin. includes pre-transfer commands and negotiations specific to the particular protocol(s) involved.
time_redirect seconds, for all redirection steps, including name lookup, connect, pretransfer and transfer, before the final transaction was started; shows the complete execution time for multiple redirections.
time_starttransfer seconds, from the start until the first byte was just about to be transferred. Includes time_pretransfer and also the time the server needed to calculate the result.
size_download bytes that were downloaded.
size_upload bytes that were uploaded.
size_header bytes of the downloaded headers.
size_request bytes that were sent in the HTTP request.
speed_download average download speed.
speed_upload average upload speed.
content_type Content-Type of the requested document, if any.
num_connects Number of new connects
num_redirects Number of redirects that were followed
redirect_url when an HTTP request was made without -L to follow redirects, shows the actual URL a redirect would take you to.
ftp_entry_path initial path libcurl ended up in when logging on to the remote FTP server.
ssl_verify_result of the SSL peer certificate verification, 0 means the verification was successful.
If used several times, the last one will be used.
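For example, to print just the response code and the total time (hypothetical host):
curl -s -o /dev/null -w "%{http_code} %{time_total}\n" http://www.example.com/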
-x
--proxy [protocol://][user:password@]proxyhost[:port]
Use the specified HTTP proxy. If the port number is not specified, it is assumed at port 1080. Overrides existing environment variables that set the proxy to use; if there's an environment variable setting a proxy, you can set the proxy to "" to override it. All operations that are performed over an HTTP proxy will transparently be converted to HTTP, which means certain protocol-specific operations might not be available. This is not the case if you tunnel through the proxy with -p,
--proxytunnel
The proxy host can be specified the exact same way as in the proxy environment variables, including the protocol prefix (http://) and the embedded user + password. The proxy string may be given a protocol:// prefix to specify an alternative proxy protocol: use socks4://, socks4a://, socks5:// or socks5h:// to request a specific SOCKS version. With no protocol specified, http:// is assumed and all others are treated as HTTP proxies. If used several times, the last one will be used.
-p
--proxytunnel
When used with -x, --proxy, causes non-HTTP protocols to tunnel through the proxy instead of using it to do HTTP-like operations. The tunnel is made with the HTTP proxy CONNECT request and requires that the proxy allows direct connect to the remote port number curl wants to tunnel through to.
-X
--request command
(HTTP) Specifies a custom request method to use when communicating with the HTTP server.
The specified request will be used instead of the method otherwise used (which defaults to GET). Common additional HTTP requests include PUT and DELETE, but related technologies like WebDAV offers PROPFIND, COPY, MOVE and more.
(FTP) Specifies a custom FTP command to use instead of LIST when doing file lists with FTP.
If used several times, the last one will be used.
--xattr When saving output to a file, stores file metadata in extended file attributes: the URL is stored in the xdg.origin.url attribute and, for HTTP, the content type is stored in the mime_type attribute.
-z
--time-cond date expression
(HTTP/FTP/FILE) Request a file that has been modified later than the given time and date, or one that has been modified before that time. The date expression can be in many formats; if it doesn't match any internal one, curl tries to take the time from a file with that name instead!
See curl_getdate(3) for date expression details.
Start the date expression with a dash (-) to make it request for a document that is older than the given date/time,
default is a document that is newer than the specified date/time.
If used several times, the last one will be used.
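For example, to download a page only if it is newer than a local file, or only if it changed after a given date (hypothetical host):
curl -z local.html http://www.example.com/page.html
curl -z "Jan 12 2012" http://www.example.com/page.html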
--libcurl file Append this option to any ordinary command line, and get libcurl-using source code written to the file that does the equivalent of what your command-line operation does!

Does not support -F and the sending of multipart formposts.
If used several times, the last given file name will be used.
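For example, to capture the equivalent C source for a fetch (hypothetical host and file name):
curl http://www.example.com/ --libcurl mycode.c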

-h
--help
Usage help.
-M
--manual
Manual. Display the huge help text.
-0
--http1.0
(HTTP) Issue requests using HTTP 1.0.
-4
--ipv4
resolve names to IPv4 addresses only.
-6
--ipv6
resolve names to IPv6 addresses only.
-V
--version
includes the full version of curl, libcurl and other 3rd party libraries linked with the executable.
Protocols: shows all protocols that libcurl reports to support.
Features: shows specific features libcurl reports to offer. 
Features include:

IPv6  You can use IPv6 with this.
krb4  Krb4 for FTP is supported.
SSL  HTTPS and FTPS are supported.
libz  Automatic decompression of compressed files over HTTP is supported.
NTLM  NTLM authentication is supported.
GSS-Negotiate Negotiate authentication and krb5 for FTP is supported.
Debug This curl uses a libcurl built with Debug, which enables more error-tracking and memory debugging etc. 
    For curl developers only!
AsynchDNS uses asynchronous name resolves.
SPNEGO SPNEGO Negotiate authentication is supported.
Largefile supports transfers of large files, files larger than 2GB.
IDN  supports IDN - international domain names.
SSPI  SSPI is supported. If you use NTLM and set a blank user name, curl 
         will authenticate with your current user and password.
TLS-SRP SRP (Secure Remote Password) authentication is supported for TLS.

FILES

~/.curlrc Default config file, see --config for details.

ENVIRONMENT

The environment variables can be specified in lower case or upper case. The lower case version has precedence. http_proxy is an exception as it is only available in lower case. Using an environment variable to set the proxy has the same effect as using the --proxy option.
http_proxy [protocol://]host[:port] Sets the proxy server to use for HTTP.
HTTPS_PROXY [protocol://]host[:port] Sets the proxy server to use for HTTPS.
[url-protocol]_PROXY [protocol://]host[:port] Sets the proxy server to use for [url-protocol], where the protocol is one curl supports and as specified in a URL: FTP, FTPS, POP3, IMAP, SMTP, LDAP etc.
ALL_PROXY [protocol://]host[:port] Sets the proxy server to use if no protocol-specific proxy is set.
NO_PROXY comma-separated list of host names that shouldn't go through any proxy. If set to an asterisk '*' only, it matches all hosts.

PROXY PROTOCOL PREFIXES

proxy string may be specified with a protocol:// prefix to specify alternative proxy protocols.

If no protocol is specified in the proxy string or if the string doesn't match a supported one, the proxy will be treated as a HTTP proxy. The supported proxy protocol prefixes are :

socks4:// --socks4
socks4a:// --socks4a
socks5:// --socks5
socks5h:// --socks5-hostname

EXIT CODES


    1   Unsupported protocol. This build of curl has no support for this protocol. 
    2   Failed to initialize. 
    3   URL malformed. The syntax was not correct. 
    4   A feature or option that was needed to perform the desired request was not enabled or 
        was explicitly disabled at buildtime. To make curl able to do this, you probably need another build of libcurl! 
    5   Couldn't resolve proxy. The given proxy host could not be resolved. 
    6   Couldn't resolve host. The given remote host was not resolved. 
    7   Failed to connect to host. 
    8   FTP weird server reply. The server sent data curl couldn't parse. 
    9   FTP access denied. The server denied login or denied access to the particular resource or directory you wanted to reach.
               Most often you tried to change to a directory that doesn't exist on the server. 
    11   FTP weird PASS reply. Curl couldn't parse the reply sent to the PASS request. 
    13   FTP weird PASV reply, Curl couldn't parse the reply sent to the PASV request. 
    14   FTP weird 227 format. Curl couldn't parse the 227-line the server sent. 
    15   FTP can't get host. Couldn't resolve the host IP we got in the 227-line. 
    17   FTP couldn't set binary. Couldn't change transfer method to binary. 
    18   Partial file. Only a part of the file was transferred. 
    19   FTP couldn't download/access the given file, the RETR (or similar) command failed. 
    21   FTP quote error. A quote command returned error from the server.
    22   HTTP page not retrieved. The requested url was not found or returned another error with the HTTP error code 
              being 400 or above. This return code only appears if --fail is used. 
    23   Write error. Curl couldn't write data to a local filesystem or similar. 
    25   FTP couldn't STOR file. The server denied the STOR operation, used for FTP uploading. 
    26   Read error. Various reading problems. 
    27   Out of memory. A memory allocation request failed. 
    28   Operation timeout. The specified time-out period was reached according to the conditions. 
    30   FTP PORT failed. The PORT command failed. Not all FTP servers support the PORT command, 
          try doing a transfer using PASV instead! 
    31   FTP couldn't use REST. The REST command failed. This command is used for resumed FTP transfers. 
    33   HTTP range error. The range "command" didn't work. 
    34   HTTP post error. Internal post-request generation error. 
    35   SSL connect error. The SSL handshaking failed. 
    36   FTP bad download resume. Couldn't continue an earlier aborted download. 
    37   FILE couldn't read file. Failed to open the file. Permissions? 
    38   LDAP cannot bind. LDAP bind operation failed. 
    39   LDAP search failed. 
    41   Function not found. A required LDAP function was not found. 
    42   Aborted by callback. An application told curl to abort the operation. 
    43   Internal error. A function was called with a bad parameter. 
    45   Interface error. A specified outgoing interface could not be used. 
    47   Too many redirects. When following redirects, curl hit the maximum amount. 
    48   Unknown option specified to libcurl. 
         This indicates that you passed a weird option to curl that was passed on to libcurl and rejected. Read up in the manual! 
    49   Malformed telnet option. 
    51   The peer's SSL certificate or SSH MD5 fingerprint was not OK. 
    52   The server didn't reply anything, which here is considered an error. 
    53   SSL crypto engine not found. 
    54   Cannot set SSL crypto engine as default. 
    55   Failed sending network data. 
    56   Failure in receiving network data. 
    58   Problem with the local certificate. 
    59   Couldn't use specified SSL cipher. 
    60   Peer certificate cannot be authenticated with known CA certificates. 
    61   Unrecognized transfer encoding. 
    62   Invalid LDAP URL. 
    63   Maximum file size exceeded. 
    64   Requested FTP SSL level failed. 
    65   Sending the data requires a rewind that failed. 
    66   Failed to initialise SSL Engine. 
    67   The user name, password, or similar was not accepted and curl failed to log in. 
    68   File not found on TFTP server. 
    69   Permission problem on TFTP server. 
    70   Out of disk space on TFTP server. 
    71   Illegal TFTP operation. 
    72   Unknown TFTP transfer ID. 
    73   File already exists (TFTP). 
    74   No such user (TFTP). 
    75   Character conversion failed. 
    76   Character conversion functions required. 
    77   Problem with reading the SSL CA cert (path? access rights?). 
    78   The resource referenced in the URL does not exist. 
    79   An unspecified error occurred during the SSH session. 
    80   Failed to shut down the SSL connection. 
    82   Could not load CRL file, missing or wrong format (added in 7.19.0). 
    83   Issuer check failed (added in 7.19.0). 
    84   The FTP PRET command failed 
    85   RTSP: mismatch of CSeq numbers 
    86   RTSP: mismatch of Session Identifiers 
    87   unable to parse FTP file list 
    88   FTP chunk callback reported error 
SEE ALSO

ftp(1), wget(1)

LATEST VERSION

You always find news about what's going on as well as the latest versions from the curl web pages, located at: http://curl.haxx.se

SIMPLE USAGE

Get the main page from Netscape's web-server: curl http://www.netscape.com/

Get the README file from the user's home directory at funet's ftp-server: curl ftp://ftp.funet.fi/README

Get a web page from a server using port 8000: curl http://www.weirdserver.com:8000/

Get a list of a directory of an FTP site: curl ftp://cool.haxx.se/

Get the definition of curl from a dictionary: curl dict://dict.org/m:curl

Fetch two documents at once: curl ftp://cool.haxx.se/ http://www.weirdserver.com:8000/

Get a file off an FTPS server: curl ftps://files.are.secure.com/secrets.txt

or use the more appropriate FTPS way to get the same file: curl --ftp-ssl ftp://files.are.secure.com/secrets.txt

Get a file from an SSH server using SFTP: curl -u username sftp://shell.example.com/etc/issue

Get a file from an SSH server using SCP using a private key to authenticate: curl -u username: --key ~/.ssh/id_dsa --pubkey ~/.ssh/id_dsa.pub \ scp://shell.example.com/~/personal.txt

Get the main page from an IPv6 web server: curl -g "http://[2001:1890:1112:1::20]/"

DOWNLOAD TO A FILE Get a web page and store in a local file: curl -o thatpage.html http://www.netscape.com/

Get a web page and store in a local file, make the local file get the name of the remote document (if no file name part is specified in the URL, this will fail): curl -O http://www.netscape.com/index.html

Fetch two files and store them with their remote names: curl -O www.haxx.se/index.html -O curl.haxx.se/download.html

USING PASSWORDS

FTP

To ftp files using name+passwd, include them in the URL like: curl ftp://name:passwd@machine.domain:port/full/path/to/file or specify them with -u like curl -u name:passwd ftp://machine.domain:port/full/path/to/file

FTPS

Like FTP, but you may also want to specify and use SSL-specific options for certificates etc. Using ftps:// as prefix is the "implicit" way as described in the standards, while the recommended "explicit" way is done by using ftp:// and --ftp-ssl.

SFTP / SCP

This is similar to FTP, but you can specify a private key to use instead of a password. Note that the private key may itself be protected by a password that is unrelated to the login password of the remote system. If you provide a private key file you must also provide a public key file.

HTTP

Curl also supports user and password in HTTP URLs, thus you can pick a file like: curl http://name:passwd@machine.domain/full/path/to/file or specify user and password separately like in curl -u name:passwd http://machine.domain/full/path/to/file

HTTP offers many different methods of authentication and curl supports several: Basic, Digest, NTLM and Negotiate. Without telling which method to use, curl defaults to Basic. You can also ask curl to pick the most secure one out of the ones that the server accepts for the given URL, by using --anyauth.

Since HTTP URLs don't support user and password, you can't use that style when using curl via a proxy. You must use the -u style during such circumstances.

HTTPS

Probably most commonly used with private certificates, as explained below.

PROXY

curl supports both HTTP and SOCKS proxy servers, with optional authentication. It does not have special support for FTP proxy servers since there are no standards for those, but it can still be made to work with many of them. You can also use both HTTP and SOCKS proxies to transfer files to and from FTP servers. Get an ftp file using an HTTP proxy named my-proxy that uses port 888: curl -x my-proxy:888 ftp://ftp.leachsite.com/README Get a file from a HTTP server that requires user and password, using the same proxy as above: curl -u user:passwd -x my-proxy:888 http://www.get.this/

Some proxies require special authentication. Specify by using -U as above: curl -U user:passwd -x my-proxy:888 http://www.get.this/

A comma-separated list of hosts and domains which do not use the proxy can be specified as: curl --noproxy localhost,get.this -x my-proxy:888 http://www.get.this/

If the proxy is specified with --proxy1.0 instead of --proxy or -x, then curl will use HTTP/1.0 instead of HTTP/1.1 for any CONNECT attempts.

curl also supports SOCKS4 and SOCKS5 proxies with --socks4 and --socks5.

See the environment variables that offer further proxy control.

Most FTP proxy servers are set up to appear as a normal FTP server from the client's perspective, with special commands to select the remote FTP server. curl supports the -u, -Q and --ftp-account options that can be used to set up transfers through many FTP proxies. For example, a file can be uploaded to a remote FTP server using a Blue Coat FTP proxy with the options:


  curl -u "Remote-FTP-Username@remote.ftp.server Proxy-Username:Remote-Pass" \
  --ftp-account Proxy-Password --upload-file local-file \
  ftp://my-ftp.proxy.server:21/remote/upload/path/

See the manual for your FTP proxy to determine the form it expects to set up transfers, and use curl's -v option to see exactly what curl is sending.

RANGES

With HTTP 1.1 byte-ranges were introduced. Using this, a client can request to get only one or more subparts of a specified document. Curl supports this with the -r flag.

Get the first 100 bytes of a document: curl -r 0-99 http://www.get.this/

Get the last 500 bytes of a document: curl -r -500 http://www.get.this/

Curl also supports simple ranges for FTP files, where only a start and stop position can be specified. Get the first 100 bytes of a document using FTP: curl -r 0-99 ftp://www.get.this/README

UPLOADING

FTP / FTPS / SFTP / SCP

Upload all data on stdin to a specified server: curl -T - ftp://ftp.upload.com/myfile

Upload data from a specified file, login with user and password: curl -T uploadfile -u user:passwd ftp://ftp.upload.com/myfile

Upload a local file to the remote site, using the local file name at the remote side too: curl -T uploadfile -u user:passwd ftp://ftp.upload.com/

Upload a local file to get appended to the remote file: curl -T localfile -a ftp://ftp.upload.com/remotefile

FTP upload through a proxy, if it is configured to allow that kind of tunneling: curl --proxytunnel -x proxy:port -T localfile ftp.upload.com

HTTP

Upload all data on stdin to a specified HTTP site: curl -T - http://www.upload.com/myfile

Note that the HTTP server must be configured to accept PUT before this can be done successfully.

For other ways to do http data upload, see the POST section below.

Verbose / Debug

Use -v to get verbose fetching: curl -v ftp://ftp.upload.com/

For more details and information, use --trace or --trace-ascii with a file name to log to: curl --trace trace.txt www.haxx.se

Detailed information

Different protocols provide different ways of getting detailed information about specific files/documents.
To show detailed information about a single file, use --head, which displays all available info on a single file for HTTP and FTP. The HTTP information is a lot more extensive.

For HTTP, you can get the header information (the same as --head shows) included before the data by using --include.
Use --dump-header when getting files from both FTP and HTTP; it stores the headers in the specified file.

Store the HTTP headers in a separate file (headers.txt in the example): curl --dump-header headers.txt curl.haxx.se
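In the same vein, a sketch of --head and --include (reusing the same host):

  # headers only
  curl --head curl.haxx.se
  # headers followed by the document itself
  curl --include curl.haxx.se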

POST (HTTP)

Post data with the -d flag. The post data must be URL encoded.

Post a simple "name" and "phone" guestbook. curl -d "name=Rafael%20Sagula&phone=3320780" \ http://www.where.com/guest.cgi

To post a form, first extract the input tags in the form (see the perl program formfind.pl on the curl site).

If it is a "normal" post, use -d to post, in the format: variable1=data1&variable2=data2&...

The 'variable' names are the names set with "name=" in the <input> tags, and the data is the contents to fill in for the inputs. The data must be properly URL encoded: replace space with + and encode special characters with %XX, where XX is the hexadecimal representation of the character's ASCII code.

Example: a page located at formpost.com/testpost/ contains this form:

 <form action="post.cgi" method="post">
    <input name=user >
    <input name=pass type=password >
    <input name=id type=hidden value="blablabla">
    <input name=ding value="submit">
    </form> 

To post to this: curl --data "user=me&pass=12345&id=myid&ding=submit" http://formpost.com/testpost/post.cgi

--data uses the application/x-www-form-urlencoded mime type, generally understood by CGIs and similar. Curl also supports the more capable multipart/form-data type, which supports file uploads etc. -F accepts parameters like -F "name=contents". To read the contents from a file, use @filename as contents. When specifying a file, you can also specify the file content type by appending ';type=<mime type>' to the file name.
You can post the contents of several files in one field. For example, the field name 'coolfiles' is used to send three files with different content types: curl -F "coolfiles=@fil1.gif;type=image/gif,fil2.txt,fil3.html" \ http://www.post.com/postit.cgi

If the content-type is not specified, curl will try to guess from the file extension (it only knows a few), or use the previously specified type (from an earlier file if several files are specified in a list) or else it will use the default type 'application/octet-stream'.

Emulate a fill-in form with -F. Say we fill in three fields in a form: one field is the file name to post, one field is your name and one field is a file description. We want to post the file we have written named "cooltext.txt". To let curl do the posting of this data instead of your favourite browser, read the HTML source of the form page to find the names of the input fields. In our example, the input field names are 'file', 'yourname' and 'filedescription'. curl -F "file=@cooltext.txt" -F "yourname=Daniel" \
-F "filedescription=Cool text file with cool text inside" \
http://www.post.com/postit.cgi

To send two files in one post, you can do it in two ways:

  • Send multiple files in a single "field" with a single field name: curl -F "pictures=@dog.gif,cat.gif"
  • Send two fields with two field names: curl -F "docpicture=@dog.gif" -F "catpicture=@cat.gif"

To send a field value literally, without interpreting a leading '@' or '<' or an embedded ';type=', use --form-string instead of -F, as sketched below.
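A sketch of --form-string (the 'comment' field name is made up); the value is sent exactly as typed, leading '@' included:

  curl -F "docpicture=@dog.gif" --form-string "comment=@not-a-file" \
  http://www.post.com/postit.cgi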

    Referrer

    An HTTP request can include information about which address referred the user to the page:
    curl -e www.coolsite.com http://www.showme.com/
    The Referer: [sic] field is a full URL.

    User agent

    An HTTP request can report which browser generated the request.
    curl -A 'Mozilla/3.0 (Win95; I)' http://www.nationsbank.com/

    Other common strings:

      'Mozilla/3.0 (Win95; I)'   Netscape Version 3 for Windows 95
      'Mozilla/3.04 (Win95; U)'  Netscape Version 3 for Windows 95
      'Mozilla/2.02 (OS/2; U)'   Netscape Version 2 for OS/2
      'Mozilla/4.04 [en] (X11; U; AIX 4.2; Nav)'      NS for AIX
      'Mozilla/4.05 [en] (X11; U; Linux 2.0.32 i586)'   NS for Linux 
      'Mozilla/4.0 (compatible; MSIE 4.01; Windows 95)'  MSIE for W95 
      'Konqueror/1.0'       KDE File Manager desktop client
      'Lynx/2.7.1 libwww-FM/2.14' Lynx command line browser 
    

    Cookies

    Cookies are used to keep state information at the client's side. The server sets cookies by sending a response header like Set-Cookie: <data>, where the data contains a set of NAME=VALUE pairs separated by semicolons, like "NAME1=VALUE1; NAME2=VALUE2;".
    The server can specify the path the "cookie" should be used for ("path=value"), when the cookie expires ("expires=DATE"), for what domain to use it ("domain=NAME") and if it should be used on secure connections only ("secure").

    If you've received a page from a server that contains a header like: Set-Cookie: sessionid=boo123; path="/foo";

    it means the server wants that first pair passed on when we get anything in a path beginning with "/foo". Example, get a page that wants my name passed in a cookie: curl -b "name=Daniel" www.sillypage.com

    To use previously received cookies in following sessions, first store them in a file in a manner similar to: curl --dump-header headers www.example.com
    ... then in another connection to that (or another) site, use the cookies from the 'headers' file like: curl -b headers www.example.com

    Saving headers to a file is error-prone and not the preferred way; instead, save the incoming cookies using the well-known Netscape cookie format: curl -c cookies.txt www.example.com

    -b enables "cookie awareness" and with -L you can make curl follow a location: (which often is used in combination with cookies). So that if a site sends cookies and a location, you can use a non-existing file to trigger the cookie awareness like: curl -L -b empty.txt www.example.com

    The file to read cookies from must be formatted using plain HTTP headers OR as netscape's cookie file. Curl will determine what kind it is based on the file contents. In the above command, curl will parse the header and store the cookies received from www.example.com. curl will send to the server the stored cookies which match the request as it follows the location. The file "empty.txt" may be a nonexistent file.

    To both read and write cookies from a Netscape cookie file, set both -b and -c to use the same file: curl -b cookies.txt -c cookies.txt www.example.com

    PROGRESS METER

     % Total   % Received % Xferd  Average Speed        Time            Curr.
                                   Dload   Upload Total   Current Left  Speed
     0  151M    0  38608   0    0   9406        0 4:41:43 0:00:04 4:41:39 9287

      %            - percentage completed of the whole transfer
      Total        - total size of the expected transfer
      Received     - currently downloaded bytes
      Xferd        - currently uploaded bytes
      Time Total   - expected total time to complete the operation
      Time Current - time passed since the start
      Time Left    - expected time left to completion
      Curr.Speed   - average transfer speed over the last 5 seconds
    
    -# displays a simple progress bar of '#' characters instead of the standard meter.

    SPEED LIMIT

    -y and -Y abort a transfer if its speed stays below a given limit (-Y, in bytes per second) for a given time (-y, in seconds). To have curl abort the download if the speed is slower than 3000 bytes per second for 1 minute, run: curl -Y 3000 -y 60 www.far-away-site.com

    This can very well be combined with the overall time limit, so that the above operation must complete within 30 minutes: curl -m 1800 -Y 3000 -y 60 www.far-away-site.com

    Forcing curl not to transfer data faster than a given rate is also possible, which might be useful if you're using a limited bandwidth connection and you don't want your transfer to use all of it (sometimes referred to as "bandwidth throttle"). Make curl transfer data no faster than 10 kilobytes per second: curl --limit-rate 10K www.far-away-site.com

    or curl --limit-rate 10240 www.far-away-site.com

    Or prevent curl from uploading data faster than 1 megabyte per second: curl -T upload --limit-rate 1M ftp://uploadshereplease.com

    When using --limit-rate, the transfer rate is regulated on a per-second basis, which can make the total transfer speed lower than the given number, sometimes substantially lower if the transfer stalls for periods.

    CONFIG FILE

    On startup, curl reads .curlrc (or _curlrc on win32 systems) from the user's home dir, unless -q suppresses that or -K/--config specifies an alternate file.

    The config file can be made up of command line switches, or long options without leading dashes. Separate an option from its argument with spaces, '=' or ':'. A '#' begins a comment. A parameter containing spaces must be enclosed entirely within double quotes ("); within quotes, write a quote as \".
    An option and its argument must appear on the same line.
    Example, set default time out and proxy in a config file:

     # 30 minute timeout:
     -m 1800
     # proxy for all accesses:
     proxy = proxy.our.domain.com:8080 
    White spaces ARE significant at the end of lines, but white spaces up to the first character of each line are ignored.
    You can display a local help page when curl is invoked without a URL by making a config file similar to:

     # default url to get
     url = "http://help.with.curl.com/curlhelp.html"

    example: echo "user = user:passwd" | curl -K - http://that.secret.site.com

    Extra headers

    When using curl in your programs, you may end up needing to pass on your own custom headers when getting a web page. You can do this by using the -H flag. Example, send the header "X-you-and-me: yes" to the server when getting a page: curl -H "X-you-and-me: yes" www.love.com

    This is useful when you want to send a header with different text than curl would normally send. A header specified with -H replaces the one curl would otherwise use. If you replace an internal header with an empty one, you prevent that header from being sent. To prevent the Host: header from being used: curl -H "Host:" www.server.com

    FTP and PATH names

    When getting files with an ftp:// URL, the given path is relative to the login (home) directory. To get the file 'README' from your home directory at your ftp site: curl ftp://user:passwd@my.site.com/README

    If you want the README file from the root directory of that site, specify the absolute file name: curl ftp://user:passwd@my.site.com//README

    (i.e. with an extra slash in front of the file name)

    SFTP and SCP and PATH NAMES

    With sftp: and scp: URLs, the path name given is the absolute name on the server. To access a file relative to the remote user's home directory, prefix the file with /~/ , such as: curl -u $USER sftp://home.example.com/~/.bashrc

    FTP and firewalls

    The FTP protocol requires one of the involved parties to open a second connection as soon as data is about to get transferred.

    By default curl issues the PASV command, which causes the server to open another port and await another connection performed by the client. This is good when the client is behind a firewall that doesn't allow incoming connections: curl ftp.download.com

    If the server is behind a firewall that doesn't allow connections on ports other than 21 (or if it just doesn't support the PASV command), the other way is to use the PORT command and instruct the server to connect to the client on the IP number and port given as parameters to the PORT command.

    Your machine may have several IP addresses and/or network interfaces; -P selects which one to use. A '-' picks the default address: curl -P - ftp.download.com

    Download with PORT but use the IP address of 'le0' interface (not on windows): curl -P le0 ftp.download.com

    Download with PORT but use 192.168.0.10 as our IP address : curl -P 192.168.0.10 ftp.download.com

    NETWORK INTERFACE

    Get a web page from a server using a specified local network interface: curl --interface eth0:1 http://www.netscape.com/

    or

    curl --interface 192.168.1.10 http://www.netscape.com/

    HTTPS

    Secure HTTP requires SSL libraries to be installed and used when curl is built. If that is done, curl is capable of retrieving and posting documents using the HTTPS protocol. Example: curl https://www.secure-site.com

    Curl is also capable of using your personal certificates to get/post files from sites that require valid certificates. The only drawback is that the certificate needs to be in PEM format. PEM is a standard and open format for storing certificates, but it is not used by the most commonly used browsers (Netscape and MSIE both use the so-called PKCS#12 format). If you want curl to use the certificates you use with your (favourite) browser, you may need to download/compile a converter that can convert your browser's certificates to PEM formatted ones. This kind of converter is included in recent versions of OpenSSL, and for older versions Dr Stephen N. Henson has written a patch for SSLeay that adds this functionality. You can get his patch (which requires an SSLeay installation) from his site at: http://www.drh-consultancy.demon.co.uk/

    Example of how to automatically retrieve a document using a certificate with a personal password: curl -E /path/to/cert.pem:password https://secure.site.com/

    If you neglect to specify the password on the command line, you will be prompted for it before any data can be received. Many older SSL servers have problems with SSLv3 or TLS, which newer versions of OpenSSL etc. use, so it is sometimes useful to specify which SSL version curl should use. Use -3, -2 or -1 to specify the exact SSL version (SSLv3, SSLv2 or TLSv1 respectively): curl -2 https://secure.site.com/

    Otherwise, curl will first attempt to use v3 and then v2.

    To use OpenSSL to convert your favourite browser's certificate into a PEM formatted one that curl can use (assuming Netscape, but IE is likely to work similarly):

      • Hit the 'security' menu button in Netscape.
      • Select 'certificates - yours' and pick a certificate in the list.
      • Press the 'export' button.
      • Enter your PIN code for the certs.
      • Select a proper place to save it.
      • Run the 'openssl' application to convert the certificate. If you cd to the openssl installation, you can do it like:

     # ./apps/openssl pkcs12 -in [file you saved] -clcerts -out [PEMfile]

    RESUMING FILE TRANSFERS

    Curl supports resume on HTTP(S) downloads as well as FTP uploads and downloads, to continue a file transfer where it was previously aborted. Continue downloading a document: curl -C - -o file ftp://ftp.server.com/path/file

    Continue uploading a document(*1): curl -C - -T file ftp://ftp.server.com/path/file

    Continue downloading a document from a web server(*2): curl -C - -o file http://www.server.com/

    (*1) requires that the ftp server supports the non-standard command SIZE. If it doesn't, curl will say so.
    (*2) requires that the web server supports at least HTTP/1.1. If it doesn't, curl will say so.
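    You can also resume from an explicit byte offset instead of letting curl figure it out with '-C -' (a sketch; 400 is an arbitrary offset):

     curl -C 400 -o file http://www.server.com/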

    TIME CONDITIONS

    HTTP allows a client to specify a time condition for the document it requests, either If-Modified-Since or If-Unmodified-Since. Curl allows you to specify them with the -z/--time-cond flag.
    For example, you can easily make a download that only gets performed if the remote file is newer than a local copy: curl -z local.html http://remote.server.com/remote.html

    Or you can download a file only if the local file is newer than the remote one. Do this by prepending the date string with a '-', as in: curl -z -local.html http://remote.server.com/remote.html

    You can specify a "free text" date as condition. Tell curl to only download the file if it was updated since January 12, 2012: curl -z "Jan 12 2012" http://remote.server.com/remote.html

    Curl will then accept a wide range of date formats. You can always make the date check the other way around by prepending it with a dash '-'.

    DICT

    For fun try

    
        curl dict://dict.org/m:curl
        curl dict://dict.org/d:heisenbug:jargon
        curl dict://dict.org/d:daniel:web1913
    
     Aliases for 'm' are 'match' and 'find', and aliases for 'd' are 'define'
     and 'lookup'. For example,
    
        curl dict://dict.org/find:curl
    
     Commands that break the URL description of the RFC (but not the DICT
     protocol) are
    
        curl dict://dict.org/show:db
        curl dict://dict.org/show:strat
    
    Authentication is still missing (but this is not required by the RFC).

    LDAP

    If you have installed the OpenLDAP library, curl can take advantage of it and offer ldap:// support. LDAP is a complex thing and writing an LDAP query is not an easy task. I advise you to dig up the syntax description for that elsewhere. Two places that might suit you are Netscape's "Netscape Directory SDK 3.0 for C Programmer's Guide, Chapter 10: Working with LDAP URLs": http://developer.netscape.com/docs/manuals/dirsdk/csdk30/url.htm

    RFC 2255, "The LDAP URL Format" http://curl.haxx.se/rfc/rfc2255.txt To show you an example, this is now I can get all people from my local LDAP server that has a certain sub-domain in their email address: curl -B "ldap://ldap.frontec.se/o=frontec??sub?mail=*sth.frontec.se"

    If I want the same info in HTML format, I can get it by not using the -B (enforce ASCII) flag.

    ENVIRONMENT VARIABLES

    Curl reads and understands the following environment variables: http_proxy, HTTPS_PROXY, FTP_PROXY

    They should be set for protocol-specific proxies. General proxy should be set with ALL_PROXY

    A comma-separated list of host names that shouldn't go through any proxy is set in NO_PROXY (a single asterisk '*' matches all hosts).

    If the host name matches one of these strings, or the host is within the domain of one of these strings, transactions with that node will not be proxied. The usage of the -x/--proxy flag overrides the environment variables.
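    For example, in a Bourne-style shell (proxy host and domain are stand-ins):

     export http_proxy=http://my-proxy:888/
     export NO_PROXY=localhost,.our.domain.com
     curl http://www.get.this/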

    NETRC

    Unix introduced the .netrc concept a long time ago. It is a way for a user to specify name and password for commonly visited ftp sites in a file so that you don't have to type them in each time you visit those sites. You realize this is a big security risk if someone else gets hold of your passwords, so therefore most unix programs won't read this file unless it is only readable by yourself (curl doesn't care though).

    Curl supports .netrc files if told so (using the -n/--netrc and --netrc-optional options). This is not restricted to ftp; curl can use it for all protocols where authentication is used. A very simple .netrc file could look something like:

     machine curl.haxx.se login iamdaniel password mysecret
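    Then let curl read it (a sketch matching the sample entry above):

     curl -n ftp://curl.haxx.se/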

    CUSTOM OUTPUT

    To better allow script programmers to get to know about the progress of curl, the -w/--write-out option was introduced. Using this, you can specify what information from the previous transfer you want to extract. To display the number of bytes downloaded together with some text and an ending newline: curl -w 'We downloaded %{size_download} bytes\n' www.download.com
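    A sketch using other standard --write-out variables (%{http_code} and %{time_total}; -s -o /dev/null hides everything except the summary line):

     curl -s -o /dev/null -w '%{http_code} in %{time_total} seconds\n' www.download.com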

    KERBEROS FTP TRANSFER

    Curl supports kerberos4 and kerberos5/GSSAPI for FTP transfers. You need the kerberos package installed and used at curl build time for it to work. First, get the krb-ticket the normal way, with the kinit/kauth tool. Then use curl in a way similar to: curl --krb private ftp://krb4site.com -u username:fakepwd

    There's no use for a password on the -u switch, but a blank one will make curl ask for one, and you already entered the real password to kinit/kauth (hence the fake password in the example).

    TELNET

    curl telnet://remote.server.com

    Enter the data to pass to the server on stdin. The result will be sent to stdout or to the file you specify with -o.

    Use -N/--no-buffer to switch off buffered output, for slow connections or similar.

    Pass options to the telnet protocol negotiation by using -t. To use a vt100 terminal: curl -tTTYPE=vt100 telnet://remote.server.com

    Other -t options include:

      • XDISPLOC=<X display>  Sets the X display location.
      • NEW_ENV=<var,val>     Sets an environment variable.
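    For example, a sketch setting an environment variable during negotiation (MYVAR and myvalue are stand-ins):

     curl -t NEW_ENV=MYVAR,myvalue telnet://remote.server.com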

    PERSISTENT CONNECTIONS

    Specifying multiple files on a single command line will make curl transfer all of them, one after the other in the specified order. libcurl will attempt to use persistent connections for the transfers, so that a second transfer to the same host can use the connection that was already initiated and left open by the previous transfer. This greatly decreases connection time for all but the first transfer, and it makes far better use of the network. Note that curl cannot use persistent connections for transfers spread over separate curl invocations: try to stuff as many URLs as possible onto the same command line if they use the same host, as that will make the transfers faster. If you use an HTTP proxy for file transfers, practically all transfers will be persistent.

    MULTIPLE TRANSFERS WITH A SINGLE COMMAND LINE

    As mentioned above, you can download multiple files with one command line by simply adding more URLs. If you want those saved to local files instead of printed to stdout, add one save option for each URL you specify. Note that this also goes for -O (but not --remote-name-all). For example, get two files and use -O for the first and a custom file name for the second: curl -O http://url.com/file.txt ftp://ftp.com/moo.exe -o moo.jpg

    You can also upload multiple files in a similar fashion: curl -T local1 ftp://ftp.com/moo.exe -T local2 ftp://ftp.com/moo2.txt
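    With --remote-name-all, every given URL is saved under its remote name without repeating -O (a sketch reusing the hosts above):

     curl --remote-name-all http://url.com/file.txt ftp://ftp.com/moo.exe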


    Example with verbose:
    curl -v https://ruuvi.slack.com/files/U0AGHBZFU/F7FK1B2L9/ruuvitag-full.zip
    *  Trying 52.84.32.203...
    * TCP_NODELAY set
    * Connected to ruuvi.slack.com (52.84.32.203) port 443 (#0)
    * ALPN, offering h2
    * ALPN, offering http/1.1
    * Cipher selection: ALL:!EXPORT:!EXPORT40:!EXPORT56:!aNULL:!LOW:!RC4:@STRENGTH
    * successfully set certificate verify locations:
    *  CAfile: /etc/ssl/cert.pem
     CApath: none
    * TLSv1.2 (OUT), TLS handshake, Client hello (1):
    * TLSv1.2 (IN), TLS handshake, Server hello (2):
    * TLSv1.2 (IN), TLS handshake, Certificate (11):
    * TLSv1.2 (IN), TLS handshake, Server key exchange (12):
    * TLSv1.2 (IN), TLS handshake, Server finished (14):
    * TLSv1.2 (OUT), TLS handshake, Client key exchange (16):
    * TLSv1.2 (OUT), TLS change cipher, Client hello (1):
    * TLSv1.2 (OUT), TLS handshake, Finished (20):
    * TLSv1.2 (IN), TLS change cipher, Client hello (1):
    * TLSv1.2 (IN), TLS handshake, Finished (20):
    * SSL connection using TLSv1.2 / ECDHE-RSA-AES128-GCM-SHA256
    * ALPN, server accepted to use h2
    * Server certificate:
    * subject: C=US; ST=California; L=San Francisco; O=Slack Technologies, Inc.; CN=*.slack.com
    * start date: Feb 1 00:00:00 2017 GMT
    * expire date: Feb 1 23:59:59 2019 GMT
    * subjectAltName: host "ruuvi.slack.com" matched cert's "*.slack.com"
    * issuer: C=US; O=GeoTrust Inc.; CN=GeoTrust SSL CA - G3
    * SSL certificate verify ok.
    * Using HTTP2, server supports multi-use
    * Connection state changed (HTTP/2 confirmed)
    * Copying HTTP/2 data in stream buffer to connection buffer after upgrade: len=0
    * Using Stream ID: 1 (easy handle 0x7fb07c005800)
    > GET /files/U0AGHBZFU/F7FK1B2L9/ruuvitag-full.zip HTTP/2
    > Host: ruuvi.slack.com
    > User-Agent: curl/7.54.0
    > Accept: */*
    >
    * Connection state changed (MAX_CONCURRENT_STREAMS updated)!
    < HTTP/2 302
    < content-type: text/html
    < content-length: 0
    < location: https://ruuvi.slack.com/?redir=%2Ffiles%2FU0AGHBZFU%2FF7FK1B2L9%2Fruuvitag-full.zip
    < date: Wed, 11 Oct 2017 11:56:17 GMT
    < referrer-policy: no-referrer
    < server: Apache
    < set-cookie: b=e9god2rphvcwwkc08ccoowsgg; expires=Mon, 11-Oct-2027 11:56:17 GMT; Max-Age=315532800; path=/; domain=.slack.com
    < strict-transport-security: max-age=31536000; includeSubDomains; preload
    < vary: Accept-Encoding
    < x-frame-options: SAMEORIGIN
    < x-robots-tag: noindex
    < x-slack-backend: h
    < x-cache: Miss from cloudfront
    < via: 1.1 ac094a1c1bf8cbfbb98e93fa2b2431c0.cloudfront.net (CloudFront)
    < x-amz-cf-id: 4-Uk7ft2UlMeOon7gN1a_KNRniqh5nHnmV5UKO2dDvCl-vAidzy_tA==
    <
    * Connection #0 to host ruuvi.slack.com left intact
    

    IPv6

    curl will connect to a server with IPv6 when a host lookup returns an IPv6 address, and fall back to IPv4 if the connection fails. The --ipv4 and --ipv6 options can specify which address to use when both are available. IPv6 addresses can also be specified directly in URLs using the syntax: http://[2001:1890:1112:1::20]/overview.html

    When this style is used, the -g option must be given to stop curl from interpreting the square brackets as special globbing characters. Link local and site local addresses including a scope identifier, such as fe80::1234%1, may also be used, but the scope portion must be numeric and the percent character must be URL escaped. The previous example in an SFTP URL might look like: sftp://[fe80::1234%251]/
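    For example (address reused from above; the quotes also keep the shell from interpreting the brackets):

     curl -g 'http://[2001:1890:1112:1::20]/overview.html'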

    IPv6 addresses provided other than in URLs (e.g. to the --proxy, --interface or --ftp-port options) should not be URL encoded.

    The URL syntax is described in detail in RFC 3986.

    Firewalls

    A blocked attempt to access a server may show up in the firewall log like:
    Blocked IN=eth1 OUT= MAC=20:c0:47:c2:a8:a4:f4:b5:2f:05:38:c7:08:00 
        SRC=15.72.34.52 DST=10.3.22.4 LEN=121 TOS=00 PREC=0x00 TTL=56 ID=16333 DF 
        PROTO=TCP SPT=443 DPT=13952 SEQ=1794043383 ACK=2151756732 WINDOW=23 ACK PSH URGP=0 MARK=0