/* NEVER EVER edit this manually, fix the mkhelp script instead! */
#include <stdio.h>
void hugehelp(void)
{
puts (
" _ _ ____ _ \n"
" Project ___| | | | _ \\| | \n"
" / __| | | | |_) | | \n"
" | (__| |_| | _ <| |___ \n"
" \\___|\\___/|_| \\_\\_____|\n"
"NAME\n"
" curl - get a URL with FTP, TELNET, LDAP, GOPHER, DICT,\n"
" FILE, HTTP or HTTPS syntax.\n"
"\n"
"SYNOPSIS\n"
" curl [options] url\n"
"\n"
"DESCRIPTION\n"
" curl is a client to get documents/files from servers,\n"
" using any of the supported protocols. The command is\n"
" designed to work without user interaction or any kind of\n"
" interactivity.\n"
"\n"
" curl offers a busload of useful tricks like proxy support,\n"
" user authentication, ftp upload, HTTP post, SSL (https:)\n"
" connections, cookies, file transfer resume and more.\n"
"\n"
"URL\n"
" The URL syntax is protocol dependent. You'll find a\n"
" detailed description in RFC 2396.\n"
"\n"
" You can specify multiple URLs or parts of URLs by writing\n"
" part sets within braces as in:\n"
"\n"
" http://site.{one,two,three}.com\n"
"\n"
" or you can get sequences of alphanumeric series by using\n"
" [] as in:\n"
"\n"
" ftp://ftp.numericals.com/file[1-100].txt\n"
" ftp://ftp.numericals.com/file[001-100].txt (with lead-\n"
" ing zeros)\n"
" ftp://ftp.letters.com/file[a-z].txt\n"
"\n"
" It is possible to specify up to 9 sets or series for a\n"
" URL, but no nesting is supported at the moment:\n"
"\n"
" http://www.any.org/archive[1996-1999]/vol-\n"
" ume[1-4]part{a,b,c,index}.html\n"
"\n"
"OPTIONS\n"
" -a/--append\n"
" (FTP) When used in a ftp upload, this will tell\n"
" curl to append to the target file instead of over-\n"
" writing it. If the file doesn't exist, it will be\n"
" created.\n"
"\n"
" -A/--user-agent <agent string>\n"
" (HTTP) Specify the User-Agent string to send to the\n"
"              HTTP server. Some badly done CGIs fail if it's not\n"
" set to \"Mozilla/4.0\". To encode blanks in the\n"
" string, surround the string with single quote\n"
" marks. This can also be set with the -H/--header\n"
" flag of course.\n"
" -b/--cookie <name=data>\n"
" (HTTP) Pass the data to the HTTP server as a\n"
" cookie. It is supposedly the data previously\n"
" received from the server in a \"Set-Cookie:\" line.\n"
" The data should be in the format \"NAME1=VALUE1;\n"
" NAME2=VALUE2\".\n"
"\n"
" If no '=' letter is used in the line, it is treated\n"
" as a filename to use to read previously stored\n"
" cookie lines from, which should be used in this\n"
" session if they match. Using this method also acti-\n"
" vates the \"cookie parser\" which will make curl\n"
" record incoming cookies too, which may be handy if\n"
" you're using this in combination with the\n"
" -L/--location option. The file format of the file\n"
" to read cookies from should be plain HTTP headers\n"
" or the netscape cookie file format.\n"
"\n"
" NOTE that the file specified with -b/--cookie is\n"
" only used as input. No cookies will be stored in\n"
" the file. To store cookies, save the HTTP headers\n"
" to a file using -D/--dump-header!\n"
"\n"
" -B/--ftp-ascii\n"
" (FTP/LDAP) Use ASCII transfer when getting an FTP\n"
" file or LDAP info. For FTP, this can also be\n"
" enforced by using an URL that ends with \";type=A\".\n"
"\n"
" -c/--continue\n"
" Continue/Resume a previous file transfer. This\n"
" instructs curl to continue appending data on the\n"
" file where it was previously left, possibly because\n"
" of a broken connection to the server. There must be\n"
" a named physical file to append to for this to\n"
" work. Note: Upload resume is depening on a command\n"
" named SIZE not always present in all ftp servers!\n"
" Upload resume is for FTP only. HTTP resume is only\n"
" possible with HTTP/1.1 or later servers.\n"
"\n"
" -C/--continue-at <offset>\n"
" Continue/Resume a previous file transfer at the\n"
" given offset. The given offset is the exact number\n"
" of bytes that will be skipped counted from the\n"
" beginning of the source file before it is trans-\n"
" fered to the destination. If used with uploads,\n"
" the ftp server command SIZE will not be used by\n"
" curl. Upload resume is for FTP only. HTTP resume\n"
" is only possible with HTTP/1.1 or later servers.\n"
"\n"
" -d/--data <data>\n"
" (HTTP) Sends the specified data in a POST request\n"
" to the HTTP server. Note that the data is sent\n"
" exactly as specified with no extra processing. The\n"
" data is expected to be \"url-encoded\". This will\n"
" cause curl to pass the data to the server using the\n"
" content-type application/x-www-form-urlencoded.\n"
" Compare to -F.\n"
"\n"
" If you start the data with the letter @, the rest\n"
" should be a file name to read the data from, or -\n"
" if you want curl to read the data from stdin. The\n"
" contents of the file must already be url-encoded.\n"
"\n"
" -D/--dump-header <file>\n"
" (HTTP/FTP) Write the HTTP headers to this file.\n"
" Write the FTP file info to this file if -I/--head\n"
" is used.\n"
"\n"
" This option is handy to use when you want to store\n"
" the cookies that a HTTP site sends to you. The\n"
" cookies could then be read in a second curl invoke\n"
" by using the -b/--cookie option!\n"
"\n"
" -e/--referer <URL>\n"
" (HTTP) Sends the \"Referer Page\" information to the\n"
" HTTP server. Some badly done CGIs fail if it's not\n"
" set. This can also be set with the -H/--header flag\n"
" of course.\n"
"\n"
" -E/--cert <certificate[:password]>\n"
" (HTTPS) Tells curl to use the specified certificate\n"
" file when getting a file with HTTPS. The certifi-\n"
" cate must be in PEM format. If the optional pass-\n"
" word isn't specified, it will be queried for on the\n"
" terminal. Note that this certificate is the private\n"
" key and the private certificate concatenated!\n"
"\n"
" -f/--fail\n"
" (HTTP) Fail silently (no output at all) on server\n"
" errors. This is mostly done like this to better\n"
" enable scripts etc to better deal with failed\n"
" attempts. In normal cases when a HTTP server fails\n"
" to deliver a document, it returns a HTML document\n"
" stating so (which often also describes why and\n"
" more). This flag will prevent curl from outputting\n"
" that and fail silently instead.\n"
"\n"
" -F/--form <name=content>\n"
" (HTTP) This lets curl emulate a filled in form in\n"
" which a user has pressed the submit button. This\n"
" causes curl to POST data using the content-type\n"
" multipart/form-data according to RFC1867. This\n"
" enables uploading of binary files etc. To force the\n"
" 'content' part to be read from a file, prefix the\n"
" file name with an @ sign. Example, to send your\n"
" password file to the server, where 'password' is\n"
" the name of the form-field to which /etc/passwd\n"
" will be the input:\n"
" curl -F password=@/etc/passwd www.mypasswords.com\n"
"\n"
" To read the file's content from stdin insted of a\n"
" file, use - where the file name should've been.\n"
"\n"
" -h/--help\n"
" Usage help.\n"
"\n"
" -H/--header <header>\n"
" (HTTP) Extra header to use when getting a web page.\n"
" You may specify any number of extra headers. Note\n"
" that if you should add a custom header that has the\n"
" same name as one of the internal ones curl would\n"
" use, your externally set header will be used\n"
" instead of the internal one. This allows you to\n"
" make even trickier stuff than curl would normally\n"
" do. You should not replace internally set headers\n"
" without knowing perfectly well what you're doing.\n"
"\n"
" -i/--include\n"
" (HTTP) Include the HTTP-header in the output. The\n"
" HTTP-header includes things like server-name, date\n"
" of the document, HTTP-version and more...\n"
"\n"
" -I/--head\n"
" (HTTP/FTP) Fetch the HTTP-header only! HTTP-servers\n"
" feature the command HEAD which this uses to get\n"
" nothing but the header of a document. When used on\n"
" a FTP file, curl displays the file size only.\n"
"\n"
" -K/--config <config file>\n"
" Specify which config file to read curl arguments\n"
" from. The config file is a text file in which com-\n"
" mand line arguments can be written which then will\n"
" be used as if they were written on the actual com-\n"
" mand line. If the first column of a config line is\n"
" a '#' character, the rest of the line will be\n"
" treated as a comment.\n"
"\n"
" Specify the filename as '-' to make curl read the\n"
" file from stdin.\n"
"\n"
" -l/--list-only\n"
" (FTP) When listing an FTP directory, this switch\n"
" forces a name-only view. Especially useful if you\n"
" want to machine-parse the contents of an FTP direc-\n"
" tory since the normal directory view doesn't use a\n"
" standard look or format.\n"
"\n"
" -L/--location\n"
" (HTTP/HTTPS) If the server reports that the\n"
" requested page has a different location (indicated\n"
" with the header line Location:) this flag will let\n"
" curl attempt to reattempt the get on the new place.\n"
" If used together with -i or -I, headers from all\n"
" requested pages will be shown.\n"
"\n"
" -m/--max-time <seconds>\n"
" Maximum time in seconds that you allow the whole\n"
" operation to take. This is useful for preventing\n"
" your batch jobs from hanging for hours due to slow\n"
" networks or links going down. This doesn't work\n"
" properly in win32 systems.\n"
"\n"
" -M/--manual\n"
" Manual. Display the huge help text.\n"
"\n"
" -n/--netrc\n"
" Makes curl scan the .netrc file in the user's home\n"
" directory for login name and password. This is typ-\n"
" ically used for ftp on unix. If used with http,\n"
" curl will enable user authentication. See netrc(5)\n"
" for details on the file format. Curl will not com-\n"
" plain if that file hasn't the right permissions (it\n"
" should not be world nor group readable). The envi-\n"
" ronment variable \"HOME\" is used to find the home\n"
" directory.\n"
"\n"
" A quick and very simple example of how to setup a\n"
" .netrc to allow curl to ftp to the machine\n"
" host.domain.com with user name\n"
"\n"
" machine host.domain.com user myself password secret\n"
"\n"
" -N/--no-buffer\n"
" Disables the buffering of the output stream. In\n"
" normal work situations, curl will use a standard\n"
" buffered output stream that will have the effect\n"
" that it will output the data in chunks, not neces-\n"
" sarily exactly when the data arrives. Using this\n"
" option will disable that buffering.\n"
"\n"
" -o/--output <file>\n"
" Write output to <file> instead of stdout. If you\n"
" are using {} or [] to fetch multiple documents, you\n"
" can use '#' followed by a number in the <file>\n"
" specifier. That variable will be replaced with the\n"
" current string for the URL being fetched. Like in:\n"
"\n"
" curl http://{one,two}.site.com -o \"file_#1.txt\"\n"
"\n"
" or use several variables like:\n"
"\n"
" curl http://{site,host}.host[1-5].com -o \"#1_#2\"\n"
"\n"
" -O/--remote-name\n"
" Write output to a local file named like the remote\n"
" file we get. (Only the file part of the remote file\n"
" is used, the path is cut off.)\n"
"\n"
" -P/--ftpport <address>\n"
" (FTP) Reverses the initiator/listener roles when\n"
" connecting with ftp. This switch makes Curl use the\n"
" PORT command instead of PASV. In practice, PORT\n"
" tells the server to connect to the client's speci-\n"
" fied address and port, while PASV asks the server\n"
" for an ip address and port to connect to. <address>\n"
" should be one of:\n"
"\n"
" interface i.e \"eth0\" to specify which interface's\n"
" IP address you want to use (Unix only)\n"
"\n"
" IP address i.e \"192.168.10.1\" to specify exact IP\n"
" number\n"
"\n"
" host name i.e \"my.host.domain\" to specify machine\n"
"\n"
" - (any single-letter string) to make it\n"
" pick the machine's default\n"
"\n"
" -q If used as the first parameter on the command line,\n"
" the $HOME/.curlrc file will not be read and used as\n"
" a config file.\n"
"\n"
" -Q/--quote <comand>\n"
" (FTP) Send an arbitrary command to the remote FTP\n"
" server, by using the QUOTE command of the server.\n"
" Not all servers support this command, and the set\n"
" of QUOTE commands are server specific! Quote com-\n"
" mands are sent BEFORE the transfer is taking place.\n"
" To make commands take place after a successful\n"
" transfer, prefix them with a dash '-'. You may\n"
" specify any amount of commands to be run before and\n"
" after the transfer. If the server returns failure\n"
" for one of the commands, the entire operation will\n"
" be aborted.\n"
"\n"
" -r/--range <range>\n"
" (HTTP/FTP) Retrieve a byte range (i.e a partial\n"
" document) from a HTTP/1.1 or FTP server. Ranges can\n"
" be specified in a number of ways.\n"
"\n"
" 0-499 specifies the first 500 bytes\n"
"\n"
" 500-999 specifies the second 500 bytes\n"
"\n"
" -500 specifies the last 500 bytes\n"
"\n"
" 9500 specifies the bytes from offset 9500 and\n"
" forward\n"
"\n"
" 0-0,-1 specifies the first and last byte\n"
" only(*)(H)\n"
"\n"
" 500-700,600-799\n"
" specifies 300 bytes from offset 500(H)\n"
"\n"
" 100-199,500-599\n"
" specifies two separate 100 bytes\n"
" ranges(*)(H)\n"
"\n"
" (*) = NOTE that this will cause the server to reply with a\n"
" multipart response!\n"
"\n"
" You should also be aware that many HTTP/1.1 servers do not\n"
" have this feature enabled, so that when you attempt to get\n"
" a range, you'll instead get the whole document.\n"
"\n"
" FTP range downloads only support the simple syntax 'start-\n"
" stop' (optionally with one of the numbers omitted). It\n"
" depends on the non-RFC command SIZE.\n"
"\n"
" -s/--silent\n"
" Silent mode. Don't show progress meter or error\n"
" messages. Makes Curl mute.\n"
"\n"
" -S/--show-error\n"
" When used with -s it makes curl show error message\n"
" if it fails.\n"
"\n"
" -t/--upload\n"
" Transfer the stdin data to the specified file. Curl\n"
" will read everything from stdin until EOF and store\n"
" with the supplied name. If this is used on a\n"
" http(s) server, the PUT command will be used.\n"
"\n"
" -T/--upload-file <file>\n"
" Like -t, but this transfers the specified local\n"
" file. If there is no file part in the specified\n"
" URL, Curl will append the local file name. NOTE\n"
" that you must use a trailing / on the last direc-\n"
" tory to really prove to Curl that there is no file\n"
" name or curl will think that your last directory\n"
" name is the remote file name to use. That will most\n"
" likely cause the upload operation to fail. If this\n"
" is used on a http(s) server, the PUT command will\n"
" be used.\n"
"\n"
" -u/--user <user:password>\n"
" Specify user and password to use when fetching. See\n"
" README.curl for detailed examples of how to use\n"
" this. If no password is specified, curl will ask\n"
" for it interactively.\n"
"\n"
" -U/--proxy-user <user:password>\n"
" Specify user and password to use for Proxy\n"
" authentication. If no password is specified, curl\n"
" will ask for it interactively.\n"
"\n"
" -v/--verbose\n"
" Makes the fetching more verbose/talkative. Mostly\n"
" usable for debugging. Lines starting with '>' means\n"
" data sent by curl, '<' means data received by curl\n"
" that is hidden in normal cases and lines starting\n"
" with '*' means additional info provided by curl.\n"
"\n"
" -V/--version\n"
" Displays the full version of curl, libcurl and\n"
" other 3rd party libraries linked with the exe-\n"
" cutable.\n"
"\n"
" -w/--write-out <format>\n"
" Defines what to display after a completed and suc-\n"
" cessful operation. The format is a string that may\n"
" contain plain text mixed with any number of vari-\n"
" ables. The string can be specified as \"string\", to\n"
" get read from a particular file you specify it\n"
" \"@filename\" and to tell curl to read the format\n"
" from stdin you write \"@-\".\n"
"\n"
" The variables present in the output format will be\n"
" substituted by the value or text that curl thinks\n"
" fit, as described below. All variables are speci-\n"
" fied like %{variable_name} and to output a normal %\n"
" you just write them like %%. You can output a new-\n"
" line by using \\n, a carrige return with \\r and a\n"
" tab space with \\t.\n"
"\n"
" NOTE: The %-letter is a special letter in the\n"
" win32-environment, where all occurrences of % must\n"
" be doubled when using this option.\n"
"\n"
" Available variables are at this point:\n"
"\n"
" url_effective The URL that was fetched last. This\n"
" is mostly meaningful if you've told\n"
" curl to follow location: headers.\n"
"\n"
" http_code The numerical code that was found in\n"
" the last retrieved HTTP(S) page.\n"
"\n"
" time_total The total time, in seconds, that the\n"
" full operation lasted. The time will\n"
" be displayed with millisecond reso-\n"
" lution.\n"
"\n"
" time_namelookup\n"
" The time, in seconds, it took from\n"
" the start until the name resolving\n"
" was completed.\n"
" time_connect The time, in seconds, it took from\n"
" the start until the connect to the\n"
" remote host (or proxy) was com-\n"
" pleted.\n"
"\n"
" time_pretransfer\n"
" The time, in seconds, it took from\n"
" the start until the file transfer is\n"
" just about to begin. This includes\n"
" all pre-transfer commands and nego-\n"
" tiations that are specific to the\n"
" particular protocol(s) involved.\n"
"\n"
" size_download The total amount of bytes that were\n"
" downloaded.\n"
"\n"
" size_upload The total amount of bytes that were\n"
" uploaded.\n"
"\n"
" speed_download The average download speed that curl\n"
" measured for the complete download.\n"
"\n"
" speed_upload The average upload speed that curl\n"
" measured for the complete download.\n"
"\n"
" -x/--proxy <proxyhost[:port]>\n"
" Use specified proxy. If the port number is not\n"
" specified, it is assumed at port 1080.\n"
"\n"
" -X/--request <command>\n"
" (HTTP) Specifies a custom request to use when com-\n"
" municating with the HTTP server. The specified\n"
" request will be used instead of the standard GET.\n"
" Read the HTTP 1.1 specification for details and\n"
" explanations.\n"
"\n"
" (FTP) Specifies a custom FTP command to use instead\n"
" of LIST when doing file lists with ftp.\n"
"\n"
" -y/--speed-time <time>\n"
" If a download is slower than speed-limit bytes per\n"
" second during a speed-time period, the download\n"
" gets aborted. If speed-time is used, the default\n"
" speed-limit will be 1 unless set with -y.\n"
"\n"
" -Y/--speed-limit <speed>\n"
" If a download is slower than this given speed, in\n"
" bytes per second, for speed-time seconds it gets\n"
" aborted. speed-time is set with -Y and is 30 if not\n"
" set.\n"
"\n"
" -z/--time-cond <date expression>\n"
" (HTTP) Request to get a file that has been modified\n"
" later than the given time and date, or one that has\n"
" been modified before that time. The date expression\n"
" can be all sorts of date strings or if it doesn't\n"
" match any internal ones, it tries to get the time\n"
" from a given file name instead! See the GNU date(1)\n"
" man page for date expression details.\n"
"\n"
" Start the date expression with a dash (-) to make\n"
" it request for a document that is older than the\n"
" given date/time, default is a document that is\n"
" newer than the specified date/time.\n"
"\n"
" -3/--sslv3\n"
" (HTTPS) Forces curl to use SSL version 3 when nego-\n"
" tiating with a remote SSL server.\n"
"\n"
" -2/--sslv2\n"
" (HTTPS) Forces curl to use SSL version 2 when nego-\n"
" tiating with a remote SSL server.\n"
"\n"
" -#/--progress-bar\n"
" Make curl display progress information as a\n"
" progress bar instead of the default statistics.\n"
"\n"
" --crlf (FTP) Convert LF to CRLF in upload. Useful for MVS\n"
" (OS/390).\n"
"\n"
" --stderr <file>\n"
" Redirect all writes to stderr to the specified file\n"
" instead. If the file name is a plain '-', it is\n"
" instead written to stdout. This option has no point\n"
" when you're using a shell with decent redirecting\n"
" capabilities.\n"
"\n"
"FILES\n"
" ~/.curlrc\n"
" Default config file.\n"
"\n"
"ENVIRONMENT\n"
" HTTP_PROXY [protocol://]<host>[:port]\n"
" Sets proxy server to use for HTTP.\n"
"\n"
" HTTPS_PROXY [protocol://]<host>[:port]\n"
" Sets proxy server to use for HTTPS.\n"
"\n"
" FTP_PROXY [protocol://]<host>[:port]\n"
" Sets proxy server to use for FTP.\n"
"\n"
" GOPHER_PROXY [protocol://]<host>[:port]\n"
" Sets proxy server to use for GOPHER.\n"
"\n"
" ALL_PROXY [protocol://]<host>[:port]\n"
" Sets proxy server to use if no protocol-specific\n"
" proxy is set.\n"
" NO_PROXY <comma-separated list of hosts>\n"
" list of host names that shouldn't go through any\n"
" proxy. If set to a asterisk '*' only, it matches\n"
" all hosts.\n"
"\n"
" COLUMNS <integer>\n"
" The width of the terminal. This variable only\n"
" affects curl when the --progress-bar option is\n"
" used.\n"
"\n"
"EXIT CODES\n"
" There exists a bunch of different error codes and their\n"
" corresponding error messages that may appear during bad\n"
" conditions. At the time of this writing, the exit codes\n"
" are:\n"
"\n"
" 1 Unsupported protocol. This build of curl has no\n"
" support for this protocol.\n"
"\n"
" 2 Failed to initialize.\n"
"\n"
" 3 URL malformat. The syntax was not correct.\n"
"\n"
" 4 URL user malformatted. The user-part of the URL\n"
" syntax was not correct.\n"
"\n"
" 5 Couldn't resolve proxy. The given proxy host could\n"
" not be resolved.\n"
"\n"
" 6 Couldn't resolve host. The given remote host was\n"
" not resolved.\n"
"\n"
" 7 Failed to connect to host.\n"
"\n"
" 8 FTP weird server reply. The server sent data curl\n"
" couldn't parse.\n"
"\n"
" 9 FTP access denied. The server denied login.\n"
"\n"
" 10 FTP user/password incorrect. Either one or both\n"
" were not accepted by the server.\n"
"\n"
" 11 FTP weird PASS reply. Curl couldn't parse the reply\n"
" sent to the PASS request.\n"
"\n"
" 12 FTP weird USER reply. Curl couldn't parse the reply\n"
" sent to the USER request.\n"
"\n"
" 13 FTP weird PASV reply, Curl couldn't parse the reply\n"
" sent to the PASV request.\n"
"\n"
" 14 FTP weird 227 formay. Curl couldn't parse the\n"
" 227-line the server sent.\n"
" 15 FTP can't get host. Couldn't resolve the host IP we\n"
" got in the 227-line.\n"
"\n"
" 16 FTP can't reconnect. Couldn't connect to the host\n"
" we got in the 227-line.\n"
"\n"
" 17 FTP couldn't set binary. Couldn't change transfer\n"
" method to binary.\n"
"\n"
" 18 Partial file. Only a part of the file was trans-\n"
" fered.\n"
"\n"
" 19 FTP couldn't RETR file. The RETR command failed.\n"
"\n"
" 20 FTP write error. The transfer was reported bad by\n"
" the server.\n"
"\n"
" 21 FTP quote error. A quote command returned error\n"
" from the server.\n"
"\n"
" 22 HTTP not found. The requested page was not found.\n"
" This return code only appears if --fail is used.\n"
"\n"
" 23 Write error. Curl couldn't write data to a local\n"
" filesystem or similar.\n"
"\n"
" 24 Malformat user. User name badly specified.\n"
"\n"
" 25 FTP couldn't STOR file. The server denied the STOR\n"
" operation.\n"
"\n"
" 26 Read error. Various reading problems.\n"
"\n"
" 27 Out of memory. A memory allocation request failed.\n"
"\n"
" 28 Operation timeout. The specified time-out period\n"
" was reached according to the conditions.\n"
"\n"
" 29 FTP couldn't set ASCII. The server returned an\n"
" unknown reply.\n"
"\n"
" 30 FTP PORT failed. The PORT command failed.\n"
"\n"
" 31 FTP couldn't use REST. The REST command failed.\n"
"\n"
" 32 FTP couldn't use SIZE. The SIZE command failed. The\n"
" command is an extension to the original FTP spec\n"
" RFC 959.\n"
"\n"
" 33 HTTP range error. The range \"command\" didn't work.\n"
"\n"
" 34 HTTP post error. Internal post-request generation\n"
" error.\n"
" 35 SSL connect error. The SSL handshaking failed.\n"
"\n"
" 36 FTP bad download resume. Couldn't continue an ear-\n"
" lier aborted download.\n"
"\n"
" 37 FILE couldn't read file. Failed to open the file.\n"
" Permissions?\n"
"\n"
" 38 LDAP cannot bind. LDAP bind operation failed.\n"
"\n"
" 39 LDAP search failed.\n"
"\n"
" 40 Library not found. The LDAP library was not found.\n"
"\n"
" 41 Function not found. A required LDAP function was\n"
" not found.\n"
"\n"
" XX There will appear more error codes here in future\n"
" releases. The existing ones are meant to never\n"
" change.\n"
"\n"
"BUGS\n"
" If you do find any (or have other suggestions), mail\n"
" Daniel Stenberg <Daniel.Stenberg@haxx.nu>.\n"
"\n"
"AUTHORS / CONTRIBUTORS\n"
" - Daniel Stenberg <Daniel.Stenberg@haxx.nu>\n"
" - Rafael Sagula <sagula@inf.ufrgs.br>\n"
" - Sampo Kellomaki <sampo@iki.fi>\n"
" - Linas Vepstas <linas@linas.org>\n"
" - Bjorn Reese <breese@mail1.stofanet.dk>\n"
" - Johan Anderson <johan@homemail.com>\n"
" - Kjell Ericson <Kjell.Ericson@haxx,nu>\n"
" - Troy Engel <tengel@sonic.net>\n"
" - Ryan Nelson <ryan@inch.com>\n"
" - Bjorn Stenberg <Bjorn.Stenberg@haxx.nu>\n"
" - Angus Mackay <amackay@gus.ml.org>\n"
" - Eric Young <eay@cryptsoft.com>\n"
" - Simon Dick <simond@totally.irrelevant.org>\n"
" - Oren Tirosh <oren@monty.hishome.net>\n"
" - Steven G. Johnson <stevenj@alum.mit.edu>\n"
" - Gilbert Ramirez Jr. <gram@verdict.uthscsa.edu>\n"
" - Andr's Garc'a <ornalux@redestb.es>\n"
" - Douglas E. Wegscheid <wegscd@whirlpool.com>\n"
" - Mark Butler <butlerm@xmission.com>\n"
" - Eric Thelin <eric@generation-i.com>\n"
" - Marc Boucher <marc@mbsi.ca>\n"
" - Greg Onufer <Greg.Onufer@Eng.Sun.COM>\n"
" - Doug Kaufman <dkaufman@rahul.net>\n"
" - David Eriksson <david@2good.com>\n"
" - Ralph Beckmann <rabe@uni-paderborn.de>\n"
" - T. Yamada <tai@imasy.or.jp>\n"
" - Lars J. Aas <larsa@sim.no>\n"
" - J\"rn Hartroth <Joern.Hartroth@telekom.de>\n"
" - Matthew Clarke <clamat@van.maves.ca>\n"
" - Linus Nielsen <Linus.Nielsen@haxx.nu>\n"
" - Felix von Leitner <felix@convergence.de>\n"
" - Dan Zitter <dzitter@zitter.net>\n"
" - Jongki Suwandi <Jongki.Suwandi@eng.sun.com>\n"
" - Chris Maltby <chris@aurema.com>\n"
" - Ron Zapp <rzapper@yahoo.com>\n"
" - Paul Marquis <pmarquis@iname.com>\n"
" - Ellis Pritchard <ellis@citria.com>\n"
" - Damien Adant <dams@usa.net>\n"
" - Chris <cbayliss@csc.come>\n"
" - Marco G. Salvagno <mgs@whiz.cjb.net>\n"
"\n"
"WWW\n"
" http://curl.haxx.nu\n"
"\n"
"FTP\n"
" ftp://ftp.sunet.se/pub/www/utilities/curl/\n"
"\n"
"SEE ALSO\n"
" ftp(1), wget(1), snarf(1)\n"
"\n"
"LATEST VERSION\n"
"\n"
" You always find news about what's going on as well as the latest versions\n"
" from the curl web pages, located at:\n"
"\n"
" http://curl.haxx.nu\n"
"\n"
"SIMPLE USAGE\n"
"\n"
" Get the main page from netscape's web-server:\n"
"\n"
" curl http://www.netscape.com/\n"
"\n"
" Get the root README file from funet's ftp-server:\n"
"\n"
" curl ftp://ftp.funet.fi/README\n"
"\n"
" Get a gopher document from funet's gopher server:\n"
"\n"
" curl gopher://gopher.funet.fi\n"
"\n"
" Get a web page from a server using port 8000:\n"
"\n"
" curl http://www.weirdserver.com:8000/\n"
"\n"
" Get a list of the root directory of an FTP site:\n"
"\n"
" curl ftp://ftp.fts.frontec.se/\n"
"\n"
" Get the definition of curl from a dictionary:\n"
"\n"
" curl dict://dict.org/m:curl\n"
"\n"
"DOWNLOAD TO A FILE\n"
"\n"
" Get a web page and store in a local file:\n"
"\n"
" curl -o thatpage.html http://www.netscape.com/\n"
"\n"
" Get a web page and store in a local file, make the local file get the name\n"
" of the remote document (if no file name part is specified in the URL, this\n"
" will fail):\n"
"\n"
" curl -O http://www.netscape.com/index.html\n"
"\n"
"USING PASSWORDS\n"
"\n"
" FTP\n"
"\n"
" To ftp files using name+passwd, include them in the URL like:\n"
"\n"
" curl ftp://name:passwd@machine.domain:port/full/path/to/file\n"
"\n"
" or specify them with the -u flag like\n"
"\n"
" curl -u name:passwd ftp://machine.domain:port/full/path/to/file\n"
"\n"
" HTTP\n"
"\n"
" The HTTP URL doesn't support user and password in the URL string. Curl\n"
" does support that anyway to provide a ftp-style interface and thus you can\n"
" pick a file like:\n"
"\n"
" curl http://name:passwd@machine.domain/full/path/to/file\n"
"\n"
" or specify user and password separately like in\n"
"\n"
" curl -u name:passwd http://machine.domain/full/path/to/file\n"
"\n"
" NOTE! Since HTTP URLs don't support user and password, you can't use that\n"
" style when using Curl via a proxy. You _must_ use the -u style fetch\n"
" during such circumstances.\n"
"\n"
" HTTPS\n"
"\n"
" Probably most commonly used with private certificates, as explained below.\n"
"\n"
" GOPHER\n"
"\n"
" Curl features no password support for gopher.\n"
"\n"
"PROXY\n"
"\n"
" Get an ftp file using a proxy named my-proxy that uses port 888:\n"
"\n"
" curl -x my-proxy:888 ftp://ftp.leachsite.com/README\n"
"\n"
" Get a file from a HTTP server that requires user and password, using the\n"
" same proxy as above:\n"
"\n"
" curl -u user:passwd -x my-proxy:888 http://www.get.this/\n"
"\n"
" Some proxies require special authentication. Specify by using -U as above:\n"
"\n"
" curl -U user:passwd -x my-proxy:888 http://www.get.this/\n"
"\n"
" See also the environment variables Curl support that offer further proxy\n"
" control.\n"
"\n"
"RANGES\n"
"\n"
" With HTTP 1.1 byte-ranges were introduced. Using this, a client can request\n"
" to get only one or more subparts of a specified document. Curl supports\n"
" this with the -r flag.\n"
"\n"
" Get the first 100 bytes of a document:\n"
"\n"
" curl -r 0-99 http://www.get.this/\n"
"\n"
" Get the last 500 bytes of a document:\n"
"\n"
" curl -r -500 http://www.get.this/\n"
"\n"
" Curl also supports simple ranges for FTP files as well. Then you can only\n"
" specify start and stop position.\n"
"\n"
" Get the first 100 bytes of a document using FTP:\n"
"\n"
" curl -r 0-99 ftp://www.get.this/README \n"
"\n"
"UPLOADING\n"
"\n"
" FTP\n"
"\n"
" Upload all data on stdin to a specified ftp site:\n"
"\n"
" curl -t ftp://ftp.upload.com/myfile\n"
"\n"
" Upload data from a specified file, login with user and password:\n"
"\n"
" curl -T uploadfile -u user:passwd ftp://ftp.upload.com/myfile\n"
"\n"
" Upload a local file to the remote site, and use the local file name remote\n"
" too:\n"
" \n"
" curl -T uploadfile -u user:passwd ftp://ftp.upload.com/\n"
"\n"
" Upload a local file to get appended to the remote file using ftp:\n"
"\n"
" curl -T localfile -a ftp://ftp.upload.com/remotefile\n"
"\n"
" NOTE: Curl does not support ftp upload through a proxy! The reason for this\n"
" is simply that proxies are seldomly configured to allow this and that no\n"
" author has supplied code that makes it possible!\n"
"\n"
" HTTP\n"
"\n"
" Upload all data on stdin to a specified http site:\n"
"\n"
" curl -t http://www.upload.com/myfile\n"
"\n"
" Note that the http server must've been configured to accept PUT before this\n"
" can be done successfully.\n"
"\n"
" For other ways to do http data upload, see the POST section below.\n"
"\n"
"VERBOSE / DEBUG\n"
"\n"
" If curl fails where it isn't supposed to, if the servers don't let you\n"
" in, if you can't understand the responses: use the -v flag to get VERBOSE\n"
" fetching. Curl will output lots of info and all data it sends and\n"
" receives in order to let the user see all client-server interaction.\n"
"\n"
" curl -v ftp://ftp.upload.com/\n"
"\n"
"DETAILED INFORMATION\n"
"\n"
" Different protocols provide different ways of getting detailed information\n"
" about specific files/documents. To get curl to show detailed information\n"
" about a single file, you should use -I/--head option. It displays all\n"
" available info on a single file for HTTP and FTP. The HTTP information is a\n"
" lot more extensive.\n"
"\n"
" For HTTP, you can get the header information (the same as -I would show)\n"
" shown before the data by using -i/--include. Curl understands the\n"
" -D/--dump-header option when getting files from both FTP and HTTP, and it\n"
" will then store the headers in the specified file.\n"
"\n"
" Store the HTTP headers in a separate file:\n"
"\n"
" curl --dump-header headers.txt curl.haxx.nu\n"
"\n"
" Note that headers stored in a separate file can be very useful at a later\n"
" time if you want curl to use cookies sent by the server. More about that in\n"
" the cookies section.\n"
"\n"
"POST (HTTP)\n"
"\n"
" It's easy to post data using curl. This is done using the -d <data>\n"
" option. The post data must be urlencoded.\n"
"\n"
" Post a simple \"name\" and \"phone\" guestbook.\n"
"\n"
" curl -d \"name=Rafael%20Sagula&phone=3320780\" \\\n"
" http://www.where.com/guest.cgi\n"
"\n"
" While -d uses the application/x-www-form-urlencoded mime-type, generally\n"
" understood by CGIs and similar, curl also supports the more capable\n"
" multipart/form-data type. This latter type supports things like file upload.\n"
"\n"
" -F accepts parameters like -F \"name=contents\". If you want the contents to\n"
" be read from a file, use <@filename> as contents. When specifying a file,\n"
" you can also specify which content type the file is, by appending\n"
" ';type=<mime type>' to the file name. You can also post the contents of\n"
" several files in one field. For example, the field name 'coolfiles' can\n"
" be used to send three files with different content types like this:\n"
"\n"
" curl -F \"coolfiles=@fil1.gif;type=image/gif,fil2.txt,fil3.html\" \\\n"
" http://www.post.com/postit.cgi\n"
"\n"
" If content-type is not specified, curl will try to guess it from the file\n"
" extension (it only knows a few), or use the previously specified type\n"
" (from an earlier file if several files are specified in a list), or\n"
" finally fall back to the default type 'text/plain'.\n"
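"\n"
" For example (file and server names made up here), post a single file with\n"
" an explicitly set content type:\n"
"\n"
" curl -F \"page=@index.html;type=text/html\" http://www.post.com/postit.cgi\n"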
"\n"
" Emulate a fill-in form with -F. Let's say you fill in three fields in a\n"
" form. One field is the file name to post, one field is your name and one\n"
" field is a file description. We want to post the file we have written named\n"
" \"cooltext.txt\". To let curl do the posting of this data instead of your\n"
" favourite browser, you have to check out the HTML of the form page to get to\n"
" know the names of the input fields. In our example, the input field names are\n"
" 'file', 'yourname' and 'filedescription'.\n"
"\n"
" curl -F \"file=@cooltext.txt\" -F \"yourname=Daniel\" \\\n"
" -F \"filedescription=Cool text file with cool text inside\" \\\n"
" http://www.post.com/postit.cgi\n"
"\n"
" So, to send two files in one post you can do it in two ways:\n"
"\n"
" 1. Send multiple files in a single \"field\" with a single field name:\n"
"\n"
" curl -F \"pictures=@dog.gif,cat.gif\"\n"
"\n"
" 2. Send two fields with two field names:\n"
"\n"
" curl -F \"docpicture=@dog.gif\" -F \"catpicture=@cat.gif\"\n"
"\n"
"REFERER\n"
"\n"
" An HTTP request has the option to include information about which address\n"
" referred the user to the actual page, and curl allows you to specify that\n"
" referrer on the command line. It is especially useful to fool or trick\n"
" stupid servers or CGI scripts that rely on that information being\n"
" available or containing certain data.\n"
"\n"
" curl -e www.coolsite.com http://www.showme.com/\n"
"\n"
"USER AGENT\n"
"\n"
" An HTTP request has the option to include information about the browser\n"
" that generated the request. Curl allows it to be specified on the command\n"
" line. It is especially useful to fool or trick stupid servers or CGI\n"
" scripts that only accept certain browsers.\n"
"\n"
" Example:\n"
"\n"
" curl -A 'Mozilla/3.0 (Win95; I)' http://www.nationsbank.com/\n"
"\n"
" Other common strings:\n"
" 'Mozilla/3.0 (Win95; I)' Netscape Version 3 for Windows 95\n"
" 'Mozilla/3.04 (Win95; U)' Netscape Version 3 for Windows 95\n"
" 'Mozilla/2.02 (OS/2; U)' Netscape Version 2 for OS/2\n"
" 'Mozilla/4.04 [en] (X11; U; AIX 4.2; Nav)' NS for AIX\n"
" 'Mozilla/4.05 [en] (X11; U; Linux 2.0.32 i586)' NS for Linux\n"
"\n"
" Note that Internet Explorer tries hard to be compatible in every way:\n"
" 'Mozilla/4.0 (compatible; MSIE 4.01; Windows 95)' MSIE for W95\n"
"\n"
" Mozilla is not the only possible User-Agent name:\n"
" 'Konqueror/1.0' KDE File Manager desktop client\n"
" 'Lynx/2.7.1 libwww-FM/2.14' Lynx command line browser\n"
"\n"
"COOKIES\n"
"\n"
" Cookies are generally used by web servers to keep state information at the\n"
" client's side. The server sets cookies by sending a response line in the\n"
" headers that looks like 'Set-Cookie: <data>' where the data part then\n"
" typically contains a set of NAME=VALUE pairs (separated by semicolons ';'\n"
" like \"NAME1=VALUE1; NAME2=VALUE2;\"). The server can also specify for what\n"
" path the \"cookie\" should be used (by specifying \"path=value\"), when the\n"
" cookie should expire (\"expires=DATE\"), for what domain to use it\n"
" (\"domain=NAME\") and if it should be used on secure connections only\n"
" (\"secure\").\n"
"\n"
" If you've received a page from a server that contains a header like:\n"
" Set-Cookie: sessionid=boo123; path=\"/foo\";\n"
"\n"
" it means the server wants that first pair passed on when we get anything in\n"
" a path beginning with \"/foo\".\n"
"\n"
" Example, get a page that wants my name passed in a cookie:\n"
"\n"
" curl -b \"name=Daniel\" www.sillypage.com\n"
"\n"
" Curl also has the ability to use previously received cookies in following\n"
" sessions. If you get cookies from a server and store them in a file in a\n"
" manner similar to:\n"
"\n"
" curl --dump-header headers www.example.com\n"
"\n"
" ... you can then in a second connect to that (or another) site, use the\n"
" cookies from the 'headers' file like:\n"
"\n"
" curl -b headers www.example.com\n"
"\n"
" Note that by specifying -b you enable the \"cookie awareness\" and with -L\n"
" you can make curl follow a location: (which often is used in combination\n"
" with cookies). So if a site sends cookies and a location, you can use a\n"
" non-existing file to trigger the cookie awareness like:\n"
"\n"
" curl -L -b empty-file www.example.com\n"
"\n"
" The file to read cookies from must be formatted using plain HTTP headers OR\n"
" as netscape's cookie file. Curl will determine what kind it is based on the\n"
" file contents.\n"
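"\n"
" Just as an illustration (all values made up), a line in a netscape-style\n"
" cookie file might look something like this, with tab-separated fields for\n"
" domain, subdomain flag, path, secure flag, expiry time, name and value:\n"
"\n"
" .example.com TRUE / FALSE 946684799 name Daniel\n"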
"\n"
"PROGRESS METER\n"
"\n"
" The progress meter exists to show a user that something actually is\n"
" happening. The different fields in the output have the following meaning:\n"
"\n"
" % Total % Received % Xferd Average Speed Time Curr.\n"
" Dload Upload Total Current Left Speed\n"
" 0 151M 0 38608 0 0 9406 0 4:41:43 0:00:04 4:41:39 9287\n"
"\n"
" From left-to-right:\n"
" % - percentage completed of the whole transfer\n"
" Total - total size of the whole expected transfer\n"
" % - percentage completed of the download\n"
" Received - currently downloaded amount of bytes\n"
" % - percentage completed of the upload\n"
" Xferd - currently uploaded amount of bytes\n"
" Average Speed\n"
" Dload - the average transfer speed of the download\n"
" Average Speed\n"
" Upload - the average transfer speed of the upload\n"
" Time Total - expected time to complete the operation\n"
" Time Current - time passed since the invocation\n"
" Time Left - expected time left to completion\n"
" Curr.Speed - the average transfer speed over the last 5 seconds (the\n"
" first 5 seconds of a transfer are of course based on less time)\n"
"\n"
" The -# option will display a totally different progress bar that doesn't\n"
" need much explanation!\n"
"\n"
"SPEED LIMIT\n"
"\n"
" Curl allows the user to set conditions regarding transfer speed that must\n"
" be met to let the transfer keep going. By using the switches -y and -Y\n"
" you can make curl abort transfers if the transfer speed doesn't exceed\n"
" your given lowest limit for a specified time.\n"
"\n"
" To let curl abandon downloading this page if it's slower than 3000 bytes\n"
" per second for 1 minute, run:\n"
"\n"
" curl -y 3000 -Y 60 www.far-away-site.com\n"
"\n"
" This can very well be used in combination with the overall time limit, so\n"
" that the above operation must be completed as a whole within 30 minutes:\n"
"\n"
" curl -m 1800 -y 3000 -Y 60 www.far-away-site.com\n"
"\n"
"CONFIG FILE\n"
"\n"
" Curl automatically tries to read the .curlrc file (or _curlrc file on win32\n"
" systems) from the user's home dir on startup. The config file should be\n"
" made up with normal command line switches. Comments can be used within the\n"
" file. If the first letter on a line is a '#' character, the rest of the\n"
" line is treated as a comment.\n"
"\n"
" Example, set default time out and proxy in a config file:\n"
"\n"
" # We want a 30 minute timeout:\n"
" -m 1800\n"
" # ... and we use a proxy for all accesses:\n"
" -x proxy.our.domain.com:8080\n"
"\n"
" White spaces ARE significant at the end of lines, but all white spaces\n"
" leading up to the first characters of each line are ignored.\n"
"\n"
" Prevent curl from reading the default file by using -q as the first command\n"
" line parameter, like:\n"
"\n"
" curl -q www.thatsite.com\n"
"\n"
" Force curl to get and display a local help page in case it is invoked\n"
" without URL by making a config file similar to:\n"
"\n"
" # default url to get\n"
" http://help.with.curl.com/curlhelp.html\n"
"\n"
" You can specify another config file to be read by using the -K/--config\n"
" flag. If you set config file name to \"-\" it'll read the config from stdin,\n"
" which can be handy if you want to hide options from being visible in process\n"
" tables etc:\n"
"\n"
" echo \"-u user:passwd\" | curl -K - http://that.secret.site.com\n"
"\n"
"EXTRA HEADERS\n"
"\n"
" When using curl in your own very special programs, you may end up needing\n"
" to pass on your own custom headers when getting a web page. You can do\n"
" this by using the -H flag.\n"
"\n"
" Example, send the header \"X-you-and-me: yes\" to the server when getting a\n"
" page:\n"
"\n"
" curl -H \"X-you-and-me: yes\" www.love.com\n"
"\n"
" This can also be useful in case you want curl to send a different text in\n"
" a header than it normally does. The -H header you specify then replaces the\n"
" header curl would normally send.\n"
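"\n"
" For example (header value made up here), replace the User-Agent header\n"
" curl would otherwise send:\n"
"\n"
" curl -H \"User-Agent: SuperAgent/1.0\" www.love.com\n"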
"\n"
"FTP and PATH NAMES\n"
"\n"
" Do note that when getting files with the ftp:// URL, the given path is\n"
" relative to the directory you enter. To get the file 'README' from your\n"
" home directory at your ftp site, do:\n"
"\n"
" curl ftp://user:passwd@my.site.com/README\n"
"\n"
" But if you want the README file from the root directory of that very same\n"
" site, you need to specify the absolute file name:\n"
"\n"
" curl ftp://user:passwd@my.site.com//README\n"
"\n"
" (I.e. with an extra slash in front of the file name.)\n"
"\n"
"FTP and firewalls\n"
"\n"
" The FTP protocol requires one of the involved parties to open a second\n"
" connection as soon as data is about to get transferred. There are two\n"
" ways to do this.\n"
"\n"
" The default way for curl is to issue the PASV command, which causes the\n"
" server to open another port and await another connection performed by the\n"
" client. This is good if the client is behind a firewall that doesn't\n"
" allow incoming connections.\n"
"\n"
" curl ftp.download.com\n"
"\n"
" If the server, for example, is behind a firewall that doesn't allow\n"
" connections on ports other than 21 (or if it just doesn't support the\n"
" PASV command), the other way to do it is to use the PORT command and\n"
" instruct the server to connect to the client on the given IP number and\n"
" port (given as parameters to the PORT command).\n"
"\n"
" The -P flag to curl allows for different options. Your machine may have\n"
" several IP addresses and/or network interfaces and curl allows you to\n"
" select which of them to use. The default address can also be used:\n"
"\n"
" curl -P - ftp.download.com\n"
"\n"
" Download with PORT but use the IP address of our 'le0' interface:\n"
"\n"
" curl -P le0 ftp.download.com\n"
"\n"
" Download with PORT but use 192.168.0.10 as our IP address to use:\n"
"\n"
" curl -P 192.168.0.10 ftp.download.com\n"
"\n"
"HTTPS\n"
"\n"
" Secure HTTP requires SSL libraries to be installed and used when curl is\n"
" built. If that is done, curl is capable of retrieving and posting\n"
" documents using the HTTPS protocol.\n"
"\n"
" Example:\n"
"\n"
" curl https://www.secure-site.com\n"
"\n"
" Curl is also capable of using your personal certificates to get/post files\n"
" from sites that require valid certificates. The only drawback is that the\n"
" certificate needs to be in PEM-format. PEM is a standard and open format to\n"
" store certificates with, but it is not used by the most commonly used\n"
" browsers (Netscape and MSIE both use the so called PKCS#12 format). If you\n"
" want curl to use the certificates you use with your (favourite) browser, you\n"
" may need to download/compile a converter that can convert your browser's\n"
" formatted certificates to PEM formatted ones. This kind of converter is\n"
" included in recent versions of OpenSSL, and for older versions Dr Stephen\n"
" N. Henson has written a patch for SSLeay that adds this functionality. You\n"
" can get his patch (that requires an SSLeay installation) from his site at:\n"
" http://www.drh-consultancy.demon.co.uk/\n"
"\n"
" Example of how to automatically retrieve a document using a certificate\n"
" with a personal password:\n"
"\n"
" curl -E /path/to/cert.pem:password https://secure.site.com/\n"
"\n"
" If you neglect to specify the password on the command line, you will be\n"
" prompted for the correct password before any data can be received.\n"
"\n"
" Many older SSL servers have problems with SSLv3 or TLS, which newer\n"
" versions of OpenSSL etc use, so it is sometimes useful to specify which\n"
" SSL version curl should use. Use -3 or -2 to specify the exact SSL\n"
" version to use:\n"
"\n"
" curl -2 https://secure.site.com/\n"
"\n"
" Otherwise, curl will first attempt to use v3 and then v2.\n"
"\n"
"RESUMING FILE TRANSFERS\n"
"\n"
" To continue a file transfer where it was previously aborted, curl supports\n"
" resume on http(s) downloads as well as ftp uploads and downloads.\n"
"\n"
" Continue downloading a document:\n"
"\n"
" curl -c -o file ftp://ftp.server.com/path/file\n"
"\n"
" Continue uploading a document(*1):\n"
"\n"
" curl -c -T file ftp://ftp.server.com/path/file\n"
"\n"
" Continue downloading a document from a web server(*2):\n"
"\n"
" curl -c -o file http://www.server.com/\n"
"\n"
" (*1) = This requires that the ftp server supports the non-standard command\n"
" SIZE. If it doesn't, curl will say so.\n"
"\n"
" (*2) = This requires that the web server supports at least HTTP/1.1. If\n"
" it doesn't, curl will say so.\n"
"\n"
"TIME CONDITIONS\n"
"\n"
" HTTP allows a client to specify a time condition for the document it\n"
" requests, using either If-Modified-Since or If-Unmodified-Since. Curl\n"
" allows you to specify them with the -z/--time-cond flag.\n"
"\n"
" For example, you can easily make a download that only gets performed if\n"
" the remote file is newer than a local copy. It would be done like:\n"
"\n"
" curl -z local.html http://remote.server.com/remote.html\n"
"\n"
" Or you can download a file only if the local file is newer than the remote\n"
" one. Do this by prepending the date string with a '-', as in:\n"
"\n"
" curl -z -local.html http://remote.server.com/remote.html\n"
"\n"
" You can specify a \"free text\" date as condition. Tell curl to only download\n"
" the file if it was updated since yesterday:\n"
"\n"
" curl -z yesterday http://remote.server.com/remote.html\n"
"\n"
" Curl will then accept a wide range of date formats. You can always\n"
" reverse the date check by prepending the date with a dash '-'.\n"
"\n"
"DICT\n"
"\n"
" For fun try\n"
"\n"
" curl dict://dict.org/m:curl\n"
" curl dict://dict.org/d:heisenbug:jargon\n"
" curl dict://dict.org/d:daniel:web1913\n"
"\n"
" Aliases for 'm' are 'match' and 'find', and aliases for 'd' are 'define'\n"
" and 'lookup'. For example,\n"
"\n"
" curl dict://dict.org/find:curl\n"
"\n"
" Commands that break the URL description of the RFC (but not the DICT\n"
" protocol) are\n"
"\n"
" curl dict://dict.org/show:db\n"
" curl dict://dict.org/show:strat\n"
"\n"
" Authentication is still missing (but this is not required by the RFC).\n"
"\n"
"LDAP\n"
"\n"
" If you have installed the OpenLDAP library, curl can take advantage of it\n"
" and offer ldap:// support.\n"
"\n"
" LDAP is a complex thing and writing an LDAP query is not an easy task. I\n"
" advise you to dig up the syntax description for that elsewhere; RFC 1959\n"
" if no other place is better.\n"
"\n"
" To show you an example, this is how I can get all people from my local\n"
" LDAP server that have a certain sub-domain in their email address:\n"
"\n"
" curl -B \"ldap://ldap.frontec.se/o=frontec??sub?mail=*sth.frontec.se\"\n"
"\n"
" If I want the same info in HTML format, I can get it by not using the -B\n"
" (enforce ASCII) flag.\n"
"\n"
"ENVIRONMENT VARIABLES\n"
"\n"
" Curl reads and understands the following environment variables:\n"
"\n"
" HTTP_PROXY, HTTPS_PROXY, FTP_PROXY, GOPHER_PROXY\n"
"\n"
" They should be set for protocol-specific proxies. A general proxy should\n"
" be set with\n"
"\n"
" ALL_PROXY\n"
"\n"
" A comma-separated list of host names that shouldn't go through any proxy\n"
" is set in (an asterisk, '*', alone matches all hosts)\n"
"\n"
" NO_PROXY\n"
"\n"
" If a tail substring of the domain-path for a host matches one of these\n"
" strings, transactions with that node will not be proxied.\n"
"\n"
"\n"
" The usage of the -x/--proxy flag overrides the environment variables.\n"
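"\n"
" For example (proxy name made up here), use a different proxy than the\n"
" environment variables specify for one single fetch:\n"
"\n"
" curl -x other.proxy.com:8080 www.example.com\n"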
"\n"
"NETRC\n"
"\n"
" Unix introduced the .netrc concept a long time ago. It is a way for a\n"
" user to specify name and password for commonly visited ftp sites in a\n"
" file so that you don't have to type them in each time you visit those\n"
" sites. You realize this is a big security risk if someone else gets hold\n"
" of your passwords, so therefore most unix programs won't read this file\n"
" unless it is only readable by yourself (curl doesn't care though).\n"
"\n"
" Curl supports .netrc files if told to (using the -n/--netrc option). This\n"
" is not restricted to ftp only; curl can use it for all protocols where\n"
" authentication is used.\n"
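"\n"
" For example (assuming a matching 'machine' entry exists in your .netrc),\n"
" fetch a password protected page without typing the password:\n"
"\n"
" curl -n http://www.password-site.com/secret.html\n"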
"\n"
" A very simple .netrc file could look something like:\n"
"\n"
" machine curl.haxx.nu login iamdaniel password mysecret\n"
"\n"
"CUSTOM OUTPUT\n"
"\n"
" To better allow script programmers to get to know about the progress of\n"
" curl, the -w/--write-out option was introduced. Using this, you can specify\n"
" what information from the previous transfer you want to extract.\n"
"\n"
" To display the amount of bytes downloaded together with some text and an\n"
" ending newline:\n"
"\n"
" curl -w 'We downloaded %{size_download} bytes\\n' www.download.com\n"
"\n"
"MAILING LIST\n"
"\n"
" We have an open mailing list to discuss curl, its development and things\n"
" relevant to this.\n"
"\n"
" To subscribe, mail curl-request@contactor.se with \"subscribe <your email\n"
" address>\" in the body.\n"
"\n"
" To post to the list, mail curl@contactor.se.\n"
"\n"
" To unsubscribe, mail curl-request@contactor.se with \"unsubscribe <your\n"
" subscribed email address>\" in the body.\n"
"\n"
) ;
}