cURL Cheat Sheet

curl is the Swiss Army knife for gathering information while troubleshooting websites. There are many ways to use curl; this guide simply documents the invocations I reach for most often but can never seem to remember the specific flags for.

General Usage

Test a page behind a username/password prompt:

[user@workstation01 ~]# curl --user name:password http://www.example.com
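
To avoid leaving the password in your shell history, you can supply just the username and let curl prompt for the password interactively:

[user@workstation01 ~]# curl --user name http://www.example.com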

Download a file from GitHub:

[user@workstation01 ~]# curl -O https://raw.github.com/username/reponame/master/filename
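
Note that raw.github.com now answers with a redirect to raw.githubusercontent.com, so adding -L lets curl follow it; -O alone would save the short redirect response rather than the file:

[user@workstation01 ~]# curl -LO https://raw.github.com/username/reponame/master/filename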

Download content via curl to a specific filename:

[user@workstation01 ~]# curl -o archive.tgz https://www.example.com/file.tgz
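
If the transfer gets interrupted, -C - tells curl to inspect the partially downloaded file and resume from where it left off:

[user@workstation01 ~]# curl -C - -o archive.tgz https://www.example.com/file.tgz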

Run a script from a remote source on the server. Understand the security implications of doing this, as it can be dangerous!

[user@workstation01 ~]# source <(curl -sL https://www.example.com/your_script.sh)
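
A safer pattern is to download the script first, review it, and only then execute it (the filename here is just an example):

[user@workstation01 ~]# curl -fsSL https://www.example.com/your_script.sh -o your_script.sh
[user@workstation01 ~]# less your_script.sh
[user@workstation01 ~]# bash your_script.sh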

Have curl make the connection through a specific interface on the server:

[user@workstation01 ~]# curl --interface bond0 icanhazip.com
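
--interface also accepts an IP address, which is handy when the server has multiple addresses bound to a single interface (the address below is just a placeholder):

[user@workstation01 ~]# curl --interface 123.123.123.123 icanhazip.com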

Website Troubleshooting

Show the full HTTP request and response headers, following all redirects and discarding the page body:

[user@workstation01 ~]# curl -Lsvo /dev/null https://www.example.com
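
If you only want the response headers without the verbose connection detail, -D - (--dump-header) writes them to stdout while the body still goes to /dev/null:

[user@workstation01 ~]# curl -sLo /dev/null -D - https://www.example.com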

Test a domain hosted on the local server, bypassing DNS:

[user@workstation01 ~]# curl -sLH "Host: www.example.com" localhost
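
If the local site only answers over HTTPS, the same trick works; add -k so curl ignores the certificate name mismatch against localhost:

[user@workstation01 ~]# curl -skLH "Host: www.example.com" https://localhost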

Test a domain against a specific IP, bypassing the need to modify /etc/hosts:

[user@workstation01 ~]# curl -IL https://www.example.com --resolve www.example.com:443:123.123.123.123

or 

[user@workstation01 ~]# curl -Lsvo /dev/null --resolve 'example.com:443:123.123.123.123' https://www.example.com/

or

[user@workstation01 ~]# curl -Lsvo /dev/null --header "Host: example.com" https://123.123.123.123/
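
On curl 7.49 and newer you can also use --connect-to, which works like --resolve but accepts a hostname or a different port as the target:

[user@workstation01 ~]# curl -Lsvo /dev/null --connect-to www.example.com:443:123.123.123.123:443 https://www.example.com/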

Send a request using a specific user-agent. Sometimes a server has rules in place that block curl's default user-agent string, or you may need to test how a site responds to a particular client. Pass a custom user-agent string by running:

[user@workstation01 ~]# curl --user-agent "USER_AGENT_STRING_HERE" www.example.com

A comprehensive listing of user-agent strings resides at:
http://www.useragentstring.com/pages/useragentstring.php

For example, let's say you need to test with a mobile device's user-agent to see if a custom redirect works:

[user@workstation01 ~]# curl -H "User-Agent: Mozilla/5.0 (Linux; Android 4.0.4; Galaxy Nexus Build/IMM76B) AppleWebKit/535.19 (KHTML, like Gecko) Chrome/18.0.1025.133 Mobile Safari/535.19" -IL http://www.example.com/about/contact-us

HTTP/1.1 301 Moved Permanently
Date: Tue, 17 Nov 2015 18:10:09 GMT
Server: Apache/2.4.6 (CentOS) OpenSSL/1.0.1e-fips
Location: http://www.example.com/mobi/contact.php
Content-Type: text/html; charset=iso-8859-1

SSL/TLS Testing

Test whether the site supports specific SSL/TLS protocol versions. Note that SSLv2 and SSLv3 are long deprecated, and many modern curl/OpenSSL builds refuse to negotiate them, so the first two flags may simply return an error:

[user@workstation01 ~]# curl --sslv2 https://www.example.com
[user@workstation01 ~]# curl --sslv3 https://www.example.com
[user@workstation01 ~]# curl --tlsv1 https://www.example.com
[user@workstation01 ~]# curl --tlsv1.0 https://www.example.com
[user@workstation01 ~]# curl --tlsv1.1 https://www.example.com
[user@workstation01 ~]# curl --tlsv1.2 https://www.example.com
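
Newer curl releases (7.54+) also support --tls-max, which lets you pin the handshake to exactly one protocol version rather than setting only a minimum:

[user@workstation01 ~]# curl --tlsv1.2 --tls-max 1.2 https://www.example.com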

Performance Troubleshooting

Load times can be affected by a number of things: the TLS handshake, DNS lookup time, redirects, transfers, uploads/downloads, and so on. The curl command below breaks down the timing of each stage:

[user@workstation01 ~]# curl -Lsvo /dev/null https://www.example.com/ -w "\nContent Type: %{content_type} \
\nHTTP Code: %{http_code} \
\nHTTP Connect: %{http_connect} \
\nNumber Connects: %{num_connects} \
\nNumber Redirects: %{num_redirects} \
\nRedirect URL: %{redirect_url} \
\nSize Download: %{size_download} \
\nSize Upload: %{size_upload} \
\nSSL Verify: %{ssl_verify_result} \
\nTime Handshake: %{time_appconnect} \
\nTime Connect: %{time_connect} \
\nName Lookup Time: %{time_namelookup} \
\nTime Pretransfer: %{time_pretransfer} \
\nTime Redirect: %{time_redirect} \
\nTime Start Transfer (TTFB): %{time_starttransfer} \
\nTime Total: %{time_total} \
\nEffective URL: %{url_effective}\n" 2>&1

Example output is below. Note that each time value is cumulative, measured from the start of the request:

...
HTTP Code: 200 
HTTP Connect: 000 
Number Connects: 2 
Number Redirects: 1 
Redirect URL:  
Size Download: 136112 
Size Upload: 0 
SSL Verify: 0 
Time Handshake: 0.689 
Time Connect: 0.556 
Name Lookup Time: 0.528 
Time Pretransfer: 0.689 
Time Redirect: 0.121 
Time Start Transfer (TTFB): 0.738 
Time Total: 0.962 
Effective URL: https://www.example.com/

Another example for quickly checking performance is below. For simplicity, first create a file containing the -w format string:

[user@workstation01 ~]# vim site-performance.cfg
\n
      DNS lookup                          :  %{time_namelookup}\n
      Connect to server (TCP)             :  %{time_connect}\n
      Connect to server (HTTP/S)          :  %{time_appconnect}\n
      Time from start until transfer began:  %{time_pretransfer}\n
      Time for redirection (if any)       :  %{time_redirect}\n
      Total time before transfer started  :  %{time_starttransfer}\n
\n
             Total time                   :  %{time_total}\n
             Size of download (bytes)     :  %{size_download}\n
             Average d/l speed (bytes/s)  :  %{speed_download}\n
\n

Then run it:

[user@workstation01 ~]# curl -w "@site-performance.cfg" -o /dev/null -s https://www.example.com
      DNS lookup                          :  0.138664
      Connect to server (TCP)             :  0.171131
      Connect to server (HTTP/S)          :  0.268969
      Time from start until transfer began:  0.269021
      Time for redirection (if any)       :  0.000000
      Total time before transfer started  :  0.532772

             Total time                   :  0.628730
             Size of download (bytes)     :  162510
             Average d/l speed (bytes/s)  :  258473.000
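
Single measurements can be noisy; repeating the request a few times gives a better picture. A quick loop using only fields from the examples above:

[user@workstation01 ~]# for i in {1..5}; do curl -so /dev/null -w "%{time_total}\n" https://www.example.com; done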

If a web server hosts many sites and is under high load, how can you narrow down which site is likely causing it? One way is to see which site takes the longest to load, since a slow response may indicate a resource-intensive site. See the example below:

[user@workstation01 ~]# for i in www.example1.com www.example2.com www.example3.com; do echo -n "$i "; (time curl -sIL "$i" -X GET) 2>&1 | grep -E "real|HTTP"; echo; done

www.example1.com HTTP/1.1 200 OK
real	0m0.642s

www.example2.com HTTP/1.1 200 OK
real	0m2.234s

www.example3.com HTTP/1.1 200 OK
real	0m0.421s

So www.example2.com takes about 2 seconds to load. What happens to load times on that domain under increased traffic? The example below sends 25 concurrent requests to the domain:

[user@workstation01 ~]# for i in {1..25}; do (time curl -sIL http://www.example2.com -X GET) 2>&1 | grep -E "real|HTTP" & done

HTTP/1.1 200 OK
real	0m11.297s
HTTP/1.1 200 OK
real	0m11.395s
HTTP/1.1 200 OK
real	0m11.906s
HTTP/1.1 200 OK
real	0m12.079s
...
HTTP/1.1 200 OK
real	0m11.297s

Determining why this happens requires investigation outside the scope of this article, mainly around caching the site or otherwise optimizing it. However, at least now we know which site doesn't perform well under increased requests and may be causing the high server load.