Overview

The cURL program is widely available across many different platforms, which makes it an obvious choice for network testing. It is simple, scriptable, and flexible, which is exactly what makes it so powerful. It supports many protocols, but we are going to focus on HTTP in this article.

The basic syntax for a cURL command is pretty straightforward: just add the destination URL:

$ curl http://google.com

For this simple command, curl fetches the URL and prints the response body to your console. That usually means a bunch of HTML. For the example command above, we get the following:

$ curl http://google.com
<HTML><HEAD><meta http-equiv="content-type" content="text/html;charset=utf-8">
<TITLE>301 Moved</TITLE></HEAD><BODY>
<H1>301 Moved</H1>
The document has moved
<A HREF="http://www.google.com/">here</A>.
</BODY></HTML>

Remember, curl is not a browser, so by default it doesn’t follow redirects. It simply executes the single command that you gave it (in this case, an HTTP GET). You can include the response headers in the output by adding a -i flag to your command:

$ curl -i http://google.com
HTTP/1.1 301 Moved Permanently
Location: http://www.google.com/
Content-Type: text/html; charset=UTF-8
Date: Thu, 10 Aug 2017 23:29:44 GMT
Expires: Sat, 09 Sep 2017 23:29:44 GMT
Cache-Control: public, max-age=2592000
Server: gws
Content-Length: 219
X-XSS-Protection: 1; mode=block
X-Frame-Options: SAMEORIGIN

<HTML><HEAD><meta http-equiv="content-type" content="text/html;charset=utf-8">
<TITLE>301 Moved</TITLE></HEAD><BODY>
<H1>301 Moved</H1>
The document has moved
<A HREF="http://www.google.com/">here</A>.
</BODY></HTML>

If you do want curl to follow the redirect, just add the -L parameter.
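
For example, this invocation (a minimal sketch) follows the 301 above and, thanks to -i, prints the response headers for every hop along the way:

$ curl -i -L http://google.com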

Now, this may be interesting for one-off manual tests, but probably not for automated testing. Fortunately, curl allows us to customize and format the command output.

Output Formatting

Curl has a -w flag which tells curl to write out a string of your choosing after the transfer has completed. The string can include variables that curl fills in with details about the transfer. Here is a list of the available variables:

  • content_type
  • filename_effective
  • ftp_entry_path
  • http_code
  • http_connect
  • http_version
  • local_ip
  • local_port
  • num_connects
  • num_redirects
  • proxy_ssl_verify_result
  • redirect_url
  • remote_ip
  • remote_port
  • scheme
  • size_download
  • size_header
  • size_request
  • size_upload
  • speed_download
  • speed_upload
  • ssl_verify_result
  • time_appconnect
  • time_connect
  • time_namelookup
  • time_pretransfer
  • time_redirect
  • time_starttransfer
  • time_total
  • url_effective

The man page for curl contains more detailed information about each variable (including units, etc.).

If we add a few output variables to our original example, we get the following:

$ curl http://google.com -w "%{time_connect},%{time_total},%{speed_download},%{http_code},%{size_download},%{url_effective}\n"
<HTML><HEAD><meta http-equiv="content-type" content="text/html;charset=utf-8">
<TITLE>301 Moved</TITLE></HEAD><BODY>
<H1>301 Moved</H1>
The document has moved
<A HREF="http://www.google.com/">here</A>.
</BODY></HTML>
0.011,0.047,4657.000,301,219,http://google.com/

Notice that the -w format string can include literal characters (the commas and the \n above) alongside the variables. This means we can get a nicely formatted CSV line at the end of our command output.
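
As a side note, if the format string starts to get unwieldy on the command line, -w can also read it from a file by prefixing the file name with @. Assuming we saved our format string to a file named fmt.txt (the name is just an example):

$ curl http://google.com -w @fmt.txt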

But in a performance script, we probably wouldn’t want the actual page content. For that, we can add a -o flag and send the output to /dev/null (a.k.a., oblivion). Observe:

$ curl http://google.com -w "%{time_connect},%{time_total},%{speed_download},%{http_code},%{size_download},%{url_effective}\n" -o /dev/null
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   219  100   219    0     0   1256      0 --:--:-- --:--:-- --:--:--  1258
0.140,0.174,1256.000,301,219,http://google.com/

Wait! That got rid of the content, but replaced it with a progress table. That table is very useful when you are watching a long-running transfer, but it is not helpful in our automated scripting scenario. Fortunately, the -s (“silent”) option tells curl to keep that progress to itself.

$ curl http://google.com -w "%{time_connect},%{time_total},%{speed_download},%{http_code},%{size_download},%{url_effective}\n" -o /dev/null -s
0.202,0.237,924.000,301,219,http://google.com/

Perfect! Now we can append that output onto our results file:

$ curl http://google.com -w "%{time_connect},%{time_total},%{speed_download},%{http_code},%{size_download},%{url_effective}\n" -o /dev/null -s >> myResults.csv
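
To make that results file easier to work with later, we can write a header row once and then loop over a whole list of URLs. A quick sketch, assuming our URLs live one per line in a (hypothetical) file called urls.txt:

$ echo "time_connect,time_total,speed_download,http_code,size_download,url_effective" > myResults.csv
$ while read url; do curl "$url" -w "%{time_connect},%{time_total},%{speed_download},%{http_code},%{size_download},%{url_effective}\n" -o /dev/null -s >> myResults.csv; done < urls.txt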

Dynamic URLs

While single URLs can be useful, more often we have groups of URLs we want to test. Luckily, cURL has built-in URL globbing for expressing multiple variations in a single command. Using square brackets [] for ranges and curly braces {} for sets, we can tell curl to do some interesting things. For example:

$ curl "https://www.google.com/search?q=[1985-1990]" -w "%{time_connect},%{time_total},%{speed_download},%{http_code},%{size_download},%{url_effective}\n" -o /dev/null -s
0.073,1.155,5008.000,403,5785,https://www.google.com/search?q=1985
0.000,1.045,5535.000,403,5785,https://www.google.com/search?q=1986
0.000,1.043,5548.000,403,5785,https://www.google.com/search?q=1987
0.000,1.044,5541.000,403,5785,https://www.google.com/search?q=1988
0.000,1.084,5336.000,403,5785,https://www.google.com/search?q=1989
0.000,1.285,4488.000,403,5768,https://www.google.com/search?q=1990

We just searched for every year from 1985 to 1990 with a single curl command. When we specify a range like this in the URL, curl simply goes through each value one at a time (note the quotes around the URL, which keep the shell from interpreting the brackets itself). We can also use curly braces to create a list of queries:

$ curl "http://www.google.com/search?q={jurassic%20park,jumanji,armageddon}" -w "%{time_connect},%{time_total},%{speed_download},%{http_code},%{size_download},%{url_effective}\n" -o /dev/null -s
0.007,1.237,4367.000,403,5404,http://www.google.com/search?q=jurassic%20park
0.000,1.043,5161.000,403,5384,http://www.google.com/search?q=jumanji
0.000,1.107,4865.000,403,5387,http://www.google.com/search?q=armageddon

At this point, you may have noticed that we are getting 403 responses from Google. That’s because we don’t have a user agent, since we aren’t using a browser. If we add the --user-agent option, we get 200s instead:

$ curl "http://www.google.com/search?q={jurassic%20park,jumanji,armageddon}" -w "%{time_connect},%{time_total},%{speed_download},%{http_code},%{size_download},%{url_effective}\n" -o /dev/null -s --user-agent Tutorial
0.070,0.896,73551.000,200,65922,http://www.google.com/search?q=jurassic%20park
0.000,0.612,99314.000,200,60828,http://www.google.com/search?q=jumanji
0.000,0.615,99508.000,200,61172,http://www.google.com/search?q=armageddon

Note that in most cases, you’ll probably want to use a real user agent string, but this works for our purposes.
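
A couple more globbing tricks worth knowing: ranges accept a step value, and sets and ranges can be combined in a single URL. If you save the responses, a #1 (or #2, and so on) in the -o argument is replaced by the current value of the corresponding glob. A hypothetical example against example.com:

$ curl -s "http://example.com/{alpha,beta}/file[0-8:2].txt" -o "result_#1_#2.txt"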

Uploads

So far we’ve only looked at downloads. What about uploads? The main difference is that you need to specify a file to upload, which is done with the -F parameter. And of course, your URL needs to be one that accepts uploads. Let’s look at an example:

$ curl -F file=@test.file http://mytestserver.net/upload.php
{"status":"OK","message":"file uploaded","$_FILES":{"file":{"name":"test.file","type":"application\/octet-stream","tmp_name":"\/tmp\/phpMHNoBs","error":0,"size":22}}}

Note that the output you get will vary depending on what page you are hitting. In this case, we got a JSON object with some details about the upload. If we don’t want this output, we can go back to our output variables from earlier and get a nice CSV line (don’t forget to switch to speed_upload and size_upload now):

$ curl -F file=@test.file http://mytestserver.net/upload.php -w "%{time_connect},%{time_total},%{speed_upload},%{http_code},%{size_upload},%{url_effective}\n" -o "/dev/null" -s
0.024,0.066,3376.000,200,223,http://mytestserver.net/upload.php

Voila! We have uploaded a file with curl. But before we move on, just a quick note on -F: it emulates a form submission. We specified “file” as the name of the form field we were filling with test.file. If that name doesn’t match the field the receiving page expects, you will likely get errors. Make sure to read the man page for more details on form submission.
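
While we’re at it, -F also lets you override the filename and MIME type that get sent with the part, using ;filename= and ;type= modifiers. A sketch against the same (hypothetical) endpoint as above:

$ curl -F "file=@test.file;filename=renamed.file;type=text/plain" http://mytestserver.net/upload.php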

Timeouts

Sometimes transfers take a long time, and sometimes servers are unavailable. Default timeouts can be as high as 5 minutes. Fortunately, we can give curl our own timeout values to follow.

$ curl -F file=@server1.pcap.gz http://mytestserver.net/upload.php -w "%{time_connect},%{time_total},%{speed_upload},%{http_code},%{size_upload},%{url_effective}\n" -o "/dev/null" -s --connect-timeout 15 --max-time 30
0.154,0.719,7815804.000,200,5616984,http://mytestserver.net/upload.php

This is useful when you are dealing with larger files, or when you’d rather just time out and move on to the next test. The two parameters above, --connect-timeout and --max-time, are quite useful. However, there are other time-related parameters as well, such as:

--expect100-timeout
--keepalive-time
--retry-delay
--retry-max-time

More details about each of these parameters can be found on the man page.
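
For example, to retry a flaky transfer a few times before giving up, we could combine the retry options with our timeouts (the values here are arbitrary):

$ curl -F file=@test.file http://mytestserver.net/upload.php -o /dev/null -s --connect-timeout 15 --max-time 30 --retry 3 --retry-delay 5 --retry-max-time 60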

Click here for part 2, where we go through some sample test scripts!