Using Apache Bench for Simple Load Testing


If you have access to a Mac or Linux server, chances are you already have a really simple HTTP load-generating tool installed called Apache Bench, or ab. If you are on Windows and have Apache installed, you may also have ab.exe in your apache/bin folder.

Suppose we want to see how fast Yahoo can handle 100 requests, with a maximum of 10 requests running concurrently:

ab -n 100 -c 10 http://www.yahoo.com/

It will then generate output as follows:

Concurrency Level:      10
Time taken for tests:   1.889 seconds
Complete requests:      100
Failed requests:        0
Write errors:           0
Total transferred:      1003100 bytes
HTML transferred:       949000 bytes
Requests per second:    52.94 [#/sec] (mean)
Time per request:       188.883 [ms] (mean)
Time per request:       18.888 [ms] (mean, across all concurrent requests)
Transfer rate:          518.62 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:       57   59   1.7     59      64
Processing:   117  126   7.5    124     162
Waiting:       57   62   7.0     60      98
Total:        175  186   8.0    184     224

Percentage of the requests served within a certain time (ms)
  50%    184
  66%    186
  75%    187
  80%    188
  90%    192
  95%    203
  98%    216
  99%    224
 100%    224 (longest request)

As you can see, this is very useful information: it served requests at a rate of 52.94 requests per second, the fastest request took 175 ms, and the slowest 224 ms. (The second "Time per request" figure is simply the mean time per request divided by the concurrency level.)

So the next time you are tempted to whip out cfloop and GetTickCount to do some benchmarking on a piece of code, give ab a try. It's easy to use, and it will yield much more realistic results.

Because ab supports concurrency, it has two big advantages over cfloop. The main one is that it lets you test how your code runs concurrently, which can help you identify race conditions or locking issues. Concurrent requests are also a more natural simulation of load than loops.
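For instance, hammering a single page with high concurrency is often enough to surface locking problems that never appear under a sequential loop. A sketch (the host, port, and page are placeholders for your own application):

```shell
# Hit one page with 50 concurrent connections; a non-zero
# "Failed requests" count under load often points to race
# conditions or locking issues in the page being tested.
# (127.0.0.1:8300/test.cfm is a placeholder URL.)
ab -n 1000 -c 50 http://127.0.0.1:8300/test.cfm
```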

Suppose you want to test multiple URLs concurrently as well. You can do this by creating a shell script with multiple ab calls. Place an & at the end of each line; this makes the command run in the background and lets the next command start executing. You will also want to redirect the output to a file for each URL using > filename. For example:

#!/bin/sh

ab -n 100 -c 10 http://127.0.0.1:8300/test.cfm > test1.txt &
ab -n 100 -c 10 http://127.0.0.1:8300/scribble.cfm > test2.txt &
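One caveat with the script above: because both ab calls are backgrounded, the script itself exits immediately, before the benchmarks finish. If you want the script to block until both runs complete, add a wait at the end. A sketch, using the same placeholder URLs:

```shell
#!/bin/sh
# Run two ab benchmarks in parallel, then block until both finish.
ab -n 100 -c 10 http://127.0.0.1:8300/test.cfm > test1.txt &
ab -n 100 -c 10 http://127.0.0.1:8300/scribble.cfm > test2.txt &
wait  # returns once both background jobs have exited
echo "both benchmarks complete"
```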

The usage info from the ab version installed on my Mac (v2.3) is listed below. As you can see there are many useful options for outputting results, and sending additional data in the request.

Usage: ab [options] [http[s]://]hostname[:port]/path
Options are:
    -n requests     Number of requests to perform
    -c concurrency  Number of multiple requests to make
    -t timelimit    Seconds to max. wait for responses
    -b windowsize   Size of TCP send/receive buffer, in bytes
    -p postfile     File containing data to POST. Remember also to set -T
    -T content-type Content-type header for POSTing, eg.
        'application/x-www-form-urlencoded'
        Default is 'text/plain'
    -v verbosity    How much troubleshooting info to print
    -w              Print out results in HTML tables
    -i              Use HEAD instead of GET
    -x attributes   String to insert as table attributes
    -y attributes   String to insert as tr attributes
    -z attributes   String to insert as td or th attributes
    -C attribute    Add cookie, eg. 'Apache=1234'. (repeatable)
    -H attribute    Add Arbitrary header line, eg. 'Accept-Encoding: gzip'
        Inserted after all normal header lines. (repeatable)
    -A attribute    Add Basic WWW Authentication, the attributes
        are a colon separated username and password.
    -P attribute    Add Basic Proxy Authentication, the attributes
        are a colon separated username and password.
    -X proxy:port   Proxyserver and port number to use
    -V              Print version number and exit
    -k              Use HTTP KeepAlive feature
    -d              Do not show percentiles served table.
    -S              Do not show confidence estimators and warnings.
    -g filename     Output collected data to gnuplot format file.
    -e filename     Output CSV file with percentages served
    -r              Don't exit on socket receive errors.
    -h              Display usage information (this message)
    -Z ciphersuite  Specify SSL/TLS cipher suite (See openssl ciphers)
    -f protocol     Specify SSL/TLS protocol (SSL2, SSL3, TLS1, or ALL)
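A few of these options in combination. These are sketches only: login.txt, the cookie value, and the URLs are placeholders, but the flags themselves (-p, -T, -C, -H, -g) come straight from the usage listing above:

```shell
# POST the contents of login.txt; -T sets the Content-Type header
ab -n 50 -c 5 -p login.txt -T 'application/x-www-form-urlencoded' \
   http://127.0.0.1:8300/login.cfm

# send a cookie and an extra request header with every request
ab -n 50 -c 5 -C 'CFID=1234' -H 'Accept-Encoding: gzip' \
   http://127.0.0.1:8300/test.cfm

# write per-request timing data to a gnuplot-friendly file
ab -n 100 -c 10 -g timings.tsv http://127.0.0.1:8300/test.cfm
```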



Comments

On 02/05/2009 at 6:00:15 PM EST Jacob Torrey wrote:
Pete, I like your suggestion, but your method is completely incorrect. First, benchmarking Yahoo.com is useless for detecting their performance, as there are too many other variables to take into account. Second, your scripts will not gather correct results for two reasons: 1) you are benchmarking localhost, which will spike the load on the box, making the performance less accurate; and 2) you are running multiple benchmarks concurrently, which will also skew your results.

You should always run your benchmarks from another machine, and run them one at a time to isolate performance. All you're doing right now is wasting electricity and getting useless results.

On 02/06/2009 at 10:40:05 AM EST Pete Freitag wrote:
@Jacob - Thanks for your comments. I chose yahoo.com as the first example simply because I wanted the reader to be able to test out the tool using that example command without changing it. I am sure most of the time was spent on the network, rather than at the server.

I agree running tests from localhost does skew the results, I probably should have mentioned that in the entry.

I do think there are cases where it makes sense to run multiple tests concurrently. Most load tools allow you to run through several url's concurrently, I was simply emulating that with the shell script.

On 06/21/2009 at 10:32:37 PM EDT varsee wrote:
Hello Pete, while I was trying to use the ab command I could not get my IP address with either ipconfig or /sbin/ifconfig -a. The problem is I get different IP addresses. How should I check the performance?


On 02/08/2011 at 6:34:28 AM EST vic wrote:
Hi all, do you know anything about testing https websites? It seems that ab should do this, but when I try it clearly says 'SSL not compiled in; no https support'. What could I do? Thank you, the topic is nice anyway :)

On 09/26/2011 at 3:20:31 PM EDT Carlton Dickson wrote:
Vic, if you're running Apache on a Windows machine there's a good chance there's something called "abs.exe" in the same folder as "ab.exe" that should work exactly the same as ab.exe, but with support for SSL requests :)

On 08/25/2012 at 7:30:28 PM EDT viren wrote:
Hello Pete, thanks for your article. I'm a newbie when it comes to web stress testing (I'm a mobile systems engineer, so I'm only used to writing code for mobile phones, lol) and your article proved very useful in testing my first nginx-fastcgi++ server setup. Thanks!
