Node.js is a platform for building applications in JavaScript. JavaScript is single-threaded and built around callbacks, which makes it a natural fit for firing many concurrent HTTP requests. How about using it for load testing a web server? Often we are limited by our tools and their feature sets. If you are familiar with Node.js, here is my quick solution for customized load testing.
Background
Async is one of the most popular Node modules for organizing asynchronous JavaScript code; without it you can quickly end up in callback hell. Let's create a Node.js project and add async as a dependency.
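The setup boils down to creating a project and installing async; the directory name below is illustrative.

```shell
# create a project directory (name is an assumption) and pull in async
mkdir load-test && cd load-test
npm init -y
npm install async
```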
package.json:
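A minimal sketch of what it could contain; the name and version numbers here are assumptions, only the async dependency matters.

```json
{
  "name": "load-test",
  "version": "1.0.0",
  "dependencies": {
    "async": "^3.2.0"
  }
}
```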
Making HTTP requests in NodeJS
Let’s make our HTTP request:
If you are not interested in the response data, you can skip the res.on('data', ...) handler and listen only for the end event to fire your callback.
That’s how simple it is to make HTTP GET requests. For our use case we needed to fire the test only after a user logs in, so I copied the user’s session cookie from the web browser and passed it in the request options. You can customize requests further using HTTP headers (Accept-Language, Cookie, cache control, keep-alive connections, etc.).
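A sketch of request options carrying such headers; the host, path, and cookie value are placeholders — copy the real cookie from your browser’s developer tools.

```javascript
// Request options with custom headers. The cookie value is a placeholder,
// not a real session token.
const options = {
  host: 'example.com',
  path: '/dashboard',
  headers: {
    'Cookie': 'session=PASTE_YOUR_SESSION_COOKIE_HERE',
    'Accept-Language': 'en-US,en;q=0.9',
    'Connection': 'keep-alive'
  }
};
```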
Firing parallel requests
The async library comes with asynchronous control-flow patterns like series, parallel, waterfall, each, eachLimit, etc. For load testing, the one we are particularly interested in is eachLimit, which takes a concurrency limit.
```javascript
async.eachLimit(arr, limit, iterator, callback);
```
The eachLimit function executes the iterator function in parallel for each item in the array, while ensuring that no more than limit iterators run at a time. Let’s assume we have a collection of parameters, where each parameter represents an independent argument for a request to the web server.
Store this code in app.js and run as follows:
```shell
node app.js
```
Start your test with different values of PARALLEL_N = 5, 10, 25, 50, 75, 100, 200…
Validating Results
Ensure the following points to prevent skewed results:

- The above example is a small demonstration with just 6 entries in paramsArray. Typically you should test with at least 100 * PARALLEL_N requests.
- There is startup overhead, so skip the initial PARALLEL_N requests, which are responsible for warming up the load.
- If the HTTP responses are big, keep an eye on network bandwidth; it can also become a bottleneck and skew your results.
Author: @geekpack, Ankit Jain