As a tester or developer, you might run into a situation where you need to make sure the performance of an API your team is developing meets expectations. Performance testing can help you identify API response times, how many concurrent users the API can support, the user load at which it breaks, CPU and memory usage, and memory leaks.

JMeter is an open-source performance testing tool with many capabilities beyond just API testing. The scope of this blog is to show how the tool can be used to create a small framework to test APIs for performance.

Installation

  1. Download JMeter from http://jmeter.apache.org/download_jmeter.cgi. You only need the binaries, so select the ZIP format and extract the contents to a folder. Locate ApacheJMeter.jar in the bin folder; this is the executable for JMeter.
  2. Open JMeter and navigate to Options -> Plugins Manager to install additional plugins. Download the 5 Additional Graphs, PerfMon and JSON plugins, plus anything else you may want to add on top of the basic functions.
  3. Download the server agent from https://jmeter-plugins.org/wiki/PerfMon/. This will help us monitor server counters (CPU, memory, network etc.). Locate and run startAgent.bat on the server whose counters you want to monitor.

Preparations

  1. Once you have finished the installation, create a new test plan by clicking File -> New.
  2. Right-click on Test Plan and add a Thread Group. The thread group helps you manage the pattern of the load: how many users start concurrently (Number of Threads), how they are started (Ramp-up Period), and the Loop Count (select Forever if you are soak testing to monitor for memory leaks). Under each thread group we lay out our test requests, which follow later in this blog.
  3. You can have multiple thread groups as well; this lets you simulate a more realistic scenario in which multiple APIs are called by different numbers of users. Note that the 'Run Thread Groups consecutively' option under Test Plan runs the groups one at a time, so leave it unchecked if you want all thread groups to start together. You can also right-click on any test element and disable it if you want it ignored for a test run.
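As a rough mental model of the thread-group settings above (an illustration, not JMeter's actual implementation), the ramp-up period spreads thread starts evenly across the configured window:

```python
def thread_start_offsets(num_threads: int, ramp_up_seconds: float) -> list[float]:
    """Approximate when each thread starts, relative to test start.

    JMeter spaces thread starts evenly over the ramp-up period, so
    thread i begins roughly i * (ramp_up / num_threads) seconds in.
    """
    if num_threads <= 1:
        return [0.0] * num_threads
    step = ramp_up_seconds / num_threads
    return [i * step for i in range(num_threads)]

# 10 threads with a 20-second ramp-up: one new virtual user every 2 seconds.
print(thread_start_offsets(10, 20))
```

This is why a ramp-up of 0 hits the server with all users at once, while a longer ramp-up lets you watch the load build gradually.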

Under the thread group, you can add the test elements below:

a. HTTP Header Manager (preset headers sent on every request are managed here). For API testing you can default to Accept: */* and Content-Type: application/json;charset=UTF-8.

b. HTTP Request Defaults (this is where you configure your web server and port once for all requests).

c. CSV Data Set Config (this helps when you need to feed parameterized data to your API). In the sample below, I have test data in a CSV file named Workgever_Werknemer.csv, saved in the same folder as the .jmx file. You also tell JMeter how the data in the CSV is arranged, e.g. werkgever,werknemer; the columns can then be accessed via the variables ${werkgever} and ${werknemer} respectively. The test cycles through the rows of the CSV once this is set up.
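Conceptually, CSV Data Set Config acts as a shared cursor over the file that wraps back to the first row when "Recycle on EOF" is true. A minimal Python sketch of that behavior, using the column names from the example above (the two data rows are made up for illustration):

```python
import csv
import io
from itertools import cycle

# Stand-in for Workgever_Werknemer.csv; in JMeter the file sits next to the .jmx.
csv_text = "werkgever1,werknemer1\nwerkgever2,werknemer2\n"

rows = list(csv.DictReader(io.StringIO(csv_text),
                           fieldnames=["werkgever", "werknemer"]))
cursor = cycle(rows)  # "Recycle on EOF = True": wrap around to the first row

# Each iteration (virtual-user request) pulls the next row,
# the way ${werkgever} and ${werknemer} resolve per iteration.
for _ in range(3):
    row = next(cursor)
    print(row["werkgever"], row["werknemer"])
```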

d. HTTP Request. You can rename this to something more readable, as I have done here (Personal Details). Under this element you configure the API call; you don't have to provide the host and port again, since they are already set in HTTP Request Defaults. You can also see how the parameterized data from the CSV Data Set Config is used.
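The way HTTP Request Defaults combine with an individual HTTP Request can be pictured as a simple dictionary merge where the sampler's own non-empty fields win (a sketch only; the field names here are illustrative, not JMeter internals):

```python
# HTTP Request Defaults: shared server/port/protocol for every sampler in scope.
defaults = {"server": "api.example.com", "port": 8080, "protocol": "https"}

def build_request(defaults: dict, sampler: dict) -> dict:
    """Fields set on the sampler override the defaults; empty fields fall through."""
    merged = dict(defaults)
    merged.update({k: v for k, v in sampler.items() if v})
    return merged

# The "Personal Details" sampler only needs to state its own path and method;
# the blank server field falls back to the value from HTTP Request Defaults.
personal_details = {"path": "/personal-details", "method": "GET", "server": ""}
print(build_request(defaults, personal_details))
```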

e. Add a Response Assertion if you would like. It lets you match patterns in the response of an HTTP Request; here I am asserting that the parameter from the CSV is present in the response.

f. You may now want to add some Listeners to view the results. These help you monitor and view the results in a convenient format. I have added the View Results Tree, Aggregate Graph, PerfMon Metrics Collector and Response Times Over Time test elements. Each of these has an option to save its results to a file. In the PerfMon Metrics Collector you should provide the host IP address and port where the server-side counters are being collected by the agent (step 3 of Installation).

Testing

After saving the test plan, you can start the test by clicking the green Play button at the top.

This will start all the thread groups (if you have configured them to run together), and the results become visible in the Listener test elements, e.g. below.


I have configured the Response Times listener to save its results to a file; opened in Excel, the results look like below.

PerfMon will show the results in a graph, depending on the metrics you are collecting.

Extra Bits

If you have interlinked APIs, for example a login request that returns a token which you then need to include in other requests, you can use the JSON Path Extractor test element (refer: https://jmeter-plugins.org/wiki/JSONPathExtractor/) to extract the token from the first response and assign it to a variable for use further on. This makes it possible to build a workflow that can be load tested.
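The token hand-off can be sketched in Python (the login response shape, the `$.auth.token` path and the Bearer header are assumptions for illustration; real JSONPath, as supported by the plugin, is far richer than this dotted-path lookup):

```python
import json

# Hypothetical login response; the real shape depends on your API.
login_response = '{"auth": {"token": "abc123", "expires_in": 3600}}'

def json_path(document: str, path: str):
    """Tiny dotted-path lookup standing in for the JSON Path Extractor,
    e.g. "$.auth.token" walks document["auth"]["token"]."""
    node = json.loads(document)
    for key in path.lstrip("$.").split("."):
        node = node[key]
    return node

# Extract once, then reuse the variable in subsequent samplers,
# e.g. as an Authorization header:
token = json_path(login_response, "$.auth.token")
headers = {"Authorization": f"Bearer {token}"}
print(headers)
```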



    • Hi Carl, apologies for the delay in responding. The answer to your question is yes: if your requests use parameterized data for the fields your application expects to be unique (e.g. a username field that increments automatically every iteration), you can create 1000 users.

      However, if you only need to create a single new user to perform the load test each time, you can achieve that by keeping the POST call that creates the user outside the thread group that performs your main load test. You can use parameters to share the user details with the main thread group.
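The unique-username idea from this reply can be sketched as a simple per-iteration counter (an illustration in Python; in JMeter itself you would use a function or variable that increments each iteration, and the `loadtest_user` prefix here is made up):

```python
from itertools import count

# Shared counter: each iteration draws the next number, so every
# generated username is unique across the whole test run.
user_counter = count(1)

def next_username(prefix: str = "loadtest_user") -> str:
    """Generate a username that is unique for every iteration."""
    return f"{prefix}_{next(user_counter)}"

print([next_username() for _ in range(3)])
```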
