Automating Distributed Performance Testing with a Bash Script: A Step-by-Step Guide


In the ever-evolving world of software development, ensuring the efficiency and reliability of applications under varying loads is critical. Performance testing plays a crucial role in this, helping teams measure system behavior under stress and identify potential bottlenecks before they affect users.

Automating performance testing through scripting, particularly with tools like Bash, offers several advantages: it enables repeatable, consistent test execution, saves time, and reduces human error. This article walks through an automated performance testing approach using a Bash script that drives the requests with curl and hands the results to JMeter for reporting.

Whether you're a seasoned QA engineer or someone new to performance testing, understanding and leveraging automation will help you optimize testing efficiency and provide deeper insights into system performance.


1. Creating the Header for the Output File:

echo "timeStamp,elapsed,label,responseCode,responseMessage,threadName,dataType,success,bytes,grpThreads,allThreads,Latency,SampleCount,ErrorCount" > result.jtl        

  • Explanation: This line initializes the .jtl results file (result.jtl) with a header row. The column names (timeStamp, elapsed, label, and so on) follow JMeter's CSV results format, which is what lets JMeter read the file later and build a report.


2. Setting the Path to JMeter:

JMETER_PATH="/path/to/bin/jmeter"

  • Explanation: This line sets the path to the JMeter executable. It’s crucial to ensure that the JMeter path is correct, as it will be used later to generate the HTML report.
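A quick sanity check (a minimal sketch, not part of the original script) avoids a confusing failure later at the report-generation step if the path is wrong:

# Fail fast if the JMeter binary is missing or not executable
if [ ! -x "$JMETER_PATH" ]; then
  echo "JMeter not found at: $JMETER_PATH" >&2
  exit 1
fi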


3. Defining Iterations and Batch Size:

iterations=2
batch_size=2
start_user=9840XXXXXX        

  • Explanation: These variables define the number of iterations, the batch size for concurrent user simulations, and the starting user ID. Adjust these values based on your testing needs.
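If each simulated user needs a distinct ID (for example, a phone-number-style login), one way to derive it, assuming start_user holds a purely numeric value and using the loop variables introduced in steps 4 and 5, is:

# Hypothetical helper: distinct user ID per thread, derived from start_user,
# the current iteration, and the position j within the batch
# (assumes start_user is purely numeric)
current_user=$((start_user + (iteration - 1) * batch_size + j))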


4. Starting the Main Loop for Iterations:

for ((iteration=1; iteration<=iterations; iteration++)); do        

  • Explanation: This loop runs the entire process for the specified number of iterations. Each iteration simulates a batch of users performing login, booking, and confirmation.


5. Inner Loop for User Simulation:

for ((j=0; j<batch_size; j++)); do        

  • Explanation: This nested loop iterates over the batch, one pass per simulated user. Note that a plain loop runs the users one after another; to make the batch truly concurrent, each user's requests can be pushed into the background, as sketched below.
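Putting steps 4 and 5 together, a minimal skeleton of the two loops might look like the following; simulate_user is a hypothetical function wrapping the login, booking, and confirmation calls, and backgrounding it with & plus a wait is what actually makes the batch concurrent:

for ((iteration=1; iteration<=iterations; iteration++)); do
  for ((j=0; j<batch_size; j++)); do
    simulate_user "$iteration" "$j" &   # login -> book -> confirm for one user
  done
  wait                                  # block until the whole batch has finished
done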


6. Performing Login:

start_time=$(date +%s%3N)
login_response=$(curl --location 'BASE_URL_LOGIN' \

  • Explanation: This segment performs a login request for each user using curl. It captures the start time and the response from the server.

token=$(echo $login_response | grep -o '"token":"[^"]*' | cut -d'"' -f4)        

  • Explanation: The token is extracted from the login response, which is necessary for subsequent requests (booking and confirmation).
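As a fuller sketch of this step (the URL, payload, and credential variables are placeholders, not the article's actual values), the HTTP status code can be captured in the same request by appending it to the body with --write-out:

start_time=$(date +%s%3N)
login_response=$(curl --silent --location "$BASE_URL_LOGIN" \
  --header 'Content-Type: application/json' \
  --data "{\"username\": \"$current_user\", \"password\": \"$PASSWORD\"}" \
  --write-out '\n%{http_code}')                        # status code on its own last line
end_time=$(date +%s%3N)

response_code=$(echo "$login_response" | tail -n 1)    # last line = HTTP status
login_body=$(echo "$login_response" | sed '$d')        # everything before it
token=$(echo "$login_body" | grep -o '"token":"[^"]*' | cut -d'"' -f4)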


7. Logging the Login Result:

elapsed_time=$(($end_time - $start_time))
echo "$start_time,$elapsed_time,Login,$response_code,OK,Thread-$j,text,$success,$bytes,$batch_size,$batch_size,$latency,1,0" >> result.jtl

  • Explanation: The elapsed time and other metrics are logged into the .jtl file, which will later be used to generate the HTML report.
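The row above assumes response_code, success, bytes, and latency have already been set before the echo. A minimal way to fill them in, continuing the login sketch from step 6 (variable names are assumptions, not the original script's):

# response_code and end_time come from the login sketch in step 6
bytes=${#login_body}                                   # response size in characters
latency=$elapsed_time                                  # no separate first-byte time measured
[ "$response_code" = "200" ] && success=true || success=false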


8. Booking Request:

book_response=$(curl --location 'FULL_URL' \        

  • Explanation: This segment simulates a booking request, using the token obtained from the login step. The response is stored for logging and for extracting the booking ID needed by the confirmation step.
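A hedged sketch of what the truncated curl call might look like; the endpoint, payload, and the bookingId field name are assumptions, not the article's actual API:

book_response=$(curl --silent --location "$BASE_URL_BOOK" \
  --header "Authorization: Bearer $token" \
  --header 'Content-Type: application/json' \
  --data '{"showId": 1, "seats": 2}')

# Pull the booking ID out of the JSON body for use in the confirmation step
booking_id=$(echo "$book_response" | grep -o '"bookingId":"[^"]*' | cut -d'"' -f4)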


9. Logging the Booking Result:

echo "$start_time,$elapsed_time,Book,$response_code,OK,Thread-$j,text,$success,$bytes,$batch_size,$batch_size,$latency,1,0" >> result.jtl        

  • Explanation: Similar to the login, this logs the booking details into the .jtl file.


10. Confirmation:

confirm_response=$(curl --location 'FULL_URL' \        

  • Explanation: This step simulates the confirmation request, using the booking ID obtained from the booking step.
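A matching sketch of the confirmation call, reusing the token and booking ID from the earlier steps (the URL and payload shape are assumptions):

confirm_response=$(curl --silent --location "$BASE_URL_CONFIRM" \
  --header "Authorization: Bearer $token" \
  --header 'Content-Type: application/json' \
  --data "{\"bookingId\": \"$booking_id\"}")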


11. Logging the Confirmation Result:

echo "$start_time,$elapsed_time,Confirm,$response_code,OK,Thread-$j,text,$success,$bytes,$batch_size,$batch_size,$latency,1,0" >> result.jtl        

  • Explanation: The result of the confirmation step is logged, completing the cycle for one user.
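At this point both loops opened in steps 4 and 5 need to be closed. If each user was backgrounded as sketched in step 5, a wait before the outer loop's closing done ensures every row has been written to result.jtl before the report is built:

  done      # end of the per-user batch loop
  wait      # let every backgrounded user finish and flush its rows
done        # end of the iteration loop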


12. Generating the HTML Report:

"$JMETER_PATH" -g result.jtl -o "/c/apache-jmeter-5.6.3/path to output folder"

  • Explanation: This command runs JMeter in report-generation mode (-g) to build an HTML dashboard from the .jtl file, summarizing the performance test results. The output folder must be empty or not yet exist, otherwise JMeter aborts; the sketch below handles that.
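A small sketch that recreates the output directory before generating the report (the path is a placeholder):

REPORT_DIR="/c/apache-jmeter-5.6.3/report-output"     # placeholder output path
rm -rf "$REPORT_DIR" && mkdir -p "$REPORT_DIR"        # -o requires an empty or missing folder
"$JMETER_PATH" -g result.jtl -o "$REPORT_DIR"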


Check the images below for the full script, from login and booking through confirmation:

Example 1


Example 2

Follow for more! Thanks.


Dmitri T

Quality Assurance Engineer at EPAM Systems

6 months

Why do you need to use Bash if you're already using JMeter? Using JMeter's HTTP Request sampler (https://jmeter.apache.org/usermanual/component_reference.html#HTTP_Request) provides more control over how the test is being executed, you can configure JMeter to behave like a real browser (https://portal.perforce.com/s/article/How-to-make-JMeter-behave-more-like-a-real-browser-1707509382226) when it comes to downloading embedded resources, respecting Cache-Control headers, sending headers like User-Agent, performing authentication, recording protocol-based metrics like connect time and latency, using client-side certificates, etc.
