Performance Testing Fundamentals
Shady Ahmed Mohamed
QA & Test Automation Expert (SDET) at IDEMIA || Performance Test Specialist || QC Software Test Specialist
Performance Testing (Concepts and Types)
Fundamentals of Performance Testing
What is Performance Testing?
- Performance Testing is a type of Non-Functional Testing.
- It is performed to determine how fast certain aspects of a system perform under a specific workload.
- It can serve different purposes, such as demonstrating that the system meets its performance criteria.
- It can compare two systems to find out which performs better.
- It can identify which part of the system or workload causes the system to perform badly.
Q: Why is the performance test considered a non-functional test? (Choose)
- Because it is used to verify non-functional requirements.
- Because it is used to verify functional requirements.
Why Performance Testing?
- Identify problems early, before they become costly to resolve
- Produce better quality, more scalable code
- Prevent revenue and credibility loss
- Enable intelligent planning for future expansion
- Ensure the system meets performance expectations such as response time, throughput, etc.
Poor performance affects:
1- Revenue
2- Customers
3- Productivity
Q: What are the main reasons we perform performance tests?(Choose)
- To determine whether the application satisfies performance requirements (for instance, the system should handle up to 1,000 concurrent users).
- To locate computing bottlenecks within an application.
- To establish whether the performance levels claimed by a software vendor are indeed true.
- To compare two or more systems and identify the one that performs best.
- All of the above
- None of the above
Performance Testing Types:
1- Dry Testing (Benchmark Testing): the first type of performance testing; it focuses on a small number of users and checks how the application behaves with that number of users.
2- Load Testing: the most popular type of performance testing; it focuses on the response time of the application as the number of users changes.
- Shows how the system behaves in the normal case.
- Shows how the system behaves when a high load of users works on the same application at the same time.
- For example, assume 10 concurrent users get a response time of 10 seconds; with 25 users the response time increases to 20 seconds, and with 50 users it rises to 60 seconds. From these figures we can decide whether this is acceptable to the customer and end users or whether the application needs to be improved (a minimal code sketch of such a test appears after this list).
3- Stress Testing: focuses on the error rate after exceeding the maximum number of users that the application can serve with an acceptable response time.
- Server CPU and memory need to be monitored during stress testing to calculate how much they must be increased to handle this number of users.
- Check how the server recovers back to normal behavior after stress testing, and how long the recovery takes.
4- Endurance Testing: the longest type of performance testing; it may take several hours or days and focuses on the error rate, memory leaks, and consistent increases in response time.
- It is similar to stress testing, except that it uses a fixed number of users who repeat an activity in a loop over a long period.
- Without a memory leak, the response time is expected to remain consistent even after a long period.
- A memory leak shows up as memory usage growing until it is full while the endurance test is running.
5- Spike Testing: similar to endurance testing, and not commonly used in the performance testing cycle.
- It uses a varying number of users during the test and checks the error rate, response time, and memory leaks.
- It works by changing the number of running users from the maximum supported number down to a low number and back to other levels, checking the response time at each step.
- The response time is expected to be the same whenever the test returns to the same number of users.
6- Volume Testing: a different type of performance testing, as it focuses on data and its effect on response time.
- It considers two aspects: the number of data items and the size of the data.
- Assume we upload 10 images of 2 MB each (20 MB in total) and check the response time of the application.
- Assume we upload 2 images of 10 MB each (20 MB in total) and check the response time of the application.
- Comparing these two scenarios shows whether it is the size of the data or the number of items that increases the response time.
7- Scalability Testing: a type of performance testing similar to load and stress testing.
- It focuses on the response time of the application while the server is being scaled.
- Assume that with a certain number of users the server reaches 90% memory usage and 90% CPU usage, so the server needs to be scaled by adding memory and CPU.
- While the server is scaling, the response time stays high until scaling is finished.
- After scaling, the response time drops again with the same number of users.
- The time the scaling takes should be recorded, as this is what scalability testing measures.
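The load-testing example above (10, 25, and 50 concurrent users) can be sketched in code. The snippet below is only a minimal illustration, not a JMeter script or any tool's actual behavior: it assumes a hypothetical target URL, starts a fixed pool of virtual users per load level, sends one request each, and prints the average response time. A real load test would also control ramp-up, think time, iterations, and error-rate reporting.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.List;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.concurrent.atomic.LongAdder;

public class SimpleLoadTest {
    // Hypothetical target URL -- replace with the application under test.
    static final String TARGET = "https://example.com/";

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        // Repeat the measurement for each load level (10, 25 and 50 virtual users).
        for (int users : List.of(10, 25, 50)) {
            ExecutorService pool = Executors.newFixedThreadPool(users);
            LongAdder totalMillis = new LongAdder();
            AtomicInteger ok = new AtomicInteger();
            CountDownLatch done = new CountDownLatch(users);
            for (int i = 0; i < users; i++) {
                pool.submit(() -> {
                    try {
                        long start = System.nanoTime();
                        HttpRequest req = HttpRequest.newBuilder(URI.create(TARGET)).GET().build();
                        client.send(req, HttpResponse.BodyHandlers.discarding());
                        totalMillis.add((System.nanoTime() - start) / 1_000_000);
                        ok.incrementAndGet();
                    } catch (Exception e) {
                        // A failed request would count toward the error rate in a real test.
                    } finally {
                        done.countDown();
                    }
                });
            }
            done.await();   // wait until every virtual user has finished
            pool.shutdown();
            System.out.printf("%d users -> average response time %.1f ms (%d successful)%n",
                    users, totalMillis.doubleValue() / Math.max(ok.get(), 1), ok.get());
        }
    }
}
```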
Q: Choose the correct answer/s.(Multiple choices)
- Load testing: the number of users versus response time; determines whether the application can run with multiple users and identifies where the bottleneck functions are.
- Scalability testing: number of users versus response time, but with a higher number of users beyond the normal capacity, to determine that the application doesn't crash and to observe the results.
- Endurance testing: number of users versus response time, but over a long period of time (8+ hours), to check for memory leaks and whether the system will crash after a long heavy load.
- Spike testing: a sudden increase/decrease in the number of users, to determine whether or not the application can cope with significant changes in the load.
- Volume testing: data size or count versus response time, to determine how the database performs with the system and how the functionalities related to storing/reading data behave.
- Stress testing: used to determine whether the software handles increasing workloads effectively. This can be determined by gradually adding to the user load or data volume while monitoring system performance. Also, the workload may stay at the same level while resources such as CPUs and memory are changed.
Web Performance Testing Process:
1- Develop the Right Testing Environment (Planning)
2- Identify the Performance Acceptance Criteria (Planning)
3- Plan and Design Performance Tests (Planning)
4- Set Up the Performance Testing Environment (Development)
5- Test the Design Implementation (Development)
6- Run the Test (Execution)
7- Analyze, Tune, and Retest (Execution)
8- Performance Report (Reporting and Closure)
Q: What is the correct arrangement for the performance process?(Choose)
-Develop Testing Environment, Identify the Performance Acceptance Criteria, Plan and Design Performance Tests, Test the Design Implementation, Set Up the Performance Testing Environment, Run the Test, Analyze, Tune, and Retest
-Develop Testing Environment, Identify the Performance Acceptance Criteria, Plan and Design Performance Tests, Set Up the Performance Testing Environment, Test the Design Implementation, Run the Test, Analyze, Tune, and Retest
-Develop a Testing Environment, Identify the Performance Acceptance Criteria, Set Up the Performance Testing Environment, Test the Design Implementation, Run the Test, Analyze, Tune, and Retest.
Tools used in performance testing:
1- Apache JMeter (Free and open source)
2- HP Load Runner (paid)
Why Apache JMeter?
1- Free and open source (Apache License 2.0)
2- Friendly GUI
3- Platform independent
4- Full Multi-Threading Framework
5- Visualize the test result
6- Easy Installation
7- Highly Extensible
8- Unlimited Testing capabilities
9- Support Multi-Protocols
How does JMeter Work?
- JMeter sends requests to a target server by simulating a group of users, then collects the responses to calculate statistics and display performance metrics in various formats.
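As a rough sketch of the "collect data to calculate statistics" step, the snippet below aggregates a list of recorded response times into the kind of summary a load tool typically reports: average, 90th percentile, error rate, and throughput. The sample numbers in main are invented for illustration, not output from any real test run.

```java
import java.util.List;

public class SampleStats {
    // Summarize recorded response times (ms), the number of failed requests,
    // and the elapsed wall-clock time of the run into common load-test metrics.
    static void report(List<Long> responseTimesMs, int errors, double elapsedSeconds) {
        List<Long> sorted = responseTimesMs.stream().sorted().toList();
        double avg = sorted.stream().mapToLong(Long::longValue).average().orElse(0);
        long p90 = sorted.get((int) Math.ceil(0.9 * sorted.size()) - 1); // 90th percentile
        int total = sorted.size() + errors;
        System.out.printf("samples=%d avg=%.1f ms p90=%d ms errors=%.1f%% throughput=%.1f req/s%n",
                total, avg, p90, 100.0 * errors / total, total / elapsedSeconds);
    }

    public static void main(String[] args) {
        // Invented data from a short example run: 10 successes, 1 error, 5 seconds elapsed.
        report(List.of(120L, 95L, 210L, 180L, 150L, 99L, 300L, 130L, 110L, 160L), 1, 5.0);
    }
}
```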
Book for Performance Testing
https://drive.google.com/file/d/1VnC4xF57b2BuXEgTzLI5QUv-_gYUc56_/view?usp=share_link