Swagger + Excel Sheets, a wonderful way of validating REST APIs
Dheeraj Kumar Aggarwal
Helping businesses solve API and Traceability problems | Product Evangelist | Atlassian Expert | API Automation Expert | Creator of vREST NG
Swagger files (a.k.a. the OpenAPI Specification) are the most popular way of documenting API specifications, and Excel sheets provide an easy, simple way of writing structured data. Anybody can fill in an Excel sheet, regardless of their programming skills. Enter vREST NG, an enterprise-ready application for automated API testing, which combines the power of both to make your API testing experience more seamless. This approach is also known as data-driven testing.
Data-driven testing is an approach in which the test data is kept separate from the test logic or script.
So, this is what the process looks like:
vREST NG uses the Swagger file to generate all of the test logic along with sample test data CSV files. It then reads the test data from those CSV files and runs one iteration per row. In this post, we will look at the following in detail:
- How you may generate test cases from a Swagger file.
- How you may feed test data to those generated test cases through an Excel sheet (CSV file).
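Before we dive into the tool itself, here is a minimal, generic sketch of the idea in TypeScript. This is not vREST NG's implementation; the name-validation rule is a made-up stand-in (the 35-character limit simply mirrors the example used later in this post). The point is only that the test logic is written once, and every row of test data drives one iteration of that logic.

```typescript
// Test logic, written once.
function isValidContactName(name: string): boolean {
  return name.trim().length > 0 && name.length <= 35;
}

// Test data, kept separate from the logic. In vREST NG this lives in a CSV file.
const testData = [
  { summary: "empty name is rejected", input: "", expected: false },
  { summary: "normal name is accepted", input: "John Doe", expected: true },
  { summary: "name over 35 characters is rejected", input: "A".repeat(40), expected: false },
];

// One iteration per data row: the same logic runs against every row.
for (const row of testData) {
  const status = isValidContactName(row.input) === row.expected ? "PASS" : "FAIL";
  console.log(`${status}: ${row.summary}`);
}
```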
How to perform Data Driven API Testing in vREST NG
To illustrate the process, I will use a sample test application, a contacts application that provides CRUD APIs. I will guide you through the following steps:
- Setup the test application
- Download and Install vREST NG Application
- Perform Data Driven API Testing in vREST NG
1. Set Up the Test Application
You may skip this step if you want to follow along with your own test application.
Otherwise, just download the sample test application from this repository link. It is a Node.js based application, tested with Node.js v10.16.2.
To set up the application, simply follow the instructions in the repository's README file.
2. Download and Install vREST NG Application
Now, simply download the application from the vREST NG website and install it. Installation is straightforward, but if you need OS-specific instructions, you may follow this guide link.
After installation, start the vREST NG application and choose the vREST NG Pro version when prompted in order to proceed.
First, set up a project by dragging any empty directory from your file system into the vREST NG workspace area. vREST NG will treat that directory as a project and store all the tests in it. For more information on setting up a project, please read this guide link.
For a quick start, if you don't want to follow the whole process and just want to see the end result, you may download this project directory and add it to the vREST NG application directly.
3. Perform Data Driven API Testing in vREST NG
vREST NG provides a quick three-step process for data-driven API testing:
(a) Import the Swagger File
(b) Write Test Data in CSV Files
(c) Setup Environment
Now, we will see these steps in detail:
(a) Import the Swagger File
To import the Swagger file, simply click on the Importer button available in the top left corner of the vREST NG Application.
An import dialog window will open. In this dialog window:
- Select "Swagger" as Import Source
- Tick the option `Generate Data Driven Tests`. When this option is ticked, the vREST NG importer generates a data-driven test case for each API specification available in the Swagger file.
- Provide the Swagger file. For this demonstration, I will use the Swagger file from the test application repository. Download Swagger File
The dialog window will look something like this. Now, click on the Import button to proceed further.
The import process has done the following things so far:
1. It has generated a test case for each API specification available in the Swagger (OpenAPI) file, and a test suite for each tag available in the file.
2. It has automatically created a sample CSV file for each test case, with the appropriate columns derived from your Swagger file, as shown in the following image.
We will discuss how to fill in this sheet later in this post.
3. The generated CSV files are also automatically linked as shown in the following image.
Before every test execution, the test reads the data from the linked CSV file, converts it into JSON format, and stores it in a variable named `data`. The test case then iterates over this data and runs one iteration per row. So, if you make a change in the CSV file, just run the test case again; it always picks up the latest state of the CSV file, with no need to import again and again.
4. It has automatically inserted variables into the API request parameters as per the API definitions available in the Swagger file. The values of these variables are picked up from the linked CSV file automatically.
5. It has automatically added the response validation logic as well. The Status Code assertion validates the status code of the API response, the Text Body with Default Validator assertion compares the expected response body with the actual response body, and the Text Body with Default Schema Validator assertion validates the API response against a JSON schema (a conceptual sketch of this kind of check appears after this list).
The expected status code, the expected response body, and the expected schema name are each picked up from the linked CSV file.
6. It has imported all the Swagger schema definitions into the Schemas section, available in the Configuration tab.
You may refer to these schema definitions in the Expected Schema tab, as discussed earlier. In the CSV file, you just need to specify the respective schema name for each test iteration in the expectedSchema column.
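To make the schema assertion concrete, here is a rough TypeScript sketch of what checking a response body against a JSON schema looks like. It uses the ajv library and a hypothetical Contact schema standing in for a definition imported from the Swagger file; it only illustrates the idea, not vREST NG's internal implementation.

```typescript
import Ajv from "ajv";

// Hypothetical schema, standing in for a definition imported from the Swagger file.
const contactSchema = {
  type: "object",
  required: ["id", "name"],
  properties: {
    id: { type: "string" },
    name: { type: "string", maxLength: 35 },
    email: { type: "string" },
  },
};

// Compile the schema once, then check an actual response body against it.
// Conceptually, this is what a "validate the response against a JSON schema" assertion does.
const ajv = new Ajv();
const validate = ajv.compile(contactSchema);

const actualResponseBody = { id: "1", name: "John Doe", email: "john@example.com" };
if (validate(actualResponseBody)) {
  console.log("Response matches the expected schema");
} else {
  console.error("Schema validation failed:", validate.errors);
}
```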
(b) Write Test Data in CSV Files
We have already seen the data file generated by the import process. Let me show you the generated file again for the Create Contact API:
In this sample file, you may add test data for the various iterations of the Create Contact API. In the iterationSummary column, simply provide a meaningful summary for each iteration; this summary will show up in the Results tab of the vREST NG application. You will need to fill in this test data yourself, or you may even generate it through an external script (a sketch of such a script appears below, after the example iterations).
Now, let's add some test iterations in the linked CSV file.
With the above CSV file, we are checking two test conditions of our Create Contact API:
- When the name field is empty
- And when the name field length exceeds the limit of 35 characters.
In the above CSV file, we have intentionally left the expectedBody column blank. We don't need to fill in this column by hand; we can fill in its value from the vREST NG application itself.
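If you prefer to generate such a CSV file from a script rather than writing it by hand, here is a rough TypeScript sketch. The column names, expected values, and output file name are assumptions for illustration; adjust them to match the CSV file that vREST NG generated for you.

```typescript
import { writeFileSync } from "node:fs";

// Hypothetical iterations for the Create Contact API. The column names and
// expected values are assumptions for illustration.
const iterations = [
  {
    iterationSummary: "Name field is empty",
    name: "",
    expectedStatusCode: "400",
    expectedSchema: "Error",
    expectedBody: "",
  },
  {
    iterationSummary: "Name longer than 35 characters",
    name: "A".repeat(40),
    expectedStatusCode: "400",
    expectedSchema: "Error",
    expectedBody: "",
  },
];

// Serialise the rows as CSV. Values are quoted so commas inside the data
// do not break the columns.
const columns = Object.keys(iterations[0]) as (keyof (typeof iterations)[0])[];
const csv = [
  columns.join(","),
  ...iterations.map((row) =>
    columns.map((col) => `"${String(row[col]).replace(/"/g, '""')}"`).join(",")
  ),
].join("\n");

writeFileSync("createContact.csv", csv);
console.log(`Wrote createContact.csv with ${iterations.length} iterations`);
```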
Before executing the test case, we need to configure the baseURL variable of the test application in the Configuration tab, like this:
Now, let's execute this test in the vREST NG application. Both iterations fail because the expected response body doesn't match the actual response body, as shown in the following image:
Now, click on the "Copy Actual to Expected" button for each iteration. vREST NG will copy the actual response body directly into the expectedBody column of the CSV file, like this.
After this operation, if you look at the CSV file again, you can see that vREST NG has filled in the expectedBody column for you, as shown in the following image.
Note: If you have the CSV file open in Microsoft Excel, you will need to close and reopen it for the changes to show up. Some code editors, however, automatically detect changes on the file system and reflect them in real time.
If you execute the test again, you can see that both iterations now pass.
You may also see the expected vs actual response for the selected test iteration:
And you may see the execution details of the selected iteration by going to the Execution tab:
In this way, you may keep adding iterations to your CSV file and running the test case in the vREST NG application directly, with no need to import again and again. It all just works seamlessly and drastically increases your testing efficiency.
(c) Setup Environment
For the generated tests, you may also need to set the initial application or database state before executing your tests, so that you can run regressions in an automated way. Some use cases for setting up the initial state are:
- Restoring the database state from backups
- Executing an external command or script
- Invoking a REST API to set up the initial state
In this section, let's see how you may execute an external command before the tests run. Our sample test application is simple and built for demonstrating vREST NG; it stores all of the contacts data in a JSON file. So, I already have the initial data in a JSON file, which I can copy into the test application's project directory before executing the test cases.
You may specify the command as shown in the following image:
The above command restores the application state from the initial data already present in the dump.json file inside the vREST NG project directory.
Note: You will also need to specify the cpCmd variable in the Environments section, because on Linux/macOS the copy command is cp while on Windows it is copy. For Windows, you may create another environment in the vREST NG application, so that your API tests can run on any machine simply by switching the environment. (A cross-platform alternative is sketched below.)
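If you would rather avoid maintaining a per-OS cpCmd value, one alternative is to have the external command invoke a tiny Node.js script, which behaves the same on every platform. This is only a sketch with hypothetical paths for the initial data and the application's data file; it is not part of vREST NG itself.

```typescript
import { copyFileSync } from "node:fs";
import { resolve } from "node:path";

// Hypothetical paths: the known-good initial data kept alongside the vREST NG
// project, and the JSON file the contacts application reads its state from.
const source = resolve("initial-data", "dump.json");
const destination = resolve("..", "contacts-app", "dump.json");

// Overwrite the application's data file with the initial state before the
// test run, so that every run starts from the same baseline.
copyFileSync(source, destination);
console.log(`Restored application state: ${source} -> ${destination}`);
```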
If you are facing any issues related to automated API testing in your organization, do contact us. We will arrange a live meeting to discuss your needs and provide a demo of our product, vREST NG, showcasing its capabilities.
This is how easily you can perform data-driven testing in vREST NG. Let me know in the comments if you found this post helpful, or like and share it with your friends and colleagues. If you have any queries or need any help, you may reach out to me via my LinkedIn profile.