A practical approach to Agile Testing - part 1
While reading different articles about Quality Assurance (QA) in Agile projects recently, I noticed that they all focussed on artefacts such as early testing and teamwork. Despite some searching, I struggled to find examples of how QA is integrated into those artefacts in an Agile process. So, in this article, I will explain how we integrated QA into Agile artefacts here at Triad.
In the first part of this article, I will explain how to integrate quality into Agile artefacts such as user stories, the definition of ready, the definition of done and documentation. The second part, available in a fortnight, looks more closely at the role of test environments, ticket workflow, test automation and team communication.
What is Agile software development?
As described in “Agile methodology: Friend or foe?”, Agile software development is a mindset and a culture that helps development teams respond to the variability around how new product development “should” be done. With work divided into sprints – each running for a predetermined time period – teams are given regular feedback so that they have numerous opportunities to assess their progress and align what they are doing with the needs of the client.
During this process, the requirements and solutions evolve through collaboration between self-organising, cross-functional teams and stakeholders. The focus of the Agile manifesto is “Continuously delivering working software while allowing for and supporting changing requirements”.
More and more organisations are adopting Agile methodologies, drawn by faster application development cycles and quicker turnaround. However, shorter and faster development cycles can cause quality issues, hence the importance of integrating testing into the process rather than treating it as an activity that occurs at the end of the development phase. That is where Quality Assurance (QA) comes in.
Quality throughout the development life cycle
QA activities need to be integrated into the entire development life cycle in order to assure everyone that the software being produced is of adequate quality. Quality must be on the minds of all and built into the team’s day-to-day activities, rather than being checked only at the end. It should be built into project processes from initial requirements gathering through to development and testing activities.
Quality in user stories
A user story is a description of a software feature from an end user’s perspective. It describes the type of user, what they want and why. A user story helps to create a simplified description of the requirement.
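User stories commonly follow the “As a <type of user>, I want <goal>, so that <benefit>” template. For example (the role and goal here are purely illustrative):

    As a registered customer,
    I want to reset my password,
    So that I can regain access to my account.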
User stories should take account of the INVEST principles (Independent, Negotiable, Valuable, Estimable, Small, Testable), though not rigidly: not every story will satisfy all of the criteria, but each should satisfy most.
Importantly, a user story may be refined many times before being implemented. A good user story should contain details such as a description, acceptance criteria, background material and relevant screenshots or wireframes.
Acceptance criteria are the conditions that must be fulfilled in order to mark the story as “Done”, and they are visible to users, stakeholders and the team. They contain the different scenarios that explain the workflow of the feature, and can be written in BDD Gherkin syntax following the “Given, When, Then” format. The acceptance criteria should cover both positive (happy path) and negative scenarios, as in the sketch below. It is also advisable to include any non-functional requirements specific to the user story.
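For example, acceptance criteria for a hypothetical login story might be expressed in Gherkin like this (the feature, steps and error message are illustrative assumptions, not from a real project):

    Feature: User login

      # Happy path
      Scenario: Successful login with valid credentials
        Given a registered user is on the login page
        When they enter a valid username and password
        Then they are taken to their account dashboard

      # Negative scenario
      Scenario: Login rejected with an invalid password
        Given a registered user is on the login page
        When they enter a valid username and an invalid password
        Then an "Invalid username or password" error message is displayed
        And they remain on the login page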
Definition of ready
A definition of ready is the set of criteria, agreed amongst the team, that a story must meet before it is considered ready for development. A definition of ready could stipulate, for example, that the story contains a proper description with background, the acceptance criteria are complete, relevant screenshots or wireframes have been attached, and the business rules have been fully documented.
Definition of done
The definition of done (DoD) is a shared understanding of what it means for a user story to be complete. Each Agile team will have its own DoD checklist of features and activities: the team agrees on a list of criteria which must be met before a product increment can be considered “done”.
The DoD ensures that everyone on the team knows exactly what is expected of everything the team delivers, providing transparency and a level of quality that is fit for the purpose of the product and organisation.
Here is an example checklist for a definition of done:
· Coding is completed for the intended functionality
· Peer code review performed
· Project builds without errors
· Unit tests written and passing
· Integration tests written and passing
· Project deployed to a test environment
· Testing performed and issues resolved
· Accessibility tests passed
· Feature meets the acceptance criteria
· Project documentation updated
· Relevant automation test scripts created, peer reviewed and checked into version control
· Tested on devices/browsers listed in the project assumptions
Each team should collaborate and come up with a definition of done that suits its unique environment.
Documentation
It is always useful to maintain project documents such as user research outcomes, technical implementations, functional and non-functional requirements, service standards, etc. in a central repository such as Confluence or SharePoint. This documentation shares knowledge across the team while serving as a living document for the project.
REST APIs can be designed and documented with tools such as Swagger. Swagger files can also be used to validate the API endpoints.
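As a minimal sketch, a Swagger (OpenAPI) definition for a hypothetical endpoint might look like the following; the API name, path and fields are assumptions for illustration only:

    openapi: 3.0.0
    info:
      title: Example User API    # hypothetical API, for illustration
      version: 1.0.0
    paths:
      /users/{id}:
        get:
          summary: Retrieve a user by ID
          parameters:
            - name: id
              in: path
              required: true
              schema:
                type: integer
          responses:
            '200':
              description: The requested user
            '404':
              description: User not found

A file like this can drive both the published API documentation and contract checks against the real endpoints.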
Source code should be self-explanatory. Whilst comments serve to document the code, it is advisable to use them only where they add value, and to avoid liberally commenting code that is already readable.
Manual and automated test cases can be written using BDD Gherkin syntax, following the “Given, When, Then” format, so that the test suite itself acts as a living document for the application.
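For instance, a single Gherkin scenario outline can serve both as a data-driven automated test and as living documentation; the search feature and example data below are hypothetical:

    Feature: Site search

      Scenario Outline: Search returns matching results
        Given the user is on the search page
        When they search for "<term>"
        Then results containing "<term>" are displayed

        Examples:
          | term    |
          | agile   |
          | testing |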
Don’t forget to look out for part two, available in a fortnight, which looks at the role of test environments, ticket workflow, test automation and team communication, and at how the business and Agile development teams can benefit from optimising these artefacts.