Test as you fly, fly as you test: Good test management for better medical devices
Verification is now accorded far greater importance in system development than it was just a few years ago. Negative examples such as the failed maiden launch of Ariane 5, caused by a software error and often described as one of the most expensive software failures of all time, are not the only reason why early testing has become established practice in engineering.
In this day and age, the perfect development scenario would look like this: rather than coding first and testing afterwards, developers do both in parallel to catch and fix software bugs early on. This largely rules out unwelcome surprises when it comes time to verify and validate the final system before delivery. So much for the theory; in our imperfect world, the reality is that testing pitfalls can jeopardize a release.
In this article, we take a closer look at the test management challenges that arise in the various phases of medical software development and explain how we resolve them.
Getting started is the hardest part
Software development projects typically begin by listing requirements such as "Component A may not communicate with component B" or "Measured current has to be verified to be within the specified range." These are not good requirements: the first asks to prove a negative, which is difficult to test, and both lack a clear definition of who does what, under which conditions, and when it has to be done. A testable version would name the responsible component and concrete limits, for example "The monitoring module shall report an error within 100 ms if the measured current leaves the range of 50 to 450 mA" (the numbers are purely illustrative). Often, requirements are assigned to the wrong level. Sometimes they are not functional requirements at all, but rather quality standards, design specifications or risk control measures.
This is why a test manager should be on board from day one to help review requirements and make sure they are testable. After all, the only good requirement is a testable requirement. The Twin Peaks model helps developers keep their requirements straight: it represents requirements and architecture as two peaks of equal elevation and standing, refined in an iterative, reciprocal process so that both evolve together. Requirements should also have a clearly defined lifecycle; once established, they should not be easy to change.
A test strategy – don’t leave home without it
It is a good idea to prepare the first draft of a verification plan outlining the test strategy during the planning phase. The first to-do is to identify the levels at which testing is to take place. The plan then spells out test requirements, test exit criteria, acceptance criteria, exceptions, tools to be used and the like for each test level. A common mistake at this point is to prematurely commit to specific acceptance criteria – for example, 100 percent branch coverage in unit testing. Defensive programming guidelines can make that objective very difficult, if not impossible, to achieve, and a verification plan that locks it in too early is bound to create pitfalls in later phases of development, because it will be hard to change further down the line. Best practice is to keep the verification plan flexible beyond the project planning phase so it can be adapted later on.
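To make the branch-coverage point concrete, here is a minimal sketch in C. The conversion helper adc_to_milliamps() and its scaling factor are hypothetical; the sketch only illustrates why defensive checks collide with a blanket 100 percent branch-coverage target.

```c
#include <stddef.h>
#include <stdint.h>

typedef enum { CONV_OK, CONV_ERROR } conv_status_t;

/* Hypothetical helper: converts a raw ADC reading to milliamperes.
 * The NULL checks are defensive programming: good practice, but if every
 * caller in the integrated software always passes valid pointers, the error
 * branch can never be exercised from production code paths. Demanding
 * 100 percent branch coverage in unit testing then forces artificial test
 * scaffolding whose only purpose is to hit this branch. */
conv_status_t adc_to_milliamps(const uint16_t *raw, int32_t *out_ma)
{
    if ((raw == NULL) || (out_ma == NULL)) {
        return CONV_ERROR;  /* defensive branch, unreachable in normal integration */
    }

    *out_ma = (int32_t)(*raw) * 2;  /* assumed scaling of 2 mA per LSB, illustrative only */
    return CONV_OK;
}
```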
Our recommendation: It pays off to think about and set down an appropriate strategy early on, but not to finalize the test plan too soon.
At ITK, we generally take a risk-based approach to the test strategy: the higher the risk associated with a software item, the more rigorous its verification.
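As a purely illustrative sketch of what "risk-based" can mean in practice (this is neither ITK's actual scheme nor the literal wording of IEC 62304), the safety classification of a software item can be mapped to the depth of verification it receives:

```c
#include <stdio.h>

/* Software safety classes in the spirit of IEC 62304: A (no injury possible)
 * up to C (death or serious injury possible). The mapping below is a
 * hypothetical example of scaling test depth with risk, not a restatement
 * of the standard's exact requirements. */
typedef enum { CLASS_A, CLASS_B, CLASS_C } safety_class_t;

static const char *required_test_depth(safety_class_t cls)
{
    switch (cls) {
    case CLASS_A: return "system tests";
    case CLASS_B: return "unit, integration and system tests";
    case CLASS_C: return "unit tests with coverage goals, integration, system and fault-injection tests";
    default:      return "unknown class: escalate to the test manager";
    }
}

int main(void)
{
    printf("Class C item requires: %s\n", required_test_depth(CLASS_C));
    return 0;
}
```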
Release day coming up
Be very flexible – that is the golden rule when planning and performing tests. Recalibrating development and verification plans and adjusting the strategy are things that can, and often have to, be done. A well-defined, properly communicated process for regression tests is another imperative, and it must include an impact analysis. At some point, release day will come knocking at the door. And who has not heard the often-repeated refrains: "Release is next week. We had better run all the tests now!" or "But I just changed one little thing!" This is why coding and testing should be done in parallel – to catch and fix bugs early on. It is also a good idea to define a code freeze date and stick to it. The same goes for regression tests: get them done in good time, preferably automated.
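As an illustration of "get them done in good time, preferably automated", here is a minimal regression suite in C. The function current_in_range() and its 50 to 450 mA limits are hypothetical stand-ins, not code from a real device.

```c
#include <assert.h>
#include <stdint.h>
#include <stdio.h>

/* Hypothetical function under test: does the measured current lie within
 * the specified range of 50 to 450 mA? (limits assumed for illustration) */
static int current_in_range(int32_t current_ma)
{
    return (current_ma >= 50) && (current_ma <= 450);
}

/* Minimal automated regression suite: every boundary and every past defect
 * gets a test that pins the expected behaviour, so "I just changed one
 * little thing" is caught by the next automated run, not on release day. */
int main(void)
{
    assert(current_in_range(50));     /* lower boundary */
    assert(current_in_range(450));    /* upper boundary */
    assert(!current_in_range(49));    /* just below range */
    assert(!current_in_range(451));   /* just above range */
    assert(!current_in_range(-1));    /* guards against a hypothetical sign-handling regression */

    puts("regression suite passed");
    return 0;
}
```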
Who does what?
"I thought that was your team’s responsibility," or "Please send us all the requirements – we’d like to take a look at them.” If people are saying things like this at an advanced stage of the project, something has clearly gone wrong in terms of communication, transparency and trust. And that shortcoming may well cause the project to fail.?Clear, open communication is critical – not only within a team, but also between the medical device manufacturer and the development service provider. Responsibilities in the development process have to be clearly and horizontally delineated in the V-model. This includes scheduling meetings to share information and designating stakeholders to take part. It is also advisable to allow sufficient time for reviews, change requests and regular information-sharing sessions.
No need to keep reinventing the wheel
Finally, we would like to point out that nothing trumps experience. This is why it is so important to collect and document lessons learned. Rather than merely generalizing about what went right and what went wrong, dig deeper: specifically analyze every issue the team struggled with and challenge the decisions that were made. Which strategy worked, and why did it succeed? What was the difference between actual and budgeted costs? Which tool was best suited for which task? The only way to continuously improve test management for your medical device is to review the details.
Do you want to learn more about this topic? Then be sure to catch our colleague Dr. Julia Scherrer's presentation at the Embedded Testing conference on February 27 at 4:20 pm in the Holiday Inn Munich-Unterhaching. You will also have the opportunity to put your questions directly to Dr. Scherrer in a one-on-one session. We look forward to seeing you there!
Curious now? Then don't miss out: subscribe to our newsletter here and follow ITK-Engineering.