Actual Quality is the Best Measure of Quality

Executive Summary

There's a lot of discussion in MedTech about how difficult it is to integrate modern software development tools and techniques into the medical device lifecycle. Traditional medical device design and development practices built over decades work around the unique constraints of building proprietary physical hardware. Modern software development has matured with very different tools and techniques because modern software's biggest challenge is managing the seemingly infinite complexity of what we can build rapidly.

As a result, we are in a situation where software people are from Mars and non-software medical device people are from Venus. This is arguably most evident when we examine these two groups' different approaches to documenting compliance. The software people complain that the MedTech people want too much useless documentation that does not contribute to safety and effectiveness. In contrast, the MedTech people complain that the software people don’t appreciate the reason we must capture all this documentation.

The good news is that while there is a communication gap between these two sides, the gap we need to bridge in actual approaches to success is far narrower than many have come to believe. We’ve gotten ourselves wrapped around the axle about something solvable. Here’s why: Take any software development team in MedTech. Then, invite someone from a top-tier tech company in the Bay Area to do a maturity assessment of the MedTech software team's practices. These practices include test-driven development (TDD), continuous integration/continuous deployment (CI/CD), DevOps, and microservices-based architectures. In our opinion, if that MedTech software team gets high marks in that assessment, they are already 80-85% of the way to where they need to be for high-quality MedTech compliance!

The main thing left to do is integrate an extra layer of risk-based approaches into their practices. Once the last 15-20% of integration is in place, that team is positioned to raise the organization’s overall bar for compliance, safety, and effectiveness.

The rest of this article provides a fuller picture of our position and illustrates success through the example of quality and compliance documentation.

In regulated MedTech, why do we talk so much about quality?

In MedTech, when we build a medical device, it is supposed to diagnose, monitor, and/or treat illness to a pre-specified level of performance. In short, we work to ensure that the device does exactly what it promises and does not cause harm by doing things it is not supposed to do. At the end of the day, when a device performs as intended/promised and works safely and effectively, THAT is the definition of a high-quality device.

There's a lot of discussion in MedTech about how we can make modern software work together with our traditional compliance methods designed to deliver safety and effectiveness. Of course, we can't just push a device into the market and hope it turns out to be high-quality. So, we must find ways to drive quality through a medical device's total product life cycle (TPLC).

We do that through a combination of laws, regulations, quality systems, and compliance monitoring (LRQC as an acronym):

  • Laws: In the US, this goal is enshrined in the laws that authorize the FDA’s regulation of medical devices, notably including Section 520 of the Federal Food, Drug, and Cosmetic Act (FDCA) and the Medical Device Amendments of 1976.
  • Regulations: Those laws have been elaborated on with (numerous) regulations from the FDA, most notably in Title 21 of the Code of Federal Regulations.
  • Quality Management: A key component of the regulations in Part/Title 21 is the requirement that manufacturers establish a Quality Management System (QMS) to consistently implement an effective combination of people, processes, and technology.
  • Compliance: As manufacturers, we do a lot of work to double-check that our work complies with the QMS, thereby ensuring compliance with the relevant regulations and laws.

LRQC is a proxy

All of this law/regulation/quality/compliance (LRQC) effort is a means to an end: ensuring devices are consistently safe and effective.

LRQC is designed to incentivize and enable behaviors and provide proof of adherence to those behaviors. It’s a proxy because if people do these things, it should lead to safe and effective devices. This is a reasonable and proven approach. We’ve built our entire industry (and basically every safety-critical regulated industry, including aviation, automotive, and nuclear power) around this. Still, in the end, we must remember that achieving compliance with laws, regulations, and quality systems is not the same as achieving quality. Achieving quality is achieving quality. (Quality is quality!)

The Natural Drift of Focus from Achieving Quality to Achieving Compliance

Unfortunately, many organizations across many industries can become focused on the compliance process over the quality outcome. Part of the problem, in our opinion, is that it is already hard to build a medical device under LRQC. It’s human nature to get so involved in the process that we lose sight of whether our actions are the best proxies for achieving true quality.

It’s worth regularly asking ourselves if the time we spend on LRQC is actually leading us to safe and effective devices. Or do some of our activities become distractions from achieving quality? If something is not adding value toward achieving quality, it subtracts value by taking time and focus away from achieving quality.

This brings to mind a Doonesbury comic published September 18, 1990, set in a foxhole in the First Gulf War. The joke is that US soldiers have to spend so much time simply drinking water to stay hydrated that they don’t have time to fight the war.


DOONESBURY © G. B. Trudeau. Reprinted with permission of ANDREWS MCMEEL SYNDICATION. All rights reserved.

All of this brings us to the use of modern software in the operation of medical devices.

Quality Management Systems were first developed to support the kinds of engineering disciplines that more traditionally went into medical devices: mechanical, materials, and electrical, all centered on the physical manipulation of biology, chemistry, and physics. These approaches worked pretty well because physical things are much harder to bend or contort.

Another way of looking at it is that quality management systems came from the manufacturing world, where a consistent process resulted in a consistent product.

However, in software engineering, every ‘product’ is different. The “magical” element of software and data is that they are malleable, i.e., they are much easier to change or move around than in the physical world.

This is one reason why Agile methodologies are so popular: they cater to variability by being flexible in delivering software.

Comparing Software Engineering Quality and Traditional MedTech Quality

Any experienced non-software MedTech person might form an opinion of software based on their daily interaction with the software they use. They may reasonably say,

“Hey, MedTech systems need to be rock solid, dependable, and trustworthy. I use the software every day in most of my professional and personal activities, and while software can be awesome, it’s not reliable like my medical devices.”

Reports in the media don’t help with that perception either. Just look at the nonstop stream of announcements about massive cybersecurity breaches like Optum (Change Healthcare) or major system outages like Southwest Airlines' multi-day meltdown. After all, isn’t software the industry that invented the phrase "Move Fast and Break Things" as if that was a good outcome?

So, while a MedTech professional will understand that we have new and different challenges in applying LRQC to newer software technologies such as smartphone apps, AI, and the cloud, they may be skeptical when software people tell them that software quality methods are as good as MedTech quality methods.

The truth is that while the software we use in every part of our professional and personal lives doesn’t “feel” as reliable as what we use in MedTech, that’s generally because that was never the goal of software. Consumers have accepted a tradeoff where they get powerful features that don’t always work consistently, smoothly, or in ways that make sense to us as users.

Ultimately, the goal is “good enough” quality, not the absolute best level of quality. Good enough quality is not a disparaging term. It’s simply achieving ‘consistent’ quality at whatever level consumers require. A QMS, therefore, is just a system of management that facilitates the quality expected by a company's customers/consumers. If a company consistently produces apps with great features but also some bugs, and that is what its customers expect, that can still be a success.

However, don’t let that mislead you; software engineering as a discipline has a powerful and long relationship with quality. We have very sophisticated methods of software quality assurance at our disposal that can be employed to ensure that software solutions perform as intended.

Just think about this: When did you last go to Google.com, and the website was down? Or the last time you went to Amazon.com but couldn’t complete a purchase due to a system outage? While these companies are not regulated in the way MedTech is, they have substantial financial incentives to provide incredibly high levels of system reliability. And they achieve that through a range of very powerful approaches.

Without going into a long digression about the roots of quality as a discipline, it suffices to say that you'll find very common DNA as you trace back the origins of the software quality movement and the industrial quality movements. Both lead back to pioneers such as Deming, methods such as statistical quality control, and mitigations against human neuroscience such as Human Factors design and testing.

Shifting Technical Documentation from Documents to Data

IT has done a fantastic job helping their businesses move from manual and document-centric processes to automated and data-centric ones. After all, this is why 21 CFR part 11 came about: to provide that guidance around electronic records. What is crazy is that all too often, IT has not done this for themselves! For example, too many IT teams are still creating User Requirements Specifications in document form rather than working with electronic records in tools like JIRA.

In the SW development field (outside of MedTech), there has been a steady evolution from written to data-driven documentation over the years. For example, in the earlier years of Waterfall-driven software development, requirements and architectural design documents were created in documentation-style forms (e.g., a word processing document template used in Microsoft Word or a diagram put together in PowerPoint using stock images and symbols.) The same was true for software test cases and the subsequent documentation of test results. But over the last two+ decades, this has evolved tremendously with agile methods and tools!

One of the core principles of The Agile Manifesto in 2001 was that “Working software is the primary measure of progress.” That means that rather than creating documents representing what the software code would do, we would focus on making the actual software in small, controlled development steps. This would allow us to see the exact software and quickly determine if it met its intended needs in terms of functionality and proper operation (i.e., defect-free). That was followed by the emergence of agile-inspired software development tooling built around those principles.

Software development documentation was increasingly captured not in documents but in data captured in databases. Content in Word documents that once captured requirements and test cases shifted to being part of systems like JIRA and Confluence, which were more structured and dynamically linked to the actual software code.

Things took another leap forward in the shift from documents to data with the adoption of cloud computing, coupled with the adoption of DevOps as the Agile+Cloud evolution of traditional computing infrastructure management. Suddenly, you have things like Infrastructure as Code, where instead of standing up a server with specific hardware, OS installs, and configurations, you do that virtually with code, since the server is actually “logical” and not “physical.” Infrastructure as code speaks for itself as documentation, and when there’s a need to explain why the code does certain things, that can be added with in-line comments that say, “The code is written to do x because of y reason.”

Similarly, with the rise of cloud computing, DevOps, and Cybersecurity, the importance of telemetry in computing (e.g., log files) has risen. Because storage is so cheap, we can “instrument” everything that happens on a computer and keep a record of it for monitoring and analysis.

We see the impact of this within the automation of compliance verification:

Once upon a time, if we wanted to know if someone correctly configured a specific server setting for cybersecurity, they would check it off on a checklist and maybe even capture a screenshot of the configuration in a GUI window.

Similarly, if we wanted to document that we periodically retested security, we would record this as a manual exercise somewhere.

Today, however, that documentation is all automatically recorded in a log file. Queries can be performed to present documented evidence. This is far more accurate than relying on a human to check off items in a checklist manually.
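To make this concrete, here is a minimal sketch of that pattern in Python (stdlib only). The setting names, values, and log path are invented for illustration: each configuration check appends a structured log record, and audit evidence is produced by querying those records rather than by a human initialing a checklist.

```python
import json
from datetime import datetime, timezone

LOG_PATH = "compliance_events.jsonl"  # hypothetical log location

def record_check(setting: str, expected, actual) -> dict:
    """Append one structured compliance-check record to the log."""
    event = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "setting": setting,
        "expected": expected,
        "actual": actual,
        "compliant": expected == actual,
    }
    with open(LOG_PATH, "a") as f:
        f.write(json.dumps(event) + "\n")
    return event

def evidence_query(setting: str) -> list:
    """Return every recorded check for one setting -- the audit evidence."""
    with open(LOG_PATH) as f:
        events = [json.loads(line) for line in f]
    return [e for e in events if e["setting"] == setting]

# Simulated checks: the second one catches a drifted password policy.
record_check("tls_min_version", "1.2", "1.2")
record_check("password_min_length", 12, 8)

for e in evidence_query("password_min_length"):
    print(e["timestamp"], e["setting"],
          "compliant" if e["compliant"] else "NONCOMPLIANT")
```

Each record carries its own timestamp, so "when was this last checked, and what did we find?" becomes a query instead of an archaeology project.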

While you can’t do everything with automation, there is A LOT you can do with automation. This is why the software marketplace for logging/telemetry/automation tooling has grown so much.

However, we still see today that a QMS often mandates manual checklists rather than allowing IT to leverage the automation tooling they already have at their disposal.

Putting It All Together

Looking back at the bigger picture, LRQC is a scaffolded system of systems designed to ensure that safe and effective devices are consistently created. Over the years, MedTech has developed a broad set of tools and techniques grounded in biology, chemistry, physics, and the engineering disciplines that deal with tangible, physical products. However, as we’ve discussed, modern software technologies, such as smartphone apps, cloud services, and AI, are, by their very nature, more flexible and adaptable. Yet the traditional mechanisms of LRQC aren’t necessarily the best ways to handle software’s dynamic and malleable nature.

However, the core requirement remains the same: we must maintain safety and effectiveness.

The complicated part is merging these two domains in a way that does not involve simply smashing them together. Many experts on LRQC already “know what they know” through hard-won experience, and they’re not necessarily trained or inclined to keep asking, “why?” Meanwhile, software teams use a range of foreign-looking and foreign-sounding practices. Ultimately, bridging this divide requires mutual understanding and a willingness to learn from one another.

Automating Quality and Compliance in a Way That Truly Facilitates Quality

Historically, evidence of compliance in MedTech has almost always been captured in document form: Word documents, PDFs, or manually maintained spreadsheets. There’s a definite comfort factor here: LRQC auditors and regulators have grown accustomed to documents as the de facto method of proving adherence to quality standards. This has worked well enough for hardware-centric devices and remains the norm in many organizations.

However, as software-based devices emerge, there’s a significant opportunity to evolve from a purely document-centric approach to a more data-centric one.

This shift is the latest part of a decades-long trend toward digitizing everything. Once, we relied on a Word template for standard operating procedures (SOPs) or typed out test records in Excel. Now, we can capture and manage that information in structured, automated ways.

Two Domains of QMS Content

When it comes to quality management, there are two key domains of information to consider:

  1. Information on what we’re supposed to do to stay in compliance. This includes content like SOPs, forms, checklists, and work instructions that spell out the steps and responsibilities for any given process or activity.
  2. Information that proves we did what we were supposed to do. These records or logs provide documented proof of each step performed, demonstrating fidelity to the prescribed process.

Transitioning documentation into structured, data-driven formats opens up new possibilities. Information is no longer “trapped” in unstructured word processing documents; instead, it can be automatically captured, traced, and, if needed, validated.

Infrastructure as Code

A perfect example of possible modernization is using infrastructure as code (IaC). In the early 2000s, an SOP might detail every last step for managing a server down to OS configuration, data backup procedures, and step-by-step screenshots. This approach led to the creation of massive word-processing documents that constantly needed updating whenever the software or OS changed. The server setup might drift away from the official “reference” document, leaving the SOP woefully outdated.

By contrast, IaC solutions (e.g., Terraform, AWS CloudFormation, Ansible) let you specify precisely how a server or environment is configured in code. That code essentially documents itself. So, rather than writing an SOP that restates the infrastructure configurations, your SOP can simply reference the IaC scripts. Since the configuration and code are always in sync, you minimize the risk of outdated documentation. Updates or security patches get reflected in the code repository, and the SOP only needs to say, “Follow the IaC script at [this repository link],” possibly with explanatory notes or comments embedded right in the code.
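As a small, hedged illustration of the “code that documents itself” idea, the sketch below builds a minimal CloudFormation-style template in Python. The resource name and properties are illustrative, not a real deployment; the in-line comment shows where an SOP-style “why” note would live directly in the code.

```python
import json

# Hypothetical, minimal CloudFormation-style template built in code.
# The logical name "AuditLogBucket" and its properties are illustrative.
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Description": "Audit-log storage bucket, defined as code.",
    "Resources": {
        "AuditLogBucket": {
            "Type": "AWS::S3::Bucket",
            "Properties": {
                # Encryption and versioning are part of the definition
                # itself, so the SOP can simply point at this file
                # instead of restating the configuration in prose.
                "BucketEncryption": {
                    "ServerSideEncryptionConfiguration": [
                        {"ServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}
                    ]
                },
                "VersioningConfiguration": {"Status": "Enabled"},
            },
        }
    },
}

print(json.dumps(template, indent=2))
```

Because the template is version-controlled alongside the rest of the code, every change to the configuration automatically carries an author, a timestamp, and a reviewable diff.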

Because we can now automate many tasks, we can regularly apply new insights or best practices across all cloud regions and tackle emerging cybersecurity threats much faster without having documentation become a maintenance burden. Manual steps and checklists that used to be updated quarterly can now be updated continuously in a controlled, versioned code environment.

Documenting Proof of Compliance

On the evidence side, the story is similar. Where we once used checklists to confirm steps were completed, we can now rely on structured log files that automatically capture proof of each event. For instance:

  • Instead of having a line in a Word document checklist that says, “Run spell check before document sign-off,” imagine a system log that can definitively record, “Spell check was executed at 2:17 PM on January 10, 2025, and sign-off occurred at 2:19 PM on January 10, 2025.”
  • Instead of manually verifying that only active employees have login credentials, an automated script can check daily for ex-employees and log the results, complete with timestamps and system verifications.
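The second bullet might look something like the following sketch. The roster and account names are invented for illustration; in practice, the inputs would come from an HR system and a directory service, and the returned record would be appended to a log like the ones described above.

```python
from datetime import datetime, timezone

def audit_accounts(active_employees: set, system_accounts: set) -> dict:
    """Compare system accounts against the HR roster; flag orphans."""
    orphaned = sorted(system_accounts - active_employees)
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "accounts_checked": len(system_accounts),
        "orphaned_accounts": orphaned,
        "compliant": not orphaned,
    }

# Illustrative data: 'mallory' has left the company but still has a login.
hr_roster = {"alice", "bob", "carol"}
logins = {"alice", "bob", "carol", "mallory"}
print(audit_accounts(hr_roster, logins))
```

Run daily on a schedule, this produces a timestamped, queryable record of every check, including the ones that found nothing wrong.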

So, which kind of proof should be more trustworthy for an auditor or regulator: a manually initialed checklist or a system-generated log that precisely details when each step was taken? In most cases, the latter is more accurate, timely, and secure.

Still, because these machine-generated logs don’t look like a traditional paper document, many quality and regulatory professionals may reasonably fear that digitally-created log files may not pass muster with an auditor or regulatory inspector. That fear, while understandable, is increasingly out of sync with how modern industries operate, especially those that need near 100% uptime (e.g., trading platforms, payroll systems, streaming services). These other industries have demonstrated that these methods are robust, reliable, and produce a clear, traceable activity record. We believe that MedTech can realize those same advantages once we become comfortable with this type of documentation.

A Real-World Example

Here’s how it might look in practice:

  • Before AWS became ubiquitous, one of us (Ian) helped move Merck’s medical devices to Agile methodologies, using Jama for requirements instead of JIRA.
  • Today, many teams do the same with JIRA: rather than having a giant User Requirements Specification (URS) document, you control and approve each story directly within JIRA.
  • Installation Qualifications (IQ) are automated through DevOps pipelines.
  • Operational Qualifications (OQ) are captured as structured test data rather than Word documents.
  • That data flows into a data lake, automatically preserving a traceability matrix for audits.
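A toy sketch of how a traceability matrix can fall out of such structured test data (the requirement and test IDs are invented for illustration; real records would come from the requirements tool and the test pipeline):

```python
# Hypothetical records as they might flow out of a requirements tool
# and a test-execution pipeline.
requirements = ["REQ-1", "REQ-2", "REQ-3"]
test_runs = [
    {"test": "OQ-101", "covers": "REQ-1", "result": "pass"},
    {"test": "OQ-102", "covers": "REQ-2", "result": "pass"},
    {"test": "OQ-103", "covers": "REQ-2", "result": "fail"},
]

def traceability_matrix(reqs, runs):
    """Map each requirement to its test results; an empty list = coverage gap."""
    matrix = {r: [] for r in reqs}
    for run in runs:
        matrix.setdefault(run["covers"], []).append((run["test"], run["result"]))
    return matrix

matrix = traceability_matrix(requirements, test_runs)
for req, results in matrix.items():
    print(req, "->", results if results else "NOT COVERED")
```

Because the matrix is derived from live data rather than maintained by hand, a coverage gap (REQ-3 above) surfaces immediately instead of at audit time.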

Wherever possible, we replace manual QA gates with self-documenting automation.

Far from being a “shortcut,” these approaches often enhance quality. They can remove human error, reduce rework, and enable rapid iteration, ultimately speeding development while improving the accuracy of the final deliverables.

An added benefit is that integrating cross-functional efforts like cybersecurity, privacy, risk assessments, and human factors engineering becomes far more straightforward. When these activities feed into the same data-driven frameworks, you reduce the friction and overhead of maintaining multiple parallel processes. The complexity of the work doesn’t vanish, but the “gear grinding” that often plagues large teams is mitigated, improving efficiency and morale.

Why Is This So Hard in Practice?

If all of this is so beneficial, why do these efforts often meet resistance within MedTech organizations?

In our opinion, it comes down to a culture clash. Everyone agrees that safety and effectiveness are paramount but disagrees about “how” to get there:

  • LRQC professionals have spent careers mastering the letter of the regulations. They’re experts in guiding organizations through audits, inspections, and the many forms of compliance scrutiny. The idea of pivoting away from a known and trusted approach to something unfamiliar can feel risky or even dangerous.
  • Software teams see the friction and overhead of maintaining traditional documentation and push for more automation and tooling. They grow frustrated when Q&R teams can’t immediately appreciate that automated logs provide as much (if not more) clarity than a PDF checklist.

It’s like a theological debate over “what’s canon.” In LRQC, some individuals see only the original regulations text as the unassailable truth. In contrast, others believe these rules must be interpreted in context and updated to reflect modern realities. It’s not so different from the debates over biblical translations, the Constitution's original intent, or which edited version of Star Wars Episode IV is “the real one.” We’re not here to pick sides in those debates, but the parallel is clear: people gravitate to what they see as the authoritative source. They are naturally wary of reinterpretation even when new interpretations might make sense in light of evolving technology.

The Way Forward: A Middle Ground

The encouraging news is that there is a middle ground that can satisfy both sides… if everyone is willing to learn a bit from each other.

LRQC teams (attorneys, regulatory professionals, quality managers) need to:

  • Embrace new formats of evidence and documentation and accept that code, logs, and other data-driven artifacts can be just as trustworthy, if not more so, than a PDF or Word document.
  • Understand that “dynamic” records in a database can be more accurate than a static snapshot, and that snapshots can be automatically generated on demand for an audit.

Engineers and developers need to:

  • Recognize that Q&R teams need frozen snapshots of truth at specific points in time, whether it’s a release milestone or a regulatory submission. Yes, a version-controlled database or code repository is excellent, but a separate, explicit snapshot (e.g., a PDF report) is sometimes required to fulfill a well-established auditing procedure.
  • Instead of complaining about this step, automate it! Let scripts generate the PDFs or CSV exports whenever the QMS calls for them. Remember: disk storage is cheap, and automation can transform these tasks from painful to trivial.
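That automation can be as simple as the following sketch, which freezes live records into a timestamped CSV snapshot on demand. The record fields and file-naming scheme are assumptions for illustration, not a prescribed format.

```python
import csv
import io
from datetime import datetime, timezone

def export_snapshot(records: list) -> str:
    """Render live records as a frozen, timestamped CSV snapshot on disk."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=sorted(records[0]))
    writer.writeheader()
    writer.writerows(records)
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    filename = f"qms_snapshot_{stamp}.csv"  # hypothetical naming scheme
    with open(filename, "w", newline="") as f:
        f.write(buf.getvalue())
    return filename

# Illustrative release-approval records pulled from a tracking system.
records = [
    {"story": "APP-41", "status": "approved", "approver": "qa_lead"},
    {"story": "APP-42", "status": "approved", "approver": "qa_lead"},
]
print("Snapshot written to", export_snapshot(records))
```

Wired into a pipeline, the same function can run at every release milestone, so the “frozen snapshot” the auditor wants is a byproduct of normal work rather than a manual chore.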

In other words, meeting in the middle involves merging the best of both worlds. You keep the speed and flexibility of software-driven processes while delivering the documented, auditable checkpoints LRQC professionals rely on. Over time, as the organization grows more comfortable with automated proofs of compliance, the fear subsides.

Conclusion and Call to Action

The MedTech world is at the intersection of cutting-edge technology and rigorous regulatory oversight. Software-driven devices are no longer a novelty; they’re becoming the norm. We must adapt our compliance and documentation strategies to align with modern software practices to keep pace with this evolution and ensure we’re still delivering truly high-quality (safe and effective) products.

  • Q&R professionals: Educate yourselves on DevOps, Infrastructure as Code, and automated logging. Explore how these can improve documentation fidelity and risk mitigation.
  • Software teams: Learn to speak the language of compliance. Understand why snapshots, versioning, and explicit sign-offs matter in a regulated space and build automated pathways to provide these artifacts seamlessly.

This shift doesn’t mean reinventing the regulatory wheel. It’s an opportunity to improve our processes with the powerful tools software offers, ensuring that “quality” remains focused on actual product performance and safety, not just checking boxes in a static document. By working together and taking deliberate steps toward a more data-centric approach, we can bring out the best in LRQC and software innovation, delivering reliable, compliant, and groundbreaking medical devices in an ever-evolving landscape.

Now is the time to ask yourself, “How can my team start implementing these changes?” Whether that means spinning up a pilot project that uses IaC for infrastructure management or automating a single compliance log, small steps will help you gain confidence, build momentum, and demonstrate how much value this shift can bring to MedTech organizations.

The result? Less friction, more accuracy, and an unwavering focus on actual quality: safe and effective devices that patients, clinicians, and regulators can trust.


