Why Improved Data Management, Not openEHR, Will Solve Healthcare’s Interoperability Challenges
Tito Castillo (FBCS CITP CHCIO)
Enterprise Architect & Data Management Consultant
The Promise and Limitations of openEHR
The promise of openEHR as a solution to healthcare interoperability has generated significant enthusiasm. Its archetype-based approach offers a standardized way to model clinical data, and its open nature suggests freedom from vendor lock-in. Yet, as the healthcare industry grapples with the complexities of interoperability, it’s becoming clear that openEHR alone is not the silver bullet many hope for. The real key to unlocking interoperability lies not in the widespread adoption of any single standard but in the foundational improvement of data management practices.
At its core, openEHR is designed to tightly couple the persistence layer (how data is stored) with the data exchange model. While this approach has merits, it also introduces significant constraints. Organizations that adopt openEHR must commit to a specific way of storing and managing data, which may not align with their existing systems or future needs. This coupling creates a form of standards lock-in: organizations become so dependent on a single standard that their ability to adapt, or to integrate with systems built on other standards, is constrained.
Moreover, openEHR’s complexity cannot be overlooked. Its archetype-based modeling is powerful but requires significant expertise to implement and maintain. For smaller organizations or those with limited resources, this steep learning curve can be a barrier to adoption. Even for larger organizations, the effort required to fully leverage openEHR’s capabilities may divert resources from other critical areas, such as improving data governance or investing in metadata management.
Perhaps the most significant limitation of openEHR is its focus on healthcare data. While this makes it well-suited for clinical use cases, it falls short in addressing the broader interoperability challenges of the future. Healthcare is increasingly intersecting with other domains, such as social care, environmental data, and public health. True interoperability will require the seamless integration of healthcare data with these non-healthcare datasets, a task for which openEHR is not ideally suited. In cross-domain scenarios, the ability to map, transform, and harmonize diverse data models is paramount, and this is where openEHR’s tightly coupled approach struggles.
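What that mapping work looks like in practice may be easier to see in code. The sketch below is a deliberately simplified Python illustration of harmonizing a clinical reading and an environmental reading into one canonical structure; every field name, code, and source system in it is an assumption made for illustration, and a real programme would drive such mappings from governed metadata and shared terminologies rather than hard-coded functions.

```python
from dataclasses import dataclass
from datetime import datetime

# Illustrative canonical structure only; the shape and field names are
# assumptions, not taken from any published model.
@dataclass
class CanonicalObservation:
    subject_id: str      # person, household, or location identifier
    observed_at: datetime
    category: str        # e.g. "clinical", "environmental", "social-care"
    code: str            # code from an agreed terminology
    value: float
    unit: str
    source_system: str   # provenance: where the record came from


def from_clinical_record(rec: dict) -> CanonicalObservation:
    """Map a hypothetical clinical export row into the canonical form."""
    return CanonicalObservation(
        subject_id=rec["patient_id"],
        observed_at=datetime.fromisoformat(rec["recorded"]),
        category="clinical",
        code=rec["snomed_code"],
        value=float(rec["value"]),
        unit=rec["unit"],
        source_system="hospital_ehr",
    )


def from_air_quality_record(rec: dict) -> CanonicalObservation:
    """Map a hypothetical environmental sensor row into the canonical form."""
    return CanonicalObservation(
        subject_id=rec["postcode"],
        observed_at=datetime.fromisoformat(rec["timestamp"]),
        category="environmental",
        code="pm2.5",
        value=float(rec["pm25"]),
        unit="ug/m3",
        source_system="air_quality_feed",
    )
```

The point is not the code itself but where the difficulty sits: agreeing the canonical shape, the codes, and the provenance rules is a data management exercise, not something any single exchange standard resolves.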
The Risk of Standards Lock-In
The concept of standards lock-in is becoming increasingly relevant in discussions about healthcare interoperability. While vendor lock-in—dependency on a specific vendor’s products or services—is a well-known challenge, standards lock-in refers to the risk of becoming overly dependent on a particular standard, even if it is open and vendor-neutral. This can limit an organization’s flexibility and ability to adapt to new technologies or integrate with systems that use different standards.
For example, organizations that adopt openEHR may find themselves locked into its ecosystem of tools, vendors, and expertise. While openEHR is an open standard, its tightly coupled approach to persistence and data exchange can make it difficult to integrate with systems that use different standards. This creates a form of standards lock-in, where the organization’s ability to innovate or adapt is constrained by its deep integration with a specific standard.
Standards lock-in is particularly concerning in a rapidly evolving healthcare landscape, where new standards and technologies are constantly emerging. Organizations that become locked into a specific standard may struggle to keep pace with these developments, limiting their ability to achieve true interoperability.
Contrasting openEHR with ISO 13606 and FHIR
To fully understand the limitations of openEHR, it’s helpful to contrast it with other standards like ISO 13606 and FHIR. Each of these standards has its strengths and weaknesses, and they serve different purposes in the interoperability landscape.
openEHR: A Comprehensive but Constrained Approach
openEHR is designed to provide a comprehensive solution for clinical data modeling and persistence. Its archetype-based approach allows for detailed and reusable clinical knowledge representation, making it well-suited for complex healthcare use cases. However, its tight coupling of persistence and data exchange limits flexibility, and its focus on healthcare data makes it less suitable for cross-domain integration.
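To give a rough, non-authoritative picture of what "archetype-based" two-level modeling means, the toy Python sketch below separates a generic data element (standing in for a reference model) from a reusable constraint on it (standing in for an archetype). It is not the openEHR reference model or Archetype Definition Language, just an illustration of keeping clinical knowledge outside application code.

```python
from dataclasses import dataclass
from typing import Optional

# Stand-in for a reference model: a generic, stable building block that
# knows nothing about any particular clinical concept.
@dataclass
class Element:
    name: str
    value: float
    units: str


# Stand-in for an archetype: reusable clinical knowledge expressed as a
# constraint on the generic model, maintained separately from application code.
@dataclass
class ElementConstraint:
    name: str
    units: str
    min_value: float
    max_value: float

    def validate(self, element: Element) -> Optional[str]:
        """Return an error message if the element violates the constraint."""
        if element.name != self.name:
            return f"unexpected element '{element.name}'"
        if element.units != self.units:
            return f"expected units '{self.units}', got '{element.units}'"
        if not (self.min_value <= element.value <= self.max_value):
            return f"value {element.value} outside {self.min_value}-{self.max_value}"
        return None


# The clinical knowledge can be revised without touching the generic model
# or the code that stores and retrieves Elements.
systolic_constraint = ElementConstraint("systolic", "mm[Hg]", 0, 300)
reading = Element("systolic", 122, "mm[Hg]")
assert systolic_constraint.validate(reading) is None
```

The expressive power is real; so is the expertise needed to build and govern a library of such constraints at scale, which is the trade-off described above.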
ISO 13606: A Middle Ground Between FHIR and openEHR
ISO 13606 occupies a middle ground between FHIR’s simplicity and openEHR’s complexity. Like openEHR, it is based on a detailed clinical modeling approach, but it concentrates on data exchange rather than prescribing a specific persistence layer. That narrower scope is why ISO 13606 can be equally valid, or even preferable, in scenarios that require detailed clinical modeling without committing an organization to a particular storage architecture.
FHIR: The Modern Standard for Interoperability
FHIR (Fast Healthcare Interoperability Resources) has emerged as the leading standard for healthcare interoperability. Built on modern web technologies (JSON, XML, RESTful APIs), FHIR is lightweight, extensible, and easy to implement.
Unlike openEHR, FHIR does not prescribe a specific persistence model, allowing organizations to use any underlying data storage approach. FHIR’s modular structure (resources) makes it highly adaptable to different use cases, and its broad industry support ensures a large ecosystem of tools and resources. However, FHIR’s simplicity can be a limitation in scenarios requiring detailed clinical modeling, where an archetype-based approach may be more suitable.
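A minimal sketch may help show what FHIR’s lightweight, resource-based style looks like: a single Observation resource expressed as JSON-like Python and posted to a FHIR server’s RESTful endpoint. The server base URL here is a placeholder, the client is the widely used requests package, and error handling is kept to the bare minimum; treat it as an illustration rather than a reference client.

```python
import requests  # assumes the 'requests' package is available

# A minimal FHIR R4 Observation: resourceType, status, and code are the
# essential elements; the LOINC code 8867-4 denotes heart rate.
observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {
        "coding": [{
            "system": "http://loinc.org",
            "code": "8867-4",
            "display": "Heart rate",
        }]
    },
    "subject": {"reference": "Patient/example"},
    "valueQuantity": {
        "value": 72,
        "unit": "beats/minute",
        "system": "http://unitsofmeasure.org",
        "code": "/min",
    },
}

FHIR_BASE = "https://fhir.example.org/r4"  # placeholder server base URL

response = requests.post(
    f"{FHIR_BASE}/Observation",
    json=observation,
    headers={"Content-Type": "application/fhir+json"},
)
response.raise_for_status()
print("Created Observation with id:", response.json().get("id"))
```

Nothing in this exchange dictates how the receiving server stores the data, which is precisely the decoupling of exchange from persistence discussed above.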
The Need to Consider Data Across Its Full Life Cycle
A critical oversight in many interoperability discussions is the failure to consider data across its full life cycle. Data management is not just about exchanging data; it is about managing data from creation, through storage, use, and sharing, to archiving and eventual disposal, in a way that ensures quality, consistency, and usability. That demands a holistic approach that addresses every stage of the life cycle.
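As a small illustration of what treating the life cycle as a first-class concern can look like, the Python sketch below attaches governance and life-cycle metadata (owner, provenance, access history, retention) to a data asset. The field names and the retention rule are assumptions made for illustration, not drawn from any particular policy or standard.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class DataAsset:
    name: str
    owner: str                       # accountable data owner (governance)
    created: date
    source_system: str               # provenance: where the data originated
    quality_checked: bool = False    # has it passed agreed quality rules?
    retention_years: int = 8         # assumed retention period
    access_log: List[str] = field(default_factory=list)

    def record_access(self, who: str) -> None:
        """Log usage so stewardship remains auditable throughout the life cycle."""
        self.access_log.append(who)

    def due_for_disposal(self, today: date) -> bool:
        """Disposal is a planned life-cycle stage, not an afterthought."""
        return today.year - self.created.year >= self.retention_years
```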
Why People Are Attracted to openEHR Systems
Given the limitations of openEHR, why are so many people attracted to it as a solution? The answer lies in a combination of factors, including the appeal of a seemingly turnkey solution, the complexity of improving core workforce capabilities, and the allure of avoiding difficult organizational changes.
First, openEHR presents itself as a comprehensive, ready-made solution. Its archetype-based approach promises to standardize clinical data modeling and eliminate the need for custom integrations. For organizations overwhelmed by the complexity of interoperability, this can be an attractive proposition. It’s easier to adopt a system that appears to solve multiple problems at once than to undertake the hard work of improving data management practices across the organization.
Second, improving core workforce capabilities is a daunting task. It requires significant investment in training, change management, and cultural transformation. Many organizations lack the expertise or resources to undertake such an effort, especially when faced with competing priorities and budget constraints. In contrast, adopting a new system like openEHR can feel like a more manageable and immediate solution, even if it doesn’t address the underlying challenges.
Third, openEHR’s promise of vendor independence and open standards resonates with organizations that have been burned by proprietary systems in the past. The idea of breaking free from vendor lock-in is appealing, even if it means adopting a new set of constraints. However, as we’ve seen, openEHR’s tightly coupled approach can create its own form of standards lock-in, limiting flexibility and adaptability in the long term.
Finally, there is a natural human tendency to seek out technological solutions to complex problems. It’s easier to believe that a new system or standard will solve interoperability challenges than to confront the reality that these challenges are rooted in organizational and cultural issues. Improving data management requires a shift in mindset, a commitment to long-term investment, and a willingness to address difficult questions about data ownership, governance, and quality. For many organizations, this is a harder path to take.
Conclusion
While openEHR and other standards have an important role to play, they are not the ultimate solution to healthcare’s interoperability challenges. The real solution lies in improving data management capabilities. By focusing on data governance, metadata management, semantic interoperability, data integration, and provenance tracking, healthcare organizations can build interoperable systems that are flexible, scalable, and future-proof. This approach not only addresses the limitations of standards like openEHR but also ensures that healthcare systems are ready to meet the interoperability challenges of the future, both within and beyond the healthcare domain.
The attraction to openEHR systems is understandable, but it reflects a desire for quick fixes rather than a commitment to addressing the root causes of interoperability challenges. True progress will require organizations to invest in their core workforce capabilities, embrace a culture of data excellence, and recognize that interoperability is not just a technical problem but a data management problem. Only then can we hope to achieve the seamless, cross-domain interoperability that the future of healthcare demands.
Author: Dr Tito Castillo FBCS CITP CDMP CHCIO
Tito is the founder of Agile Health Informatics Ltd, a specialist health and care IT consultancy service. He is also a Board Member of the British Computer Society Faculty of Health and Care (Strategy & Policy Lead).
Executive Director | CIO | CTO | Digital Transformation |
Don’t think it’s either; think it’s both, and over time.
Healthcare Architect & Technical Leader @ NTT DATA | Data-Driven Innovation
Interesting article, Tito. It’s interesting, as improved data management is absolutely key to solving healthcare’s interoperability challenges. I think there is one other ingredient needed in the mix to help ground some of the challenges: identifying the practical regional and national use cases that will drive the interoperability needs. This also starts to hint at "data as a product" thinking. So much to think about! Thanks for the article.
Author of 'Enterprise Architecture Fundamentals', Founder & Owner of Caminao
Taking from John O'Gorman: 'Like every other data interoperability challenge, the data itself is not the best place to try to solve them.' But semantic interoperability only goes half a mile; next, health care is the perfect example of the benefits of symbolic twins. https://caminao.blog/overview/knowledge-kaleidoscope/ea-symbolic-twins/
Chief Technology Officer at Savana - MSc, PhD Computer Science - AI in Biomedicine - Bioinformatics - Biomedical Informatics
Tito Castillo (FBCS CITP CHCIO) while I agree that data governance is key in the healthcare sector, I don’t see why or how that excludes openEHR. Once a health data persistence standard is adopted by the industry, there is no need for data exchange standards. I don’t quite understand the claim that the persistence and exchange models are tightly coupled in openEHR; I would rather say that they are the same model, which is good, should openEHR become a de facto standard. The key issue in health data management, which is in my opinion correctly addressed by openEHR, is that the knowledge about data is separated from application logic. That is, for me, the key strength of openEHR. The risk of standards lock-in is, in my view, small, as you can easily migrate archetypes to other metadata constructs. But we need clinical applications that make use of decoupled data knowledge constructs describing the complexity of health data, or vendor lock-in will continue to be a reality.
Disambiguation Specialist
Tito Castillo (FBCS CITP CHCIO) - Like every other data interoperability challenge, the data itself is not the best place to try to solve them. Standards at that (data) level collide with other standards so it's difficult to make and sustain progress. On the other hand, language, when properly and consistently managed, is a much more sensible place to start. Semantic interoperability is an 'upstream' undertaking.