Do Buzzwords Dream of Clearer Substance? The Journey of 'Digital Twins' Toward Becoming an Operational Concept
Jérôme Vetillard
Healthcare Innovation Leader | Business Transformation Expert | Leveraging Data & AI for Impactful Change
A fancy concept but still quite complex to understand and define
Perhaps I’m not fully engaging my LinkedIn audience (maybe I need to add a few emojis for a touch of color?), or perhaps the concept of 'digital twins'—though often seen as a buzzword—is still not widely understood in terms of practical, operational applications. Or both! While it’s a trendy term, there is a clear gap in understanding how digital twins can truly drive outcomes at scale.
Digital twin definition
Let’s unpack the semantics of 'Digital Twin'! It’s a term made up of two parts:
- What does 'Digital' mean? (It can mean many different things: IoT, wearables, 5G, SCADA, knowledge graphs, AI algorithms...)
- What exactly are we 'Twinning'? (And no, not related to Twinings Tea—just a fleeting thought prompted by my hot mug of tea!)
Each word carries multiple meanings depending on the industrial context and specific use cases, which is why the concept of a 'Digital Twin' itself is so richly polysemic.
"A Digital Representation of a Real Object: For many, the concept of a 'digital twin' is strongly associated with the 3D representation of a real-world object, especially with the rise of augmented and virtual reality. This immersive visualization has become a defining feature in how people perceive digital twins."
As Sébastien Brasseur noted in my post (link), a 2017 study by Negri et al. found that the most frequently occurring words were "physical" and "product," indicating a strong emphasis on a physical product-centered perspective of Digital Twins (DT).
Nonetheless, alternative definitions advocate that the potential of DT should extend beyond tangible physical products, encompassing processes, systems, and even entire organizations.
A review of the literature shows that researchers and practitioners have proposed diverse definitions and applications for Digital Twins (DT), each seeking to clarify their interpretation of the term. Some envision DTs as complex, real-time digital models with predictive and prescriptive capabilities, while others consider them as simple digital representations.
In our discussions with customers about "Digital Twins," it has become clear that there is no universally accepted definition of what a Digital Twin truly is/should be, nor a shared understanding of how it can drive ROI-proven initiatives at scale within their value chains and existing organizational structures.
This is typical in a blue ocean strategy, where "making the market" and educating customers to raise their "Digital Twin" maturity are essential. However, it’s important to account for this factor, as it will inevitably extend the sales cycle.
This conceptual ambiguity, however, risks DTs being dismissed as mere hype: the concept lacks clarity, and few ROI-proven business cases extend beyond a single use case.
Reflecting on Recent History: Remember Predictive Maintenance?
Predictive maintenance was once hailed as an obvious, high-ROI use case. Though not explicitly labeled as “Digital Twin”-based, it embodied the core concept: a digital representation of a complex physical object, aimed at forecasting part failures so operators could preemptively replace components and avoid costly production downtime. However, the business case ultimately faltered. Many comparisons optimistically assumed a leap from no maintenance at all to a fully predictive maintenance landscape—overlooking the fact that "scheduled maintenance" already existed as a robust practice to reduce downtime. This oversight significantly undermined the projected impact and led to inflated expectations.
The two components of a Digital Twin
Digital Twins (DT) comprise two primary components: a physical entity and a digital (or logical) counterpart. Rather than focusing solely on the digital model, DT integrates both the real-world system and its virtual representation, along with the interaction between them.
The physical component, known as the Physical Object (PO), represents the actual item or system—whether a device, product, hardware, or even a physical process operating in the real world.
The digital component, or Logical Object (LO), is a virtual model of the physical system, generated through software that integrates data and algorithms. Often referred to as a digital clone or replica, this model reflects the characteristics and behavior of the physical object—within specified parameters and to an extent chosen when architecting the Digital Twin.
In essence, DT links these two elements: the tangible system existing in the real world and its software-based counterpart, which can simulate and predict the system’s performance and behavior.
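To make the PO/LO pairing concrete, here is a minimal sketch in Python. This is illustrative only, not TweenMe's implementation: the class names, the one-way synchronization, and the toy cooling model are all assumptions chosen to show the structure, with the PO standing in for a real sensor-equipped machine.

```python
from dataclasses import dataclass, field

@dataclass
class PhysicalObject:
    """Stands in for the real-world system; here it simply reports a sensor reading."""
    temperature_c: float = 20.0

    def read_sensors(self) -> dict:
        return {"temperature_c": self.temperature_c}

@dataclass
class LogicalObject:
    """Virtual model mirroring the PO within chosen parameters."""
    state: dict = field(default_factory=dict)

    def synchronize(self, sensor_data: dict) -> None:
        # One-way data flow PO -> LO: the "Digital Shadow" level of entanglement.
        self.state.update(sensor_data)

    def predict_next(self, ambient_c: float = 20.0, k: float = 0.1) -> float:
        # Toy simulation: one step of Newton's-law-style cooling toward ambient.
        t = self.state.get("temperature_c", ambient_c)
        return t + k * (ambient_c - t)

po = PhysicalObject(temperature_c=80.0)
lo = LogicalObject()
lo.synchronize(po.read_sensors())
print(lo.predict_next())  # forecasts the machine cooling toward ambient
```

Closing the loop in the other direction (LO sending control actions back to the PO) is what would promote this sketch from a Digital Shadow to a full Digital Twin.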
Properties of a Digital Twin
As described in the "Handbook of Digital Twins" (edited by Zhihan Lyu), Digital Twins should exhibit the following properties:
Maturity models of Digital Twins
Based on the “entanglement” property, Digital Twins can be classified into various maturity stages:
Some models introduce additional advanced stages, focusing on other aspects:
While the first three stages primarily relate to entanglement and data synchronization, the Cognitive and Collaborative stages focus more on AI-driven insights and user experience, enhancing human collaboration and decision support.
Toward a standardized definition of Digital Twins
"A Digital Twin is a virtual representation of a "system"—whether it be a physical object, complex machine, organism, process, or organization—that mirrors its real-world characteristics and behaviors to a defined extent, with specific levels of reliability and complexity. By integrating real-time or asynchronous data and simulations, it allows for monitoring, analysis, predictions, and, in some cases, interactive feedback loops that enable users to influence and optimize the system’s performance. This interactive capability enhances decision-making and control across the system's lifecycle, creating a more dynamic user experience."
How do we implement this in TweenMe (the first universal digital twin generator)?
TweenMe Positioning in the Digital Twin Maturity Model
TweenMe is positioned as a universal generator of both Digital Models and Digital Shadows:
This positioning highlights TweenMe's capabilities in creating flexible, predictive, and collaborative Digital Twins adaptable to various maturity levels in digital twin development.
What makes TweenMe a universal DT generator?
A Digital Twin is a model or abstraction that must be thoughtfully designed to represent the Physical Object (PO) with a level of detail appropriate to the specific business problem, leveraging available production data. Defining the right level of abstraction and data model is critical to ensuring the Digital Twin’s business value. While TweenMe reduces the expertise required and significantly lowers the marginal cost of creating new Digital Twins, aligning the model with business goals remains essential for success.
Example: Modeling the Clinical Benefit of Paracetamol
From a clinical standpoint, I could develop a straightforward model to predict the effects of acetaminophen (paracetamol) as a pain reliever and fever reducer. The clinical question here would be: Given a certain dose of acetaminophen, what level of pain relief and fever reduction can I expect?
A basic dataset might include:
This simple model could be expanded by incorporating additional variables for a more nuanced understanding of the clinical effects.
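The dose-to-effect question above can be sketched with a classic Emax dose-response curve. This is a toy model only: the Emax, ED50, and the 0-100 relief scale are placeholder assumptions for illustration, not clinical pharmacology data.

```python
# Toy Emax dose-response sketch for the clinical question:
# "given a dose of acetaminophen, what level of pain relief can I expect?"
# All parameter values are illustrative placeholders, not clinical data.

def emax_response(dose_mg: float, emax: float = 100.0, ed50_mg: float = 500.0) -> float:
    """Emax model: effect rises hyperbolically with dose and saturates at emax.

    emax    -- maximum achievable effect (arbitrary 0-100 relief scale)
    ed50_mg -- dose producing half the maximum effect
    """
    if dose_mg < 0:
        raise ValueError("dose must be non-negative")
    return emax * dose_mg / (ed50_mg + dose_mg)

# Expected relief scores for a few typical adult doses:
for dose in (250, 500, 1000):
    print(dose, "mg ->", round(emax_response(dose), 1))
```

Adding covariates such as body weight, age, or liver function would be the natural first expansion of this model, exactly the kind of variable the paragraph above alludes to.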
However, as an analytical thinker, I might also explore what happens once acetaminophen enters the bloodstream, examining its molecular actions and enzyme interactions, such as:
Further, I could delve into modeling the molecular binding between acetaminophen (or its metabolites) and specific receptors, possibly through quantum chemistry simulations to estimate binding forces.
While such detailed modeling—like visualizing the 3D docking of acetaminophen's metabolite AM404 onto CB1 receptors—might be exciting and yield insights at the molecular level, it would not directly address the initial clinical question.
This underscores the importance of defining the right question. Effective Digital Twin development means balancing data availability, expected outcomes, and development costs against the relevance and reliability of the model, so that the resulting data model answers the business question accurately, on a minimal budget, and in line with the practical needs of the problem at hand.
Once the data model is designed, it can be "executed" in the TweenMe automated data pipeline to produce a Digital Model / Digital Shadow or a Digital Twin.
Example: Developing a Multimodal Model for Senology
We began with an open dataset containing features extracted by a computer vision algorithm from 569 mammograms, each labeled as either benign or malignant.
Our first step was to refine the dataset by identifying clusters of tumors with differing growth speeds, categorizing them as either slowly growing (benign or malignant) or rapidly growing. We then created additional clusters based on tumor surface characteristics, classifying tumors as either "smooth" or "rugged" based on image-derived features.
Next, we built covariance matrices to integrate synthetic genomic and proteomic data, focusing on tumor attributes like "growth speed," "smooth vs. rugged texture," and "size." For this, we applied a range of analytical techniques, including Covariance Matrices, Principal Component Analysis (PCA), Multivariate Linear Regression, Partial Least Squares (PLS) Regression, and Machine Learning models such as Support Vector Machines (SVM).
By examining these covariance patterns, we defined two new output variables, "tumor aggressiveness" and "treatment response", which we linked to proteomic markers such as ER, PR, and HER2. These derived variables provide insights into tumor behavior and potential therapeutic outcomes, enhancing the model’s clinical utility, since its output was no longer limited to a binary "benign" vs. "malignant" classification.
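The covariance-and-PCA step described above can be sketched as follows. The data here is synthetic and random, not the 569-mammogram dataset; the feature names ("growth_speed", "texture", "size") and the use of the first principal component as an "aggressiveness" score are illustrative assumptions that mirror the workflow, not the actual study.

```python
import numpy as np

# Synthetic stand-ins for the tumor attributes discussed above.
rng = np.random.default_rng(0)
n = 200
growth_speed = rng.normal(1.0, 0.3, n)
texture = 0.8 * growth_speed + rng.normal(0.0, 0.1, n)  # correlated with growth
size = rng.normal(2.0, 0.5, n)                          # largely independent

X = np.column_stack([growth_speed, texture, size])
Xc = X - X.mean(axis=0)  # center features before covariance/PCA

# Covariance matrix of the three tumor attributes.
cov = np.cov(Xc, rowvar=False)

# PCA via eigendecomposition of the covariance matrix
# (eigh returns ascending eigenvalues, so reorder descending).
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
explained = eigvals[order] / eigvals.sum()

# A derived score such as "tumor aggressiveness" could then be the
# projection of each tumor onto the first principal component.
aggressiveness = Xc @ eigvecs[:, order[0]]
print("explained variance ratios:", explained.round(3))
```

In the real pipeline, such a derived axis would then be validated against outcome labels and proteomic markers (ER, PR, HER2) before being trusted as a clinical variable.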
Beyond serving as a "Digital Twin" generator, TweenMe also creates/stores data models (ontologies) and insightful databases, which can ultimately be used to train cognitive digital models for integration into digital shadows or digital twins.
Takeaways