What DIGITAL TWIN does . . .

The following is an operational definition of Digital Twin (DT) – not what it IS but what it DOES.

Some of the steps are mature enough to be deployed in production; others, not so much. I also point out the next developments in the Internet of Things (IoT) that will take Digital Twins into widespread use.

1. Sensors:

Data is the lifeblood of DT. Engineering sensors such as accelerometers, temperature sensors, and acoustic and vision sensors are the source of the data pipeline. Others, such as IT logs and call-center recordings, can be useful but are not primary for mainstream DT use cases.

What is needed:

· Built-in wireless connectivity.

· Self-powered sensors (no battery replacement).

· Sensors built into products rather than after-market attachments.
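
To make the data-pipeline idea concrete, here is a minimal sketch in Python; all names (such as SensorReading and "pump-07-accel") are illustrative assumptions, not any particular product's API. It shows the kind of telemetry record a built-in, wirelessly connected sensor might emit.

```python
# Minimal sketch of a telemetry record a built-in wireless sensor might emit.
# All field names and values here are illustrative assumptions.
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class SensorReading:
    device_id: str    # identifier of the embedded sensor
    sensor_type: str  # e.g. "accelerometer", "temperature", "acoustic"
    value: float      # measured quantity
    unit: str         # engineering unit of the measurement
    timestamp: float  # epoch seconds at the time of measurement

def to_wire_format(reading: SensorReading) -> bytes:
    """Serialize a reading for transmission over the wireless link."""
    return json.dumps(asdict(reading)).encode("utf-8")

payload = to_wire_format(
    SensorReading("pump-07-accel", "accelerometer", 0.83, "g", time.time())
)
print(payload)
```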

2. Connectivity:

Wireless connectivity options for sensors have proliferated in the recent past. With the advent of 5G and telcos' imperative to connect all things to improve their own business performance, the situation is primed. There are also multitudes of LAN and PAN solutions to choose from.

What is needed:

· The weak point is security; one can never have enough security, because bad actors will find a way! However, a natively encrypted end-to-end deployment may mitigate some of the dangers. There are other threat surfaces, but this may be the most vulnerable.

· For condition monitoring, the current level of network reliability may be sufficient. But when we move to "closed-loop" use cases (see Step 5 below), near-perfect ("wire-level") latency and uptime performance will be required.

· Edge processing is always a smart way of orchestrating data processing. Shorter latency, reduced back-haul load and isolation for local-specific needs such as privacy will drive it; a minimal sketch of edge-side aggregation follows this list.
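
As a hedged illustration of the edge-processing point (the window size and the forward_to_cloud stub are assumptions chosen only for illustration), an edge node might aggregate raw samples locally and forward only summaries upstream:

```python
# Sketch of edge-side pre-processing: aggregate raw samples locally and forward
# only summary statistics upstream, reducing back-haul load and latency.
# WINDOW and forward_to_cloud() are illustrative placeholders.
from statistics import mean
from typing import Iterable

WINDOW = 100  # samples per aggregation window (assumed)

def forward_to_cloud(summary: dict) -> None:
    """Placeholder for the actual uplink (e.g. a publish over the chosen protocol)."""
    print("uplink:", summary)

def edge_aggregate(device_id: str, samples: Iterable[float]) -> None:
    """Summarize windows of raw samples at the edge before back-haul."""
    buffer = []
    for value in samples:
        buffer.append(value)
        if len(buffer) >= WINDOW:
            forward_to_cloud({
                "device_id": device_id,
                "count": len(buffer),
                "mean": mean(buffer),
                "min": min(buffer),
                "max": max(buffer),
            })
            buffer.clear()

edge_aggregate("pump-07-accel", (0.8 + 0.001 * i for i in range(250)))
```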

3. Data Platform:

With hyperscalers (Azure, AWS, GCP) offering excellent IoT capabilities to gather, store and manage data, these enablers of Digital Twins have matured nicely. There are also special-purpose platforms tuned for specific industries and data types.

What is needed:

· However much one "democratizes" IoT, I do not believe it is a pure "plug-and-play" application. Domain knowledge and nuanced deployment considerations are what separate a PoP or PoC from deployments that generate steady business value. Hyperscalers need to invest more in local (by industry or geography) partners who can fan out and work hand in hand on industry deployments.

· There may be more gains to be had from improved transport protocols and from stream processing of time-series data; a small sketch of the latter follows this list.
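
As a small, hedged sketch of streaming time-series processing (the window length and threshold are assumptions chosen only for illustration, not recommendations), a platform-side monitor could flag samples that deviate strongly from a rolling window:

```python
# Sketch of stream processing for time-series data: a rolling window with a
# simple deviation check. Window length and threshold are illustrative only.
from collections import deque
from statistics import mean, pstdev

class StreamMonitor:
    def __init__(self, window: int = 256, threshold: float = 3.0):
        self.samples = deque(maxlen=window)
        self.threshold = threshold

    def update(self, value: float) -> bool:
        """Return True if the new sample deviates strongly from the recent window."""
        flagged = False
        if len(self.samples) >= 30:  # require some history before flagging
            mu, sigma = mean(self.samples), pstdev(self.samples)
            flagged = sigma > 0 and abs(value - mu) > self.threshold * sigma
        self.samples.append(value)
        return flagged

monitor = StreamMonitor()
data = [float(i % 5) for i in range(100)] + [50.0]  # steady pattern, then a spike
for i, x in enumerate(data):
    if monitor.update(x):
        print(f"sample {i} flagged: {x}")
```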

4. Information Extraction:

Making sense of the large volume of data collected from IoT deployments is hard. Making data "visible" is only a first step. What is necessary is to extract information that is not apparent, in a way that is relevant to the business. This process is called System Analysis; the results of this step embody the Digital Twin: DT pulls together all the work of the previous steps in a way that is actionable. Some refer to this step alone as the "digital twin", but I believe that all the steps taken together make a Digital Twin.

Visualization is an important part of it; a pleasing display can convey "a thousand words". 3D rendering, CGI and the metaverse will all help. But the pictures are worth only as much as the underlying information they display!

What is needed:

· Current DTs are adequate for "condition monitoring" in a generic sense. For operational and productivity actions, we need to go much deeper.

· All major IoT installations collect multiple channels of data simultaneously, and for a reason: the end points being sensed are related in some fashion! Current processing techniques handle a single channel at a time, so we ignore the key relationships among the end points (such as their cause-effect relationships) and, worse, this approach renders the single-channel results erroneous. A generic illustration appears after this list.

· You can learn about the perils of the single-channel approach, and how to properly extract actionable information from multi-channel data, in this article: “Multichannel IoT Causal (MIC) digital twin: Counterfactual experiments on Fence Graphs”. https://pgmad.medium.com/multichannel-iot-causal-mic-digital-twin-counterfactual-experiments-on-fence-graphs-df884a9f5f35
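
To illustrate the multichannel point in general terms only (this is a generic lagged-correlation sketch, not the MIC / fence-graph method described in the article above), two sensed end points can be tightly coupled at a time lag that single-channel processing never reveals:

```python
# Generic illustration of why channels should be analysed jointly: channel y is
# strongly coupled to channel x at a time lag, which single-channel processing
# cannot reveal. This is NOT the MIC / fence-graph method from the linked article.
import numpy as np

def lagged_coupling(x: np.ndarray, y: np.ndarray, max_lag: int = 50):
    """Return (lag, correlation) at which y best matches x delayed by `lag` samples."""
    best_lag, best_corr = 0, 0.0
    for lag in range(max_lag + 1):
        c = np.corrcoef(x[:len(x) - lag], y[lag:])[0, 1]
        if abs(c) > abs(best_corr):
            best_lag, best_corr = lag, c
    return best_lag, best_corr

# Synthetic example: channel y follows channel x with a 10-sample delay plus noise.
rng = np.random.default_rng(0)
x = rng.standard_normal(1000)
y = np.roll(x, 10) + 0.1 * rng.standard_normal(1000)
print(lagged_coupling(x, y))  # expect a lag near 10 with correlation near 1
```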

5. Closing the Loop:

In some sense, this is the step that adds the most value in a Digital Twin; as of mid-2021, "closing the loop" is still immature. For stand-alone, real-time operation of a DT, the information extracted in the previous step has to be fed back to control some operational aspect(s) on the ground: a closed-loop IoT system. This is when the full business value of increased production and quality, less waste and better utilization of human resources will be realized.
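
A hedged sketch of what closing the loop could look like in code (the set-point, gain and both stub functions are illustrative assumptions, not a prescription): the estimate extracted by the Digital Twin is fed back to adjust an actuator.

```python
# Sketch of a closed loop: the estimate extracted by the Digital Twin is fed back
# to adjust an actuator set-point. Gain, limits and both stubs are illustrative.
SETPOINT = 70.0  # desired operating value, e.g. bearing temperature in deg C (assumed)
GAIN = 2.0       # proportional gain (assumed)

def read_dt_estimate() -> float:
    """Placeholder: latest condition estimate published by the Digital Twin."""
    return 78.5

def send_to_actuator(command: float) -> None:
    """Placeholder: write the command to the machine's control interface."""
    print(f"new cooling set-point: {command:.1f}%")

def control_step() -> None:
    """One pass of a simple proportional controller driven by the DT estimate."""
    error = read_dt_estimate() - SETPOINT
    command = max(0.0, min(100.0, GAIN * error))  # clamp to a valid actuator range
    send_to_actuator(command)

control_step()
```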

[Image: the five Digital Twin blocks (Sensors, Connectivity, Data Platform, Information Extraction, Closing the Loop)]

I have included all five blocks in the definition of a Digital Twin – why?

Let us compare the Digital Twin to the human nervous system and its three sub-systems: (1) CNS – Central Nervous System, (2) PNS – Peripheral Nervous System and (3) its output system, the Musculoskeletal System (MSS). The following table captures the analogy.

[Table: the analogy between the nervous-system sub-systems (PNS, CNS, MSS) and the Digital Twin blocks]

I consider the Digital Twin a combination of the three sub-systems; some consider DT as just the middle row (DT-CNS). Without getting mired in a style debate, I will point out that bringing in data and controlling actions (in a future setting) are essential parts of a Digital Twin; the fact that those functions may be shared with other applications should not blind us to the need to architect those pieces carefully. So, in a Digital Twin, all FIVE blocks in the diagram are essential parts.

I look forward to the day DT-CNS and DT-MSS reach full maturity; I am very optimistic about the path ahead!

Dr. PG Madhavan

https://www.dhirubhai.net/in/pgmad/

#Digitaltwin #IoT #Multichannel #Causality
