The Modeling of the Twins

Series: Digital Twins - Part 3

Digital twins require a rethink: Physical interfaces and communication protocols are losing importance. Model-based data interfaces for networking different twin categories within an IoT application are the current challenge.

For cloud operators, digital twins are a key building block in the functional portfolio, offering high growth potential and strong customer retention. The leading US companies in this area, Amazon (AWS IoT TwinMaker), IBM (Digital Twin Exchange), Microsoft (Azure Digital Twins) and Oracle (IoT Digital Twin), therefore see numerous potential uses for the virtual representation of physical objects and systems, primarily because of the wide range of possible applications and the associated market potential. After all, digital models of entire environments can be realized with this methodology. Such an environment can consist of individual drive elements, vehicles and machines, buildings or production processes, but also complete factories, company premises, complex energy supply networks, railroad lines, sports stadiums and even entire cities (smart city). Even much larger interconnected systems are possible. It follows that digital twins also differ depending on the respective application.

Groups of twin instances

IBM took up the twin topic very early on and has since gained extensive practical experience. Based on the current state of the technology, the company now distinguishes four twin categories:

Component twins/part twins: They form the basic unit (digital twin for an entity) of complex digital twins and usually represent a functional component, such as the electric motor or frequency converter of a machine.

Asset twins: A composite system of at least two component twins that interact with each other, such as the functional components of a machine drive train. Here, relatively large amounts of data are already generated, which can be used to investigate the interactions and obtain usable information.

System or unit twins: At this integration stage, asset twins are combined to form a functioning overall system, which can be represented as a star-shaped graph, for example. Such a digital twin of a machine transparently reflects the communication and interactions of the assets with one another and is fundamentally suitable for automated optimization decisions or cognitive capabilities.

Process twins: This category forms the macro level of detail for the cooperation of subsystems, for example for a complete production plant - in other words, a digital twin for a scenario. At this level, a digital twin can be used to answer complex questions: Are all subsystems synchronized? Are all systems running at maximum efficiency, or do delays in one system affect the others? Such scenario twins help to influence overall effectiveness and productivity. How the four levels build on one another is sketched below.
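The following purely illustrative JSON sketch shows how the four twin categories could nest into one another. All type names, field names and IDs here are hypothetical and not part of any standard or vendor product:

{
  "_comment": "Hypothetical structure for illustration only",
  "type": "ProcessTwin",
  "id": "production-plant-01",
  "systems": [
    {
      "type": "SystemTwin",
      "id": "machine-07",
      "assets": [
        {
          "type": "AssetTwin",
          "id": "drive-train-01",
          "components": [
            { "type": "ComponentTwin", "id": "electric-motor-01" },
            { "type": "ComponentTwin", "id": "frequency-converter-01" }
          ]
        }
      ]
    }
  ]
}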

In practice, a single physical object can be used for multiple twin variations. For example, the U.S. car company Ford uses a total of seven different digital twins for each vehicle it produces. The range of applications extends from vehicle design through production and operation to the customer experience.

A working group has formed around the company German Edge Cloud that aims to advance digitization in industrial manufacturing with a network of three twins: A plant twin - that is, a system twin - serves as a repository for all technical and electrical data, the circuit and wiring diagrams, functional descriptions for individual components and for the entire plant itself, as well as 3D models of the highest possible quality.

The second twin variant is a product twin. It enables product configuration and is used from the customer request through to delivery. It is created long before actual product manufacturing begins and also contains 3D models. Sales partners and customers can use this twin instance to simulate and test product functions; Form, Fit and Function (FFF) can be verified via these 3D models.

The third representative in the group is the manufacturing twin. This process twin supports the product manufacturer in optimizing productivity by linking and evaluating sensor data from ongoing manufacturing operations with master data.

Appropriate tools required

The views and activities of the cloud providers suggest that an application with digital twins can become a highly complex entity. In practice, it is not just a matter of coupling two or three component twins into an overall data system and using it for information gathering, decision-making and the operational actions derived from them. Rather, applications with several thousand or even millions of individual digital twins are realistic, as in the smart city case. This creates a need for universal, high-quality development tools, architecture concepts and data structures that do not create any (cloud) vendor lock-in.

Microsoft recognized this problem some time ago and developed the Digital Twin Definition Language (DTDL), a JSON-based description language that enables IoT data plug-and-play. DTDL is suitable for describing digital twin models of almost any complexity, i.e. from the simple component twin of a drive element to the highly complex scenario twin for a larger company site with numerous buildings.

Figure 1: The data interfaces of IoT hardware for a component twin can be modeled using a description language such as the Digital Twin Definition Language (DTDL): (1) A JSON object serves as the DTDL interface description. (2) The telemetry definition can be extended with a semantic type annotation via a "unit" property. (3) A data object matching the model is included in the outgoing IoT sensor data stream. The goal of using DTDL is IoT data plug-and-play to enable complex projects.
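To illustrate the three steps from Figure 1, a minimal DTDL interface for such a component twin could look like the following sketch. The interface ID and all names and values are hypothetical; only the structure follows the publicly documented DTDL v2 syntax:

{
  "@context": "dtmi:dtdl:context;2",
  "@id": "dtmi:com:example:DriveTemperatureSensor;1",
  "@type": "Interface",
  "displayName": "Drive temperature sensor",
  "contents": [
    {
      "@type": ["Telemetry", "Temperature"],
      "name": "motorTemperature",
      "schema": "double",
      "unit": "degreeCelsius",
      "description": "Illustrative telemetry definition with semantic type and unit annotation"
    }
  ]
}

A data object matching this model in the IoT sensor data stream would then simply be { "motorTemperature": 72.4 }.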


DTDL is based on JSON-LD, a recommendation of the World Wide Web Consortium (W3C) for the global linking of data, and can thus be regarded as a generally recognized industry standard. Basic DTDL building blocks are the metamodel classes, which are used to define the behavior of a digital twin and its associated physical object. The most important metamodel classes for describing behavior are Interface, Command, Component, Property, Relationship and Telemetry. Since DTDL is a JSON-based data description language, in the future a software development kit - something like a "Digital Twin Integration SDK" - could conceivably be supplied as an accessory for every physical IoT data endpoint in order to integrate it as a model into an IoT application via DTDL. DTDL also provides semantic type annotations, i.e. the meaning-based labeling of sensor data. This allows user interfaces, data analytics and other data-usage tasks in an IoT application to infer the semantics, not just the schema, of a digital twin's data. For example, the output data of a temperature sensor can be annotated as "temperature" by the DTDL model.
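A brief sketch of how these metamodel classes can appear together in one interface - again with purely hypothetical IDs and names, structured according to the public DTDL v2 syntax:

{
  "@context": "dtmi:dtdl:context;2",
  "@id": "dtmi:com:example:MachineDriveTrain;1",
  "@type": "Interface",
  "description": "Hypothetical example interface for illustration only",
  "contents": [
    { "@type": "Property", "name": "serialNumber", "schema": "string", "writable": false },
    { "@type": "Command", "name": "restartDrive" },
    { "@type": "Component", "name": "frequencyConverter", "schema": "dtmi:com:example:FrequencyConverter;1" },
    { "@type": "Relationship", "name": "feeds", "target": "dtmi:com:example:ConveyorBelt;1" }
  ]
}

A component embeds another interface directly into the twin model, while a relationship only links to a separate twin instance; this distinction is what allows larger twin graphs to be assembled from individual models.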


Figure 2: When developing IoT solutions for automation, data quality, data usage and system interconnectivity must play a key role. A particular challenge is communication-protocol-independent data plug-and-play across different application layers. Here, digital twins based on data models with advanced semantic capabilities provide a solid functional basis for avoiding data-related media breaks.


Changed view and approach

An application developer can decide, according to his or her individual needs, what the digital twin for a solution looks like, what capabilities such a virtual copy has and how strongly each of them is developed (see autonomy, intelligence, learning capability and data accuracy in part 2 of this article series). In the simplest case, only a small amount of live data from various sensors is temporarily stored. It is also possible to add persisted design data from various CAD programs, technical manuals as PDFs, 3D models and numerous other data objects. All of this is then linked together via a website and made accessible to human users via a QR code directly on the machine. In any case, such a solution approach means that comprehensive device, machine or plant documentation exists, which is definitely helpful for ongoing operations. Whether something like this is sufficient in the global competition of digital capabilities will be decided by the market. However, such a solution usually cannot supply a data flow for automated decision-making with AI algorithms in a system that is meant to act as autonomously as possible. Even if the real-time data from the sensors is accessible to an external MES via an application programming interface (API), this changes little: the MES would then have to be individually adapted to the real-time data image, and this adaptation would have to be manually maintained throughout the entire life cycle.

The really important aspects of an industrial IoT automation solution are likely to lie in the areas of data quality, data use and the capabilities of the system network built on them. This requires data models with advanced semantic capabilities based on open standards. Future applications should be designed as an open federation of individual digital twins, combined with a high level of cybersecurity.


Click here to read the other parts of the Digital Twin series:

Part 1: Blueprint for IoT Applications

Part 2: The Digital Twin As An Add-On

Part 4: The Digital Twin in Practical Use


Author:

Klaus-Dieter Walter is a member of the management board at SSV Software Systems GmbH.


He is also known for his numerous presentations at international events, seminars and workshops, and for his articles in technical journals. He has authored and co-authored several technical books and book chapters on embedded Linux, ARM-based microcontrollers and the Internet of Things.
