SSOT – the holy grail?


People keep asking me why SSOT (single source of truth) is such a "fetish" for me.

The answer is simple:

  • SSOT is the only way to prevent large, dynamic data sets in various data silos from developing into an inconsistent, invalid, increasingly unusable pile of data.
  • I would even go so far as to say that the 4th Industrial Revolution has become necessary, among other things, because the SSOT principle has not been used enough in the past.

But why is that?

The problem creeps in through the so-called "flat copying" of data. Every time valid data is copied (or exported), a decision actually has to be made as to which of the two data sets may be changed in the future, and thus reflects the valid current state from now on; the other data record must be frozen and becomes the history.
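The failure mode can be shown in a few lines. This is a minimal sketch in plain Python, with dicts standing in for records held by two different apps; the field names are invented for illustration:

```python
# A flat copy silently forks the truth: after the export, each side can be
# edited independently, and nothing records which copy is the valid one.

import copy

# The original record in app A.
asset_a = {"id": "T-042", "status": "in service", "voltage_kv": 110}

# A flat copy exported to app B. No metadata marks either as authoritative.
asset_b = copy.copy(asset_a)

# Later, each app updates "its" record independently.
asset_a["status"] = "in maintenance"   # change made in app A
asset_b["voltage_kv"] = 123            # change made in app B

# The two records now disagree, and neither side knows it.
print(asset_a == asset_b)  # -> False: the data set has forked
```

The inconsistency appears only at this later update, not at the moment of copying, which is exactly why the rot goes unnoticed at first.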

The good news is that the SSOT principle is now implemented in almost all systems (ERP, EMS, GIS, SCADA, asset management systems, ...) that are based on relational, object, or graph databases. In these systems, flat copying is replaced by deep copying (smart copying), which preserves the consistency of the data.
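The contrast with the SSOT approach is just as short. In this sketch both apps hold only a key into one shared store, so there is exactly one mutable record and every reader sees the current state; the plain dict stands in for a database with transactions:

```python
# Single source of truth: one shared store, apps keep references (keys),
# not copies. Every lookup resolves against the same record.

store = {"T-042": {"status": "in service", "voltage_kv": 110}}

def read(asset_id):
    """Both apps resolve the same key against the single store."""
    return store[asset_id]

# App A updates the record in the single store.
store["T-042"]["status"] = "in maintenance"

# App B immediately sees the current, consistent state.
print(read("T-042")["status"])  # -> "in maintenance"
```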

However, people today work with up to 30 different apps, each with its own data model, and copy data (flat) from one app to another. Although this solves the task at hand in the short term, in the long run it leads to the "rotting" of the data in all applications. Simply put, soon no one will know where the source of truth currently is.

The insidious thing about it is that the rotting of the data proceeds very gradually, since the inconsistency only arises with the next change on one side or the other. By analysing data silos at TSO companies, we have found that the data situation is already precarious today.

But how can the problem be solved?

1. There is one leading system or data source that contains all the data and ensures its consistency.

2. The various data sources are linked to each other via interfaces (at the business-object level), so that each app is responsible for a part of the data and provides all the others with up-to-date, correct data.
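Solution 2 can be sketched as a federated ownership pattern: each app is the single source of truth for the business objects it owns and exposes a read interface, so other apps fetch current data instead of keeping flat copies. Class and method names here are illustrative, not from any real product:

```python
# Federated SSOT: ownership is partitioned per app; reads cross app
# boundaries through an interface, writes stay with the owner.

class App:
    def __init__(self, name, owned_objects):
        self.name = name
        self._objects = dict(owned_objects)  # objects this app is the SSOT for

    def get(self, object_id):
        """Interface at business-object level: serve the current state."""
        return self._objects[object_id]

    def update(self, object_id, **changes):
        """Only the owning app may change its objects."""
        self._objects[object_id].update(changes)

# The GIS owns locations; the asset management system owns asset status.
gis = App("GIS", {"T-042": {"lat": 48.1, "lon": 11.6}})
ams = App("AMS", {"T-042": {"status": "in service"}})

# The AMS needs a location: it asks the owner instead of copying the value.
location = gis.get("T-042")

# When the GIS corrects the location, every later lookup is current.
gis.update("T-042", lat=48.2)
print(gis.get("T-042")["lat"])  # -> 48.2
```

The design choice is that no app ever stores another app's data, only the key needed to request it, so there is nothing that can go stale.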

Solution 1 has been tried again and again since the 1980s, with great effort, but works poorly in practice for the following reasons:

  • The demands of modern specialist applications on their data are very high, and their data structures are very complex. It is impossible to manage this subject-specific data in third-party systems that do not "understand" it. As a result, hidden data silos continue to form alongside the leading system.
  • In the very dynamic software world, the data model definitions of the leading system are sometimes years behind current developments. This too leads to the formation of further "temporary" silos alongside the main silo.

The I4.0 Solution

Solution 2 looks more complex at first and can only be implemented with the help of experts from the respective solution providers, but in my opinion it is the only option that works in the long term. Bilateral, bidirectional, standardized interfaces have been developed between different software vendors for years. Based on its experience with these interfaces, the VDE ETG Digital Twin Working Group has now set out to develop proposals for a general approach built on the I4.0 "Asset Administration Shell (AAS)". Another task the working group has set itself is to develop solutions for validating already rotten data silos; there are successful examples for asset data, but proposals for further silos still need to be developed.




#substation #digitaltwin #ai #i40 #primtech #bim #ssot #ass #cigre #ieee #hvac #hvdc #radicalDigital #generationZ #GenerativAI #GenertivDesign #ifc #bcf #scan2bim #lcm #digtaltransformation

