OT Standards 20 Years Ahead of IT?
ICT and IT are not the same

Imagine getting data from your DCS, SIS, MCC, machinery protection system, and other subsystems into analytics software without custom drivers. Imagine adding a new app or replacing any software application without getting new drivers for each subsystem. Imagine not struggling to obtain new versions of those drivers every time there is a new version of Windows. Well, you can, thanks to IEC 62541 (OPC-UA). If you think "IT standards" are the answer, think again. The most interesting fact is that there is no equivalent to OPC-UA in office/business systems, as I discussed two years ago. Since 14 October is World Standards Day, and since automation standards set a shining example for digital interoperability and interchangeability, I figured I would again take this opportunity to celebrate those who develop these standards. Since many plants are now investing in a Digital Operational Infrastructure (DOI) for Industrie 4.0, digital transformation, and IIoT, I will share some best practices for standards-based digital automation system software and data architecture design: faster and lower-cost system deployment, low risk of obsolescence, greater flexibility to add and change over time, lower ongoing support cost, and avoidance of the risks of proprietary protocols, APIs, and web services. Here are my personal thoughts:

IT Not the Same as ICT

There still is a perception that IT is more open, more flexible, and less prone to obsolescence than OT. Yet we know that in reality a software module (e.g. accounts) from one brand of ERP system does not talk to a software module (e.g. supply chain management) from another brand of ERP system without extensive custom programming, so clearly there are no standard protocols, APIs, or web services for ERP systems. ERP systems are not open. This is what makes integration or migration of ERP systems from newly acquired companies into the mother company's ERP system so time-consuming and costly. We know there are no purpose-built, user-friendly expense claim apps that can simply plug into any ERP system, because the APIs and web services are proprietary and therefore not flexible. It is too expensive to make the ERP deliver what we want, so we still use spreadsheets anyway. And we know MS Office files don't render well in Apple iWork and vice versa because there is no standard file format. Webex, MS Teams, Zoom, and GoToMeeting don't talk to each other because they don't use open standards.

So why this mismatch between perception and reality? Why do so many perceive IT as open when it isn't? How do we reconcile the belief that IT systems are open with the fact that they clearly are not? The answer came to me when I was watching the Cisco single pair Ethernet "Master Series | Back to the Future" video. I realized there is confusion between IT applications and the underlying Information and Communications Technology (ICT) standards. The video highlights that IT is the subset of ICT used in business/office applications. OT is the subset of ICT used in plant automation. Personal computing and entertainment is another domain. There are many underlying ICT standards, such as Ethernet, IP, TCP and UDP, HTTP, and HTML, stacked one on top of the other and shared by all these application domains. However, the IT applications on top of the ICT standards, such as the business systems and office software suites, are proprietary. With an IT-like approach of custom-programmed APIs and web services, your level 3 (L3) systems could end up far more proprietary than your level 0 to 2 (L0-L2) systems, just like your level 4 (L4) ERP system. You don't want your DX platform to become another ERP system that costs millions and takes years to deploy, and then costs millions more to support each year.

Whenever we say “IT” we often mean “ICT” or “digital” in general

Best Practice Digital Standards Stack

Digitization will require you to integrate the various systems in your plant not yet connected up to L3, so you can access the data in them that you need for analytics and visualization. So how do you ensure your L3 systems do not end up as proprietary as your ERP? Automation engineers specify standards all the time for mechanical and electrical products, such as pipe flanges and electrical switchgear. Standards are what make these things fit and work together, seamlessly. They can be international standards like IEC and ISO, or national standards like ANSI or DIN. Standards must be specified for digital products too, software and hardware components, or else the cost and effort of custom integration of components, and of keeping them working together as a system, will be very high. There are several bands of standards in the technology stack for the Digital Operational Infrastructure (DOI) required for digital transformation (DX):

  • Bus communication protocols
  • IP communication protocols
  • API / web services
  • File format

Standards simplify and decouple the architecture components, which enables each component to evolve independently. These plant automation technologies are all operational technologies, so IT folks cannot help that much, but I&C engineers are experienced with these technologies that ride on top of the common ICT technologies. The plant may of course also have many subsystems which are not standards-based: field instruments that use 4-20 mA but not HART, vibration monitoring systems that use IP and Ethernet but no standard application protocol, and software with "exposed" APIs that are not standards. In these cases you have to do your best; for instance, it may be possible to fit them with an OPC-UA server. I&C engineers have a knack for figuring out ways of integrating systems. The important point, though, is not to invest further in proprietary technologies where standards are available.

Standard Bus Protocol

Non-IP bus protocols used at level 1 (L1) for sensors at level 0 (L0) include both wired fieldbuses and Wireless Sensor Networks (WSN). Wireless sensors are a critical part of the Digital Operational Infrastructure (DOI) for digitalization towards Industry 4.0. There are many WSN technologies available, but many of them are proprietary, so be careful not to invest in the wrong technology. If the protocol does not have an IEC (or other standards body) document number associated with it, it is proprietary. Also take care with radio-only standards. For instance, IEEE 802.15.4 is a great radio standard for a WSN, but it alone is not sufficient because it is not an application protocol. A WSN that only cites IEEE 802.15.4 and no application protocol standard most likely has a proprietary application protocol and does not interoperate.

The WSN is the enabler for fully instrumenting process equipment with advanced sensors. When laying the foundation for your DOI, a WSN infrastructure based on IEC 62591 (WirelessHART) is probably the best way to go, as explained in this WirelessHART essay.

Standard IP/Ethernet Protocols

The IP network protocol (IETF RFC 791) standard over standard Ethernet (IEEE 802.3) has been in use at level 2 (L2) and above for more than 20 years. With the future Ethernet-APL this will extend down to L1 for field instruments at L0. But for IP/Ethernet, take care: Ethernet and IP alone are not sufficient, because you also need an application protocol. A network that only cites IP and IEEE 802.3 Ethernet and no application protocol standard most likely has a proprietary application protocol and does not interoperate. Lots of IP/Ethernet protocols are proprietary, so be careful not to invest in the wrong technology. If the application protocol does not have an IEC (or other standards body) document number associated with it, it is proprietary. Also take care with messaging-only protocol standards. For instance, the MQTT, AMQP, and CoAP messaging protocols alone are not sufficient because they are not full-fledged application protocols; every vendor has a different information model, parameters, and data types. Even when exchanging JSON messages, vendors use different formats, so it still requires custom programming to make things work together, as the sketch below illustrates.
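
As a small illustration of that point, here is a sketch of the kind of per-vendor normalization code that ends up being written when two devices publish the "same" measurement as JSON in different shapes. The payloads, field names, and units below are invented for illustration, not taken from any real product:

```python
import json

# Hypothetical payloads: two vendors publishing the same temperature reading,
# each with its own (non-standard) JSON structure. All field names, units, and
# timestamp formats here are invented for illustration.
vendor_a_msg = '{"tag": "TT-101", "val": 78.3, "unit": "degC", "ts": "2021-10-14T08:15:00Z"}'
vendor_b_msg = '{"device": {"id": "TT-101"}, "temperature_f": 172.9, "timestamp": 1634199300}'

def normalize_vendor_a(raw: str) -> dict:
    msg = json.loads(raw)
    return {"tag": msg["tag"], "value_degc": msg["val"], "time": msg["ts"]}

def normalize_vendor_b(raw: str) -> dict:
    msg = json.loads(raw)
    # Vendor B reports Fahrenheit and epoch seconds, so conversion is needed too.
    return {
        "tag": msg["device"]["id"],
        "value_degc": (msg["temperature_f"] - 32) * 5 / 9,
        "time": msg["timestamp"],  # still not the same time representation as vendor A
    }

print(normalize_vendor_a(vendor_a_msg))
print(normalize_vendor_b(vendor_b_msg))
```

Multiply this by every device type and every vendor in the plant and the custom programming adds up quickly.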

To provide full interoperability, and even a degree of interchangeability, you need full-fledged application protocols with a standard information model, data types, device description, and device profiles, such as HART-IP (IEC61784-1), PROFINET (IEC61784-2), FF-HSE (IEC61784-1), and to some degree OPC-UA (IEC62541). Most likely you will end up using all four of these protocols and more, since Ethernet supports multiple protocols to provide a single IP-everywhere and Ethernet-everything signal cable infrastructure. Your I&C engineers can advise you which protocols are used in your plant.

Standard API and Web Services

Software applications exchange information across an Application Programming Interface (API) or using Web Services (WS). When software documentation speaks of "open" or "exposed" APIs and Web Services you must be very careful. If there is no name and no IEC standard number, the API/WS is likely proprietary, owned by the publishing company, and comes with restrictions and limitations. A consultant or integrator may encourage custom software programmed against somebody's exposed APIs and web services, but this could lock you in. There is also a risk the API/WS owner/publisher makes breaking changes to the interface, so that integration of one or more of the applications in your system breaks when one of the other applications is upgraded. "Open" or "exposed" APIs and Web Services are common in L4 business systems, such as supply chain management between companies, but the consequences could be serious for production. In other industries like FinTech, which operates at L4, there is even a term "API economy", referring to revenue earned from exposed APIs. That approach carries risk and is therefore less common at L0-L3.

OPC-UA (IEC62541) is the standard API used in automation software. It supports real-time live streaming data (DA), alarms and events (A&E), as well as historical data (HDA) access. Therefore, use automation software based on OPC-UA. OPC-UA is a key technology in the NAMUR Open Architecture (NOA) interface as defined in the NE175 recommendation for the overall interface architecture between the Core Process Control (CPC) system and the Digital Operational Infrastructure (DOI). NOA is the best practice architecture for deploying the DOI.
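
As a minimal sketch of what standards-based access looks like from the software side, here is how a live value might be read over OPC-UA using the community python-opcua package. The endpoint URL and node id are placeholders for illustration, not references to any particular product:

```python
# Sketch only: reads one live value over OPC-UA using the community
# python-opcua package.
from opcua import Client

client = Client("opc.tcp://192.168.1.10:4840")  # hypothetical server endpoint
client.connect()
try:
    node = client.get_node("ns=2;s=PT-101.PV")  # hypothetical node id
    print("PT-101.PV =", node.get_value())      # typed value, standard information model
finally:
    client.disconnect()
```

The same client code works against any server that implements the standard, which is the whole point: no per-vendor driver to obtain, install, and maintain.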


The forerunner of OPC-UA is OPC Classic, including OPC-DA, OPC-A&E, and OPC-HDA, still in use today. OPC-DA was created in 1996. There is no equivalent for ERP applications. That is why I personally believe OT is more than 20 years ahead of IT in standardization and open systems.

With OPC-UA the historian can be decoupled from the DCS, which enables the DCS and historian to evolve independently. That is, the historian is not tightly linked to the DCS, and the two need not be the same brand. The historian can be a different brand from the DCS, meaning a single historian can connect to multiple underlying DCS and other systems of different brands, which is often the case in plants.

Standard File Format

A standard file format makes it possible to save data in a file from one system, transfer the file, and then open that file on another system. The file format depends on the type of data being saved. Well-known file formats include HTML5 for web page user interfaces, JPG (ISO/IEC 10918) for photographs, MP3 (ISO/IEC 11172-3) for audio, and MP4 (ISO/IEC 14496-14) for video. These were developed by industry expert groups: the World Wide Web Consortium (W3C), the Joint Photographic Experts Group (JPEG), and the Moving Picture Experts Group (MPEG) respectively.

There are few or no standards for file formats in the process industries. Applications natively store data in proprietary file formats not understood by other applications. This includes, for instance, vibration waveforms, spectra, and orbits, chromatograms, calibrator routes, and calibration as-found/as-left records. Most applications can export some data in Comma Separated Values (CSV), text (TXT), or Extensible Markup Language (XML) format. However, only certain types of data, such as time-series trends, logs, and table lists, lend themselves to export in these formats. Moreover, there is no standard for what information goes into which column or for how data such as dates, times, and numbers are represented and labelled, so manual work is always required before or after importing data into the next application, and some metadata may be lost. Nevertheless, CSV and TXT files can be imported into spreadsheet software like Excel for presentation, analysis, or reporting, so export to these file formats is a feature you want to look for in software. HTML5 web-based user interfaces are increasingly the technology of choice for presenting information to users. The sketch below shows the kind of per-application mapping this lack of a standard forces on you.
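
To make that concrete, here is a hedged sketch of the parsing differences you typically run into. Both "exports" are invented for illustration; they only show why column names, delimiters, date formats, and decimal separators have to be mapped per application:

```python
import csv
from datetime import datetime
from io import StringIO

# Two invented "exports" of the same trend sample: different column names,
# delimiters, date formats, and decimal separators. None of this is from a
# real product; it only shows why a per-application mapping step is needed.
export_a = "Timestamp,Tag,Value\n14/10/2021 08:15,TT-101,78.3\n"
export_b = "time;point;reading\n2021-10-14 08:15:00;TT-101;78,3\n"

def load_a(text: str):
    rows = csv.DictReader(StringIO(text))
    return [(datetime.strptime(r["Timestamp"], "%d/%m/%Y %H:%M"), r["Tag"], float(r["Value"]))
            for r in rows]

def load_b(text: str):
    rows = csv.DictReader(StringIO(text), delimiter=";")
    return [(datetime.strptime(r["time"], "%Y-%m-%d %H:%M:%S"), r["point"],
             float(r["reading"].replace(",", ".")))  # comma used as decimal separator
            for r in rows]

print(load_a(export_a))
print(load_b(export_b))
```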

More DX Standards Needed

That's all for my suggestions. If you have a few more minutes you may wish to read on for my personal opinion on how the industry could become even better with common file formats for porting data between applications, and some protocol extensions. Standard file formats, much like HTML5, JPG, MP3, and MP4, would have to be developed by industry expert groups for their respective areas. That is, organizations specializing in vibration, oil analysis, chromatography, instrumentation, and calibration would have to write these file format standards; the know-how is very specific. Standard file formats would allow you to port data from one vendor's product to another. Similarly, these organizations could develop protocols, or extensions to OPC-UA, optimized for these specialized forms of data. Specialized real-time streaming protocols for vibration waveforms, chromatograms, and others would probably be based on OPC-UA, with profiles for the various forms of data to stream live. Specialized file formats for vibration data, chromatograms, calibration data, and others would probably be based on XML, with profiles for the various forms of offline data.

Vibration: The API 670 Machinery Protection System (MPS) standard specifies the Modbus communication protocol for vibration, position, phase, piston rod drop, speed, and temperature measurements. These measurements are not sufficient for advanced vibration analytics. The Modbus protocol is supported in many systems, so it does enable some connectivity. However, Modbus has many limitations, such as having to manually map data to registers, the lack of a structured information model, and the fact that implementation details such as data types differ from one vendor to the next (see the sketch below). API 670 does not specify communication of the raw vibration waveform required for advanced vibration analytics. Machinery protection and prediction systems from leading automation vendors already support OPC-UA for vibration, position, phase, piston rod drop, speed, and temperature measurements, making those variables easy to integrate, since analytics from leading automation vendors also support OPC-UA. However, live streaming of the real-time vibration waveform is done using proprietary protocols. Therefore the vibration monitoring software must come from the same vendor as the vibration monitoring hardware. This is true for both MPS and prediction systems. As plants deploy additional vibration online monitors and vibration transmitters from multiple vendors, they realize these don't integrate into the same software as their existing MPS.
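
To illustrate the kind of manual register mapping Modbus forces on you, here is a sketch using the pymodbus package. The register addresses, data type, and byte order are assumptions from a hypothetical vendor manual, and the exact import path and slave/unit keyword vary between pymodbus versions:

```python
import struct

# Import path is for pymodbus 3.x; older 2.x releases use pymodbus.client.sync.
from pymodbus.client import ModbusTcpClient

# Hypothetical register map: assume the vendor manual says overall vibration is
# a 32-bit big-endian float in holding registers 0-1. Address, data type, byte
# order, and scaling all differ between vendors and must be mapped by hand.
client = ModbusTcpClient("192.168.1.20")  # placeholder address
client.connect()

rr = client.read_holding_registers(0, count=2)  # slave/unit id keyword varies by version
raw = struct.pack(">HH", rr.registers[0], rr.registers[1])
overall_mm_s = struct.unpack(">f", raw)[0]  # some devices word-swap the pair, then this breaks
print("Overall vibration:", overall_mm_s, "mm/s")

client.close()
```

With a structured information model such as OPC-UA, the tag name, engineering unit, and data type travel with the value instead of being hard-coded per vendor.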

What is missing is a standard protocol for live streaming of the real-time vibration waveform. This could be an extension of OPC-UA, which would need to be defined. It would enable you to dynamically see the raw waveform and visualize the processed spectrum, orbit, waterfall chart, cascade plot, etc. from all kinds of vibration sensing systems in the same software. The workaround for the moment is to deploy a prediction system in parallel with the MPS, connected to the buffered outputs of the MPS.

Since OPC-UA is also a software API, the same extension would allow vibration data to be streamed from one application to the next: for instance, from a first application that processes the waveform into spectrum, orbit, waterfall chart, and cascade plot, and then on to a second, 3rd-party app for higher-level advanced analytics on that processed data.

Files are used for offline transfer and storage of vibration data, such as moving data from a portable vibration tester into the vibration analytics software, comparing change over time, or sending data from a tester or online system for analysis by an expert in another location. There is no standard file format for vibration data. Storage of vibration data such as waveforms, spectra, orbits, waterfall charts, and cascade plots is done using proprietary file formats. Therefore the vibration data saved by one software can only be opened in that same kind of software. As plants deploy additional vibration online monitors and vibration transmitters from multiple vendors, they realize these don't integrate into the same software as their existing MPS and portable testers.

What is missing is a standard file format for vibration data. This could be based on XML and would need to be defined. It would enable you to see the raw waveform and analyze the processed spectrum, orbit, waterfall chart, cascade plot, etc. from all kinds of vibration sensors in the same software. The sketch below gives a feel for what such a file might contain.
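
Purely as a thought experiment, and not any existing standard, here is a sketch of what a simple XML-based waveform file could look like. All element and attribute names are invented:

```python
import xml.etree.ElementTree as ET

# Element and attribute names below are purely hypothetical; no such standard
# exists today. This only shows the kind of content such a file would carry.
waveform = ET.Element("Waveform", {
    "machine": "P-101A",
    "point": "NDE-bearing-horizontal",
    "sampleRateHz": "25600",
    "unit": "mm/s",
    "timestamp": "2021-10-14T08:15:00Z",
})
samples = ET.SubElement(waveform, "Samples")
samples.text = " ".join(f"{v:.4f}" for v in (0.012, 0.034, -0.021, 0.007))  # real files: thousands of samples

ET.ElementTree(waveform).write("waveform_example.xml", encoding="utf-8", xml_declaration=True)
print(ET.tostring(waveform, encoding="unicode"))
```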

Lube oil analysis (OA): can be done on grab samples offline or by permanent sensors online. The measurements depend on the analyzer or sensors used, but may include wear metals, additive (depletion), viscosity, water content, acid/base number, soot, other contaminants, and particle count. For online systems these can be communicated using OPC-UA.

What is missing is a standard file format for transferring offline sampled data, such as from an analyzer to a Laboratory Information Management System (LIMS). This could be based on XML and would need to be defined. It would speed up and simplify the work.

And there are all kinds of other bench top analyzers that can benefit from standard protocols and file formats.

Chromatography: A gas chromatograph (GC) measures the concentration of multiple components and the heating value. GCs from leading automation vendors already support OPC for these measurements, making those variables easy to integrate. However, the chromatogram is communicated using a proprietary protocol (or proprietary function codes of Modbus). Therefore the chromatogram viewer software must come from the same vendor as the GC hardware.

What is missing is a standard protocol for the chromatogram. This could be an extension of OPC-UA, which would need to be defined. It would enable you to view chromatograms from multiple vendors' GCs in the same Analyzer Management And Data Acquisition System (AMADAS) software, and to bring the chromatogram into 3rd-party analytics.

Files are used for offline transfer and storage of chromatograms, such as when you want to compare change over time or send the chromatogram for review by an expert in another location. There is no standard file format for chromatograms; storage is done using proprietary file formats. Therefore a chromatogram saved by one software can only be opened in that same kind of software.

What is missing is a standard file format for chromatograms. This could be based on XML and would need to be defined. It would enable you to see chromatograms from multiple vendors' GCs in the same viewer software.

And there are many other such sensors, such as 3D solids level measurement, that would benefit from standard protocols and file formats.

Calibration: The calibration route (an ordered list of devices to be calibrated) should be downloaded from the calibration management or Intelligent Device Management (IDM) software into the documenting calibrator before the instrument technician heads out into the plant. Calibrating a transmitter includes applying inputs and capturing "before" (as-found) readings, then applying the correction trim, and lastly applying inputs again and capturing "after" (as-left) readings. Once back in the office, the as-found/as-left calibration data shall be uploaded into the calibration management or IDM software. There are no standard file formats for transferring the calibration route or the as-found/as-left calibration data.

A standard file format could be based on XML and would need to be defined. It would enable you to use any documenting calibrator with any calibration management or IDM software.

Simulation Model (Digital Twin) Components: A process model, sometimes referred to as a digital twin, is a model used to simulate a process or machine. It is used in process simulation to try out control strategies and predict closed-loop behavior in order to optimize control towards a certain objective, in Operator Training Simulators (OTS) for control room operators, and in Virtual Reality (VR) for field operators to practice manual intervention during process upsets or for unit startups and shutdowns, etc. The model is also used to compute target energy consumption based on production rate for efficiency and loss control. The model is most often built manually based on first principles (1P) physics, chemistry, and mechanics. Doing this for each piece of equipment in the process is a lot of work, although templates for common equipment types help.

From an earlier part of my career as an analog electronics designer I remember the Simulation Program with Integrated Circuit Emphasis (SPICE) used to predict circuit behavior. In the software you load a CIR file for each type of IC in the circuit. The CIR file describes the internal circuitry of the chip to the software in a standard format so it can be simulated as part of the larger board-level circuit. IC manufacturers provide CIR files for their chips. Perhaps the same concept could be used for process digital twins. Manufacturers of process equipment could provide files with the model for their pump, compressor, cooling tower, air-cooled heat exchanger, heat exchanger, fan/blower, or control valve, etc. Such a file could then be imported into the process modeling software to become part of the overall process model. Process licensors could do the same for entire process units. This could save time and reduce errors, making digital twins more affordable and widespread. Keeping the digital twin up to date with changes could also become easier. What is missing is a standard file format for digital twin component parts. This could be based on XML and would need to be defined.

Invest in Digital Standards

You want your automation to be ICT-like, not IT-like, so be careful what you wish for. Go for a standard technology stack that includes WirelessHART, HART-IP, OPC-UA, and HTML5. Schedule a meeting with your I&C / digital transformation teams for 14 October, World Standards Day, or today, and forward this essay to them now. And remember, always ask for the product data sheet to make sure standards like WirelessHART, HART-IP, OPC-UA, and HTML5 are mentioned. These are key to the Fourth Industrial Revolution (4IR). Well, that's my personal opinion. What automation standards would you like to see to make your work easier? If you are interested in digital transformation in the process industries, click "Follow" by my photo to not miss future updates. Click "Like" if you found this useful and "Share" it with others if you think it would be useful to them. Save the link in case you need to refer to it in the future.

Heiko Fessenmayr

R&D LabInformatics System Architect at Agilent Technologies

4y

For chromatography data there is an emerging data standard, ADF, defined by the Allotrope Foundation. See Allotrope.org. Agilent OpenLab CDS is able to export LC data to ADF, and such data can be read with other ADF viewers. Allotrope provides a high-level ADF reader API and a basic viewer for LC ADF data.

Pedro Perdomo

Industrial Automation | Project Management | I&C | Digitalization | Value Creator |

4y

Interesting. Integration/interoperability has been at the top of the automation world's agenda since its genesis. Neither OT nor IT has gone deep into a solution, and they will not unless end users really demand it. We still don't have the promised universal I/O, i.e. an easy one; we still don't have programming code we can interchange. Etc., etc.

Cleber Santos

Industrial Systems Cybersecurity Consultant

4y

Great article! Thanks for sharing the knowledge!

Bill Lydon

Digital Manufacturing Transformation Consultant - Manufacturers are at a pivotal tipping point requiring Digital Manufacturing Transformation to succeed and prosper or become non-competitive.

4y

Jonas Berge, you need to get out more in the IT world. Applications are not portable in the OT industry: you CANNOT build a controller application one time and deploy it to any controller platform (ROK, Schneider, ABB, Siemens, Honeywell, Emerson, …). You CANNOT build an HMI application and deploy it to any automation platform (ROK, Schneider, ABB, Siemens, Honeywell, Emerson, …). You CANNOT build an alarm application and deploy it to any automation platform (ROK, Schneider, ABB, Siemens, Honeywell, Emerson, …). Don Bartusiak, Chief Engineer at ExxonMobil, has many analogies characterizing the issues; this is one: if you switched your computer from a Dell to an HP or Lenovo, would you accept having to rewrite all your Word documents, spreadsheets, presentations, and other documents? The Open Process Automation Forum is working on setting new standards for open multi-vendor interoperable automation & control systems. They are coordinating with NAMUR, the OPC Foundation, PLCopen, and other groups. Are DCS suppliers much like the mainframe companies of the past, such as IBM, Burroughs, Univac, …? DCS vendors have the opportunity to be on the right side of the automation technological advance and develop an open multi-vendor interoperability ecosystem.

Bill Lydon

Digital Manufacturing Transformation Consultant - Manufacturers are at a pivotal tipping point requiring Digital Manufacturing Transformation to succeed and prosper or become non-competitive.

4y

OT standards are significantly behind in technology and in the maturity of vendors to work together in independent open organizations to achieve seamless multi-vendor interoperability for users. Instead of open ecosystems, the OT industry has vendor-centric gated ecosystems, making it expensive and difficult for users to build cohesive multi-vendor systems. Worse yet, this situation creates lower-performing systems and lower reliability.
