Converging the worlds of OT and IT
How do you bridge the divide between Operational Technology (OT) and Information Technology (IT) in industrial systems? In a recent post on LinkedIn, I mentioned that I was working on my largest industrial controls project to date and touched on the convergence of OT and IT, particularly around cloud data platforms.
That post sparked several direct messages asking how I’ve integrated data from PLCs into a big data solution, so I thought it was worth an article looking at this in a little more detail.
History
Firstly, I want to write about how I got to this point, as there have been a few lessons learnt. Four years ago, my team and I were approached by a client to build a solution that allowed them to monitor and control some industrial energy assets. With an extensive background in software development and Microsoft Azure, we naturally took a software-based approach to the problem. We built a solution around Azure IoT Edge with custom C# modules. We developed modules for Modbus devices, Analog IO (4-20mA), Digital IO and a module for asset optimisation. We additionally used Azure Stream Analytics on the Edge for some data aggregation tasks.
For those unfamiliar with Azure IoT Edge, it is essentially an abstraction over Docker that allows you to author and deploy custom modules while taking advantage of Azure IoT Edge features such as security and managed deployments.
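To make the module idea concrete, here is an abridged sketch of the kind of deployment manifest Azure IoT Edge uses to describe which modules run on a device. The module names, registry and image tags below are illustrative, not our actual configuration, and a real manifest also includes schema versions, the system modules and runtime settings:

```json
{
  "modulesContent": {
    "$edgeAgent": {
      "properties.desired": {
        "modules": {
          "modbusReader": {
            "type": "docker",
            "status": "running",
            "restartPolicy": "always",
            "settings": {
              "image": "ourregistry.azurecr.io/modbus-reader:1.0",
              "createOptions": "{}"
            }
          },
          "assetOptimiser": {
            "type": "docker",
            "status": "running",
            "restartPolicy": "always",
            "settings": {
              "image": "ourregistry.azurecr.io/asset-optimiser:1.0",
              "createOptions": "{}"
            }
          }
        }
      }
    }
  }
}
```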
The diagram below shows a high-level view of Azure IoT Edge.
While the software approach was a success, there were a number of issues we faced over time, causing occasional downtime of the assets. While often quick to resolve, the downtime could cost the client several thousand pounds per hour. As the number and complexity of the sites grew, we knew we needed a more robust solution.
Trial and error?
The increasing number of deployments and site requirements provided us with an opportunity to rethink our solution architecture. We needed to increase operational resilience while also extending the capabilities of the solution. We decided the best way to meet both of these objectives was to introduce a PLC to the architecture. The PLC would be responsible for communication with the assets and some control, while the edge compute would continue to provide advanced optimisation of those assets and data orchestration with the cloud. We chose the Siemens S7-1200 range of PLCs based on their availability and low cost, but the trade-off was a small amount of program memory – something most of us programmers don’t have to think about too much in the age of cloud computing and workstations with enormous amounts of hard drive space and memory.
Enter the world of converging OT and IT. The first major challenge was: how do we read and write real-time data from the PLC? While there are third-party solutions that would have allowed us to do this, licensing costs, especially against the backdrop of an increasing number of sites, made this prohibitive.
We first settled on hosting an MQTT broker on the edge compute and having the PLC publish data to it. Additionally, the PLC would subscribe to some topics, enabling the algorithms on the edge to exchange data with, and instruct, the PLC. Essentially we'd structured it to be a mini Unified Namespace.
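A minimal sketch of how such a mini Unified Namespace might be laid out. The topic hierarchy and tag names here are invented for illustration, not our actual schema, and in the real system the publishing and subscribing would be done by an MQTT client library (e.g. paho-mqtt) rather than plain prints:

```python
import json


def uns_topic(site: str, area: str, asset: str, tag: str) -> str:
    """Build a Unified-Namespace-style topic path, e.g. site1/energy/battery1/soc."""
    return f"{site}/{area}/{asset}/{tag}"


def telemetry_payload(value: float, timestamp: str) -> str:
    """Serialise a single tag reading as JSON -- trivial on the edge compute,
    painfully instruction-heavy on a small PLC."""
    return json.dumps({"value": value, "timestamp": timestamp})


# Telemetry flows PLC -> broker on topics like this; commands flow the other
# way on topics the PLC subscribes to (e.g. .../setpoint).
topic = uns_topic("site1", "energy", "battery1", "soc")
payload = telemetry_payload(87.5, "2024-01-01T00:00:00Z")
print(topic)    # site1/energy/battery1/soc
print(payload)
```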
Lesson 1 - Choose the Right Tool for the Right Job
Just because something is possible on a PLC doesn't mean you should do it! Offloading complex operations to an edge compute device, rather than pushing them onto the PLC, preserves memory and improves efficiency.
PLCs love to handle numbers and bits, and they excel at this, but when it comes to handling strings, things quickly start to fall apart. Constructing a JSON string to send to the MQTT broker took a large number of instructions. Not only does it become slow to execute and complex to manage, it consumes a large amount of that tiny program memory you have available. I quickly used all of the memory available on the baby S7-1200 and had to purchase a higher-end S7-1200 to continue working.
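For contrast, the same serialisation that consumed a large block of PLC instructions is a single call on the edge compute (the field names here are illustrative):

```python
import json

# One reading serialised to JSON -- a single library call on the edge,
# versus many string-handling instructions in PLC code.
payload = json.dumps({"asset": "battery1", "power_kw": 42.0, "soc_pct": 87.5})
print(payload)
```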
I realised this wasn’t going to scale beyond a couple of assets, if I was lucky, so I needed to find a new approach.
Lesson 2 - Plan for Scalability from the Start
Small design choices, like relying heavily on limited-memory devices, can limit scalability and flexibility as your system grows.
Final solution
The second iteration, which is the one we have now successfully deployed, is scaling, is easier to maintain and doesn’t require non-standard programming. I settled on using the PLC’s inbuilt OPC UA server. I’d initially been put off by OPC UA for a number of reasons, but for what I needed to achieve it’s actually been the perfect companion.
On the edge compute I no longer need to host the MQTT broker; instead, the various IoT Edge modules are now integrated with OPC UA capabilities. The modules are config-driven using their module twin settings, meaning that we can read just the data we need from each PLC. One module might only require one value from a PLC, while another might require multiple values from multiple PLCs. This is all possible with this approach.
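A sketch of the config-driven idea. The twin-settings schema, endpoints and OPC UA node IDs below are invented for illustration (they are not our actual configuration); in the real modules each flattened entry becomes a read against the PLC's OPC UA server via a client library:

```python
# Hypothetical module-twin desired properties: each entry names a PLC
# endpoint and the OPC UA node IDs this particular module needs from it.
twin_config = {
    "plcs": [
        {
            "endpoint": "opc.tcp://plc-01:4840",
            "nodes": {"soc": 'ns=3;s="Battery".SoC', "power": 'ns=3;s="Battery".Power'},
        },
        {
            "endpoint": "opc.tcp://plc-02:4840",
            "nodes": {"grid_freq": 'ns=3;s="Grid".Frequency'},
        },
    ]
}


def reads_for(config: dict) -> list:
    """Flatten the twin config into (endpoint, tag, node_id) read requests,
    so a module only ever reads the values it was configured for."""
    return [
        (plc["endpoint"], tag, node_id)
        for plc in config["plcs"]
        for tag, node_id in plc["nodes"].items()
    ]


for endpoint, tag, node_id in reads_for(twin_config):
    # In the real module this line would be an OPC UA read against the
    # PLC's server; here we just list the configured requests.
    print(endpoint, tag, node_id)
```

One module's twin might list a single node from one PLC; another's might span several PLCs, and the module code never changes, only its config.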
Lesson 3 - Embrace Industry Standards for Compatibility
While custom solutions, like MQTT with a PLC, might seem feasible initially, industry-standard protocols such as OPC UA can offer greater reliability and long-term compatibility.
The diagram below shows the overall architecture of the solution.
The outcomes of this project have been transformative, not only for operational efficiency but also for the client’s bottom line. By combining the robust control capabilities of PLCs with the advanced data processing and analytics of edge computing, the new architecture has drastically reduced system downtime, which was previously a costly vulnerability. This hybrid approach has also enhanced scalability, allowing for seamless deployment across multiple sites without significant development effort.
Data flows more reliably and securely from the operational layer to the cloud, enabling real-time insights and more accurate predictive analytics. As a result, the client can now optimise asset performance, reduce maintenance costs, and make data-driven decisions with greater confidence, demonstrating how converging OT and IT can create a resilient, future-ready solution in industrial settings.
Lesson 4 - Focus on Resilience in High-Availability Systems
In environments where downtime is costly, reliability is paramount. Shifting critical control functions from software-based systems to PLCs, which are built for robust, real-time operation, helps ensure that essential controls remain online, even when network or edge compute failures occur.
As OT and IT continue to converge, what challenges and solutions have you encountered? Let's connect and share insights!