What’s your use case? “Send Everything!”

Earlier this week, one of our YouTube subscribers posted a question on a video—"How to get OPC data into Microsoft Azure"—that we published back in October 2020 (time flies). While it’s an old video, his point was relevant four years ago and it’s just as relevant today.

The question:

“This looks great and simple process for sending data to the cloud. While this approach is promising and straightforward, we need to consider real-world industrial scenarios. For instance, a single injection molding machine might generate over 300 data points per second. With 20 such machines, this translates to 6,000 messages per second, which could significantly increase cloud costs and configuration efforts. This is one of the use cases and issues that I am facing in my current journey and figuring out how can we optimize this process to minimize costs while ensuring data integrity and real-time cloud delivery? I would be very glad if you could recommend some solutions. Once again this is a great start and much more helpful. Thank you.”

With the last 4 years in perspective, let me walk you through what I commonly see.

1) Customer starts by sending all raw tag data to the cloud with no context other than a tag name. They spend a lot of effort to do this and struggle with assembling the context in the cloud to enable use cases. The ROI to the business can't be measured, it's expensive, and the project dies.

2) Customer still wants to send everything, but realizes they need to contextualize at the edge. They model their assets and send everything to the cloud. Someone goes to use the data for application X, Y, or Z and realizes the data they need isn't there, isn't in the right format, or isn't scanned fast enough, etc. Enabling use cases is still hard, there is a high upfront cost to modeling everything, and cloud spend is still high. ROI is hard to measure, cost is high, and the project dies.

3) Customer takes a use case-based approach. It could be energy monitoring, scrap reduction, etc. Customer creates the data model required by the cloud application and pushes it to the plant. The plant then maps the use case to the required data. There is immediate ROI, cloud spend is predictable, and the use case sticks and scales across factories. While doing this, plants start building out their asset model as needed to make connecting future use cases faster.
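To make approach 3 concrete, here is a minimal Python sketch of a use-case-driven payload: the cloud application defines exactly the fields it needs, the plant maps its tags to those fields, and only one contextualized message per asset goes up (the tag paths, asset names, and sample values here are all hypothetical; a real implementation would read from an OPC UA server and publish via MQTT or similar):

```python
import json
from datetime import datetime, timezone

# Use-case model defined by the cloud energy-monitoring app:
# field name -> plant tag (hypothetical OPC-style paths).
ENERGY_MODEL = {
    "power_kw": "PLC1/Line3/Press01/PowerMeter.KW",
    "state":    "PLC1/Line3/Press01/Machine.State",
}

def read_tag(tag):
    """Stand-in for an OPC read; returns canned values for this sketch."""
    samples = {
        "PLC1/Line3/Press01/PowerMeter.KW": 42.7,
        "PLC1/Line3/Press01/Machine.State": "RUNNING",
    }
    return samples[tag]

def build_payload(asset_id, model):
    """One contextualized message per asset, not a raw tag stream."""
    return {
        "asset": asset_id,
        "use_case": "energy_monitoring",
        "ts": datetime.now(timezone.utc).isoformat(),
        "values": {field: read_tag(tag) for field, tag in model.items()},
    }

payload = build_payload("Press01", ENERGY_MODEL)
print(json.dumps(payload, indent=2))
```

The point of the sketch is the direction of the mapping: the use case dictates the model, and the plant fills it in, so cloud ingest volume is bounded by what the application actually consumes.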

I think too often we as an industry treat the cloud like a factory historian, where we send everything "just in case" we need it. But that's not how cloud cost structures work, and this approach leads to high-effort, high-cost initiatives that don't deliver ROI and end up hurting manufacturers and cloud vendors in the long run.

It's a long answer, but hopefully a useful perspective. The short answer is only send what you're going to use.

Ravi S

Director Marketing & Sales

4 months

Are you saying don't send any data to the cloud until one is clear about the use case and the model required for that use case? If so, how does one get the historical values to start with? All use cases won't be available at the start. Sorry, I'm not clear about what should actually be done and when.

Reply
Andreas Vogler

Technology Management at SIEMENS. SCADA, IoT, Databases, New Technologies and a passion for Open Source.

4 months

Insightful, thanks. What I don’t get: why is it a problem to just send a system-id (or machine-id) plus tag, time, value? If I publish the data to a database (in that case in the cloud), then it is easy to create lookup tables for contextualization and metadata. No need to transfer this with every record. Is it more an organizational problem? Because this is done by IT, and they need the information from OT?
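Andreas's suggestion, narrow records joined against a lookup table maintained once, could be sketched like this (the machine IDs, tags, and metadata are all made up for illustration; the real lookup table is exactly the OT knowledge the comment asks about):

```python
# Narrow records: just machine-id, tag, timestamp, value.
records = [
    ("M-07", "PowerMeter.KW", "2024-05-01T08:00:00Z", 42.7),
    ("M-07", "Machine.State", "2024-05-01T08:00:00Z", "RUNNING"),
]

# Context lives in a lookup table maintained once (by OT),
# instead of being transferred with every record.
machine_meta = {
    "M-07": {"site": "Plant A", "line": "Line 3", "asset": "Press01"},
}

def contextualize(record):
    """Join a narrow record with its metadata at query time."""
    machine_id, tag, ts, value = record
    meta = machine_meta[machine_id]
    return {**meta, "tag": tag, "ts": ts, "value": value}

for row in records:
    print(contextualize(row))
```

Technically this works; the friction the article describes is less about the join itself and more about who owns and maintains `machine_meta`, which is usually OT knowledge that IT has to extract and keep current.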

Omar Aziz Ahmed

Empowering the Largest Industrial Companies on the Planet to Innovate.

4 months

Great and useful content.

Philip H. Le

Data Science Lead @ CooperVision | Ph.D | Analytics Leadership & Machine Learning

4 months

Insightful. The use case-based approach is user/customer focused, hence it works much better than moonshot or grand infrastructure projects. However, many organisations still haven't got a Data team (governance, data management, etc.). There is a need to upskill current IT/OT teams to understand what is required for this journey.


More articles by Aron Semle
