For all those conversations ending with "Historian vs. Industrial IoT" - Part 1/2


History has a tendency to repeat itself. As memory fades, events from the past can become events of the present. That's why preserving memories matters, and learning from history matters even more.

Only when you can see the past can you understand the present and guess, with some confidence, how the future might look.

In the manufacturing industry, we have been using historians for decades to preserve manufacturing memories, or events. As we all know, historians are technically tuned for temporal (i.e. time series) data only. In theory, any data, including a bitmap image, can be represented as temporal data, but with a lot of overhead and post-processing before it can be stored as memories within a historian. The same overhead applies when we try to retrieve the converted temporal data for consumption. This conversion overhead is why historians are preferred for data from L1/L2 automation systems and data loggers dealing with pure time series data.

Historians are excellent at compressing the large data streams coming from the underlying factory automation systems and data loggers. They employ a number of algorithms to compress the data while maintaining its fidelity.

For example, if a sensor sends readings at X readings/second (let us call this raw data), the historian might end up archiving Y readings/second (the archived data), where Y is smaller than X: fewer points are archived than are received from the sensor.

The algorithms within historians ensure that data fidelity is preserved: a time series graph plotted from the archived data looks essentially the same as one plotted from the raw data.

This essentially means:

  1. End users get a true sense of the changing patterns within the time series without having to deal with huge data volumes (storage)
  2. Trends render with lower latency and complexity, since there are far fewer data points to plot than with the raw data
  3. The above two points also mean a very low-maintenance architecture

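To make the X vs. Y example above concrete, here is a minimal sketch of deadband (also called "exception") compression, one of the simplest techniques in this family. This is purely illustrative: real historians use more sophisticated methods (such as swinging-door compression), and the threshold value here is an assumed, made-up figure.

```python
def deadband_compress(samples, deadband=0.5):
    """Archive a sample only when it differs from the last archived
    value by more than the deadband; drop the rest.

    samples: list of (timestamp, value) pairs, in time order.
    """
    archived = []
    last = None
    for t, value in samples:
        if last is None or abs(value - last) > deadband:
            archived.append((t, value))
            last = value
    return archived

# 10 raw readings: small jitter around 20.0, then a real step change to ~23.0
raw = [(0, 20.0), (1, 20.1), (2, 19.9), (3, 20.2), (4, 20.0),
       (5, 23.0), (6, 23.1), (7, 22.9), (8, 23.0), (9, 23.2)]

archived = deadband_compress(raw, deadband=0.5)
print(len(raw), "raw points ->", len(archived), "archived points")
# prints: 10 raw points -> 2 archived points
```

Note that the jitter is dropped but the step change survives, which is exactly the fidelity property described above: a trend plotted from the two archived points still shows the shape of the signal.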
Most historians come with a no-coding philosophy for building basic trend dashboards, using drag-and-drop and Excel-based configuration tools. These basic dashboards can be developed and maintained by the factories themselves.

Time series data is of little use in manufacturing unless it is put into the context of the six pillars of manufacturing: man, machine, material, methods, SOPs and energy.

Historians may capture the manufacturing history, but they were never designed to provide the manufacturing context. Historians are fundamentally designed to be data providers and to play in an ecosystem alongside other IT/OT manufacturing systems such as MES, LIMS and, in some cases, ERP to create that manufacturing context.

So, where is the problem? You have a historian, you have MES and/or LIMS, and of course you have ERP. Why can the manufacturing context not be created and leveraged for value generation?


Many a time, the answer lies in the problem itself.

You have multiple systems, and that is the problem. You have manufacturing context only within the scope of the system you are working with. ERP systems are primarily designed to give an accounting perspective on manufacturing. MES is primarily responsible for the traceability perspective, and LIMS primarily covers the quality assurance perspective.

These pieces of manufacturing context exist, but they are scattered. There is no single place that gives you a unified manufacturing context across your value stream, end to end, at the level of an individual batch/product/SKU as it moves from one work centre to another.

Interestingly, all the systems mentioned above might be connected to your historian in some form or other at some point in time. Most of the time, this integration is "hard-wired": over the years, as your business grew and your manufacturing capability expanded, you kept extending the historian's time series data to whichever systems asked for it.

A no-brainer: connect to the historian, get whatever you need, do whatever you want.

Next time you need something else, just repeat the loop: connect to the historian, get whatever, do whatever. And this goes on and on, forever.

By this time, you might have the impression that historians can be "the" solution for all your needs, the one size that fits all your data purposes.

"Historian only" approach works best when

  1. You are consciously taking a decision to become manufacturing context silo-ed
  2. Ready to be repeating the loop very time, whenever you need some set of data from historian to be integrated with your enterprise IT/OT systems.
  3. Data is achieved only for the fear of not losing it and for compliance purpose. May be much like a cockpit voice recorder or a black box in an aircraft.
  4. Analytics is just limited to trending of time series data and derive insights based on experience and intuition

This historian-centric approach may not work when the focus is on:

  1. Building an innovative, demand-driven value stream - one that can change shape and form with agility over time. This can only be achieved if you have information on how your value streams react to changes in the demand profile and evolve over time, and the only way to get that is to build an end-to-end data model of your manufacturing across the value stream. This requires a platform that can ingest high volumes of data, process it efficiently and, most importantly, avoid being "hard-wired": it should be able to connect to every data source that is of interest to manufacturing, and even to sources that are not of interest at this point in time.
  2. Batch size of one - one of the fundamental characteristics of Industry 4.0 is a value stream that can cater to a batch size of one, otherwise known as "mass customisation". Clearly, this requires your machines to communicate with each other and re-configure rapidly. Every machine participating in producing that single-SKU batch must know the activity it needs to perform, re-configure itself accordingly, and be aware of the upstream and downstream activities it needs to coordinate with. Needless to say, this implies mashing up manufacturing data with IoT data from supporting services such as EHS, logistics and supply chain, and it requires applications and systems that are bigger, faster and technologically more advanced than historians are by design, to augment the current generation of historians (which were designed and coded in the pre-big-data era).
  3. Physical to Digital to Physical (PDP) - yes, you read that right. Industry 4.0 is also about converting the physical world into a digital model and then using that model to understand the physical behaviour of objects and fine-tune the physical objects further. For example, your car has sensors fitted by the manufacturer, accessible through its OBD interface. OBD data can be used to build a digital model of your car, including your driving pattern. Coupled with your demographics, this gives the vehicle's designers enough information to further fine-tune the car's physical performance characteristics, perhaps in the next release. This complex PDP interaction at some point intersects with the manufacturing value stream, and that is where it requires applications and systems that are bigger, faster and technologically more advanced than historians are by design, to augment the current generation of historians (which were designed and coded in the pre-big-data era).
  4. Manufacturing with HITL (human in the loop) - building an enterprise data platform for running any sort of analytics, from continuous monitoring of process variables for automated process monitoring to highly sophisticated analytical (AI) models that enable plants to run in autopilot mode with humans only in the loop. When we stretch our imagination towards a future where everything is sensorised and everything is automated, data of course needs to be handled, processed and managed at petabyte scale, and that is where it requires applications and systems that are bigger, faster and technologically more advanced than historians are by design, to augment the current generation of historians (which were designed and coded in the pre-big-data era).
Well, by now you may be confused.

The age-old historian, you thought, was "the" only data platform you needed. But a little way into the future things look interesting, and given the pace at which technology is progressing and manufacturing is adopting it, none of this looks like sci-fi anymore.

So, what's the way forward? Be straight with me: do I need a historian or not? How should I augment my existing historian(s)? I have not invested in any historian; is it high time to get one? Should I invest in an Industrial IoT platform instead of a historian? IoT means lots and lots of data, but do I still need a historian if I subscribe to an IoT platform? A data lake is cheap; can it not be used as a historian?

If those questions have now popped up, watch for part 2 of this blog post, where I will answer them with real-life examples.

PS: I love historians. I have experience implementing a number of them under a variety of constraints. This article is my own view on why (and why not) to discard the historian from your OT landscape.

[The views expressed in this blog are the author's own and do not necessarily reflect the views of his employer, Wipro Limited]





