What about the ‘Little Data’ we have? - Part I
Chaiitanya Bulusu
Unlocking Asset Potential with RCM | Business Head and SVP -AMERICAS
Most discrete part manufacturers are constantly searching for ways to improve profitability. While some follow an approach based on gut feel and intuition, it is more productive to base operational improvement efforts on hard facts. Achieving this objective is often hindered, however, by the lack of timely data from field instruments, machines, and automation systems.
When a report reaches the C-suite and a decline in production or an increase in energy consumption is noted, it can be hard to trace the root cause. Too often, the iterations of a continuous improvement cycle run on a yearly or half-yearly basis, if at all, and proceed as a tedious top-down investigation. But what if these same operational teams had the tools needed to support a more scientific approach?
The right automation hardware and software can support these efforts by taking advantage of Industrial Internet of Things (IIoT) devices and communications to bring analytics to bear at the edge or at a consolidated location. In this article I want to explain how edge automation concepts and digital transformation processes support the collection and analysis of data, enabling users to obtain the insight necessary to reduce the cycle time of monitoring, analyzing, and improving discrete part manufacturing operations.
A scientific approach
Production plants comprise many different types of machinery from different OEMs, along with supporting equipment and utilities. At a high level, operations personnel want to:
- Improve throughput
- Maintain quality
- Reduce waste
- Minimize power consumption
- Maximize uptime
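As a purely illustrative example, several of these goals roll up into familiar shift-level metrics such as overall equipment effectiveness (OEE). The numbers and the simplified formula below are hypothetical, not drawn from any particular plant:

```python
# Illustrative only: shift-level "little data" rolled up into availability,
# performance, quality, and OEE. All values are made-up examples.
planned_minutes = 480        # one 8-hour shift
downtime_minutes = 45        # unplanned stops
ideal_cycle_time_s = 30      # seconds per part, from the machine specification
total_parts = 820
scrap_parts = 12

availability = (planned_minutes - downtime_minutes) / planned_minutes
performance = (total_parts * ideal_cycle_time_s / 60) / (planned_minutes - downtime_minutes)
quality = (total_parts - scrap_parts) / total_parts
oee = availability * performance * quality

print(f"Availability {availability:.1%}, Performance {performance:.1%}, "
      f"Quality {quality:.1%}, OEE {oee:.1%}")
```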
Sometimes these measures are pursued at a relatively micro scale, for a single machine or piece of equipment. At other times they have a much wider scope as part of macro-level business optimization.
These improvements are made possible by following a continuous improvement process in an iterative pattern. A practical optimization model, built on the digital transformation of industrial systems, requires the organization to:
- Gather data
- Connect it with an architecture
- Analyze it
- Deploy solutions
- Repeat
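As a schematic sketch only, this cycle can be expressed as a simple loop; every function below is a hypothetical placeholder for the real collection, integration, analysis, and deployment activities:

```python
# Schematic sketch of the optimization loop. Every function is a hypothetical
# stand-in for the activities described in the list above.
def gather_data():
    return [{"tag": "line_speed", "value": 47}]       # e.g. polled from a PLC

def connect(records):
    return {r["tag"]: r["value"] for r in records}    # land data in one structure

def analyze(dataset):
    return [tag for tag, value in dataset.items() if value < 50]  # find shortfalls

def deploy(findings):
    print("Adjust settings or procedures for:", findings)

for cycle in range(3):                                # ...and repeat
    deploy(analyze(connect(gather_data())))
```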
Some readers may notice that this cycle of continuous improvement has many similarities to the scientific method of investigation and learning. The typical steps of the scientific method are expressed as:
- Observation and question
- Research and hypothesis
- Experiment and obtain data
- Analyze
- Draw conclusions and report
- Repeat!
The scientific method is readily adapted to provide a complete framework for applying digital transformation methods and optimizing manufacturing operations. As each improvement cycle is completed successfully, upgrades are applied to the manufacturing process, and the methods and procedures used are enhanced as well.
Digital transformation is a fundamental part of this improvement cycle, because it is an ongoing journey to digitize the information needed to effectively support these efforts.
So how little is "little"?
The data required to assess these characteristics may flow through large enterprise software environments, including supervisory control and data acquisition (SCADA), manufacturing execution system (MES), and enterprise resource planning (ERP) platforms. Some of the data may arrive in a very manual format, handwritten on forms or entered via spreadsheets.
Much of the most interesting data comes from programmable logic controllers (PLCs) in the operational technology (OT) domain, including production-related values such as operating rates, part counts, and temperatures. Some data may be facilities information sourced over on-site information technology (IT) systems. Still more information is related to asset management, such as remote vibration readings and other parameters. Often an out-of-the-box asset performance platform is used to consolidate, contextualize, and visualize such data, which may be valuable for both machine operators and entire manufacturing plants.
Such a large variety of "little data" sources complicates the format and timeliness of data availability. Users must consolidate the information, analyze it for useful results, and then apply changes. They then repeat the cycle, regularly and rapidly, in a process of continuous improvement. As we'll see, IIoT-based devices and strategies can streamline much of this procedure by creating a framework where data flows more easily, speeding up the overall change cycle.
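As a minimal sketch of this consolidation step, assuming the pandas library and with all tag names, timestamps, and values invented for illustration, two very different "little data" streams can be aligned into one timestamped table:

```python
# Illustrative consolidation of "little data" from two sources into one table.
# Tags, timestamps, and values are hypothetical.
import pandas as pd

# Values polled from a PLC in the OT domain
plc_data = pd.DataFrame({
    "timestamp": pd.to_datetime(["2023-05-01 08:00", "2023-05-01 09:00"]),
    "part_count": [410, 845],
    "line_speed": [47, 46],
})

# Manual quality checks, e.g. exported from a spreadsheet
quality_data = pd.DataFrame({
    "timestamp": pd.to_datetime(["2023-05-01 08:30", "2023-05-01 09:30"]),
    "defects": [3, 1],
})

# Align the two streams on time so they can be analyzed together
combined = pd.merge_asof(
    plc_data.sort_values("timestamp"),
    quality_data.sort_values("timestamp"),
    on="timestamp",
    direction="nearest",
)
print(combined)
```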
Starting the cycle
A logical starting point is to define the objective of process change by asking, "How do we optimize operations?" and hypothesizing, "By tuning and adjusting our equipment and manufacturing processes." To set the change cycle in motion, it is vital to gather all the significant "little data" so it can be aggregated into "big data" and then analyzed.
To be sure, the task can be overwhelming due to the huge number of potential data points. Many implementers find that a valuable preparatory step is to perform an asset criticality analysis. Users assess the reliability, detectability, and consequences of equipment performance and failure in an impartial way to identify the most significant pain points and determine which equipment should be addressed first. Essentially, this ensures that the "low-hanging fruit" of optimization efforts is harvested first, leading to early savings and building organizational enthusiasm for ongoing projects.
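One way to picture such a criticality analysis is a simple scored ranking. The assets, 1-to-5 scales, and product-based index below are hypothetical examples, not a standard:

```python
# Illustrative asset criticality ranking: each asset is scored 1-5 for
# likelihood of failure, difficulty of detection, and consequence of failure.
# Asset names, scores, and the product-based index are hypothetical.
assets = [
    {"name": "Press 3 hydraulic unit", "likelihood": 4, "detection": 3, "consequence": 5},
    {"name": "Conveyor 12 gearbox",    "likelihood": 2, "detection": 4, "consequence": 3},
    {"name": "CNC 7 spindle",          "likelihood": 3, "detection": 2, "consequence": 4},
]

for asset in assets:
    # Higher index = more critical; these assets are addressed first.
    asset["criticality"] = asset["likelihood"] * asset["detection"] * asset["consequence"]

for asset in sorted(assets, key=lambda a: a["criticality"], reverse=True):
    print(f'{asset["name"]}: {asset["criticality"]}')
```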
With the most critical assets identified, users drill down into the target asset types with a consistent approach to gathering the necessary data. This may include:
- Production rates or failure indications from the control platform
- Other sensed or analytical values
- Equipment health indications, such as vibration or bearing temperatures
Because of the many stand-alone automation platforms in most industrial plants and facilities, encompassing legacy systems and communication protocols, it is common for implementations to require specialized approaches for obtaining little data. Sometimes, on the first pass, it is only possible to use whatever data is already easily available.
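For instance, here is a minimal sketch of pulling a few such values from a legacy controller, assuming a PLC reachable over Modbus TCP and the open-source pymodbus client library. The IP address, register addresses, and tag names are hypothetical, and exact call signatures vary by device and library version:

```python
# Illustrative polling of "little data" from a legacy PLC over Modbus TCP.
# Addresses and scaling would come from the machine builder's documentation.
from pymodbus.client import ModbusTcpClient

TAGS = {               # hypothetical holding-register addresses
    "part_count": 100,
    "line_speed": 102,
    "motor_temp_c": 104,
}

client = ModbusTcpClient("192.168.1.50", port=502)   # hypothetical PLC address
if client.connect():
    readings = {}
    for name, address in TAGS.items():
        result = client.read_holding_registers(address, count=1)
        if not result.isError():
            readings[name] = result.registers[0]
    client.close()
    print(readings)   # e.g. {'part_count': 1532, 'line_speed': 47, 'motor_temp_c': 63}
```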
Targeted approach
Modern IIoT methods and products can help users get data out of isolated platforms in many ways:
- Edge devices: Single points of data collection, often wireless, transmitting data to other edge solutions.
- Edge gateways: Collect and forward OT, facilities, and asset management system data streams.
- Edge computing: Computer-based products able to act as a gateway and perform additional storage and analytical tasks.
- Edge controllers: Combine deterministic control like a PLC with general-purpose edge computing capabilities.
Edge devices, gateways, computing, and controllers can be added to existing systems as needs and budgets allow. New systems can be designed around edge computing and edge controllers from the beginning, so they are already positioned to obtain and process OT data and make the results available to higher-level systems.
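As a sketch of that edge-processing pattern, assuming the paho-mqtt package, a hypothetical broker, and invented topic names and sample values, an edge node might summarize raw readings locally and forward only the compact result upstream:

```python
# Illustrative edge computing node: summarize raw OT readings locally, then
# forward only a compact result to a higher-level system. Broker address,
# topic, and sample values are hypothetical.
import json
import statistics
import paho.mqtt.publish as publish

def summarize(samples):
    """Reduce a window of raw sensor samples to a small summary record."""
    return {
        "count": len(samples),
        "mean": round(statistics.mean(samples), 3),
        "max": max(samples),
    }

# Raw samples would normally come from local I/O or a fieldbus driver
vibration_window = [0.12, 0.15, 0.11, 0.31, 0.14]   # mm/s, illustrative

publish.single(
    topic="plant/line1/pump7/vibration_summary",    # hypothetical topic
    payload=json.dumps(summarize(vibration_window)),
    hostname="edge-broker.local",                   # hypothetical broker
)
```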
The activity of identifying and connecting with little data at the edge is rarely a one-time event. Indeed, at each iteration of the improvement cycle, users should evaluate any newly needed data. This iteration is necessary to continually build up the data models that support deeper analysis.
To be continued.