Data Integrity in Automated Systems

The integrity of critical data generated in production requires that the data be kept complete, consistent, and accurate throughout its life cycle, which in some industries can last from 25 to 30 years.

The automated systems industry has built tools into its systems so that pharmaceutical and medical product manufacturers can demonstrate that they meet regulatory requirements, and this can be verified during the system validation process.

The need to keep all relevant GxP data stored, whether electronic or on paper, throughout its life cycle can become a problem, especially for paper records.


Data stored on paper

In some companies, where data is kept on paper for a long time, the records are usually stored in a room that can be attacked by insects, which could destroy the paper and, with it, all the information it contains (just one example of a problem that may occur). Beyond that, searching for specific data stored under these conditions can become an impossible task, which makes the storage practically useless.

When the storage is electronic, it is necessary to pay attention both to these problems during the retention of this critical data and to the choice of media on which the data will be stored.

We sometimes find systems that have been running in the factory for 20 or 25 years, often built by suppliers that no longer exist, emitting data based on standards and protocols that, because they are no longer used, are no longer supported.

In addition, this data is often protected by passwords set by professionals who have since left the company. This can create an obstacle when trying to gather the data, visualize it, and even confirm that it is complete.

Generally, these systems were developed without due attention to the data flow (the flow of data generated by the system), without planning the origin and destination of the data or how that data would be accessed.

Therefore, it is not uncommon to find critical data stored in different ways, sometimes with the same data stored in two different databases.

In this case, how do you know which data is correct, which was recorded in real time, and which is the copy? Not having a single data repository is always a problem.

Sometimes the production process requires different data repositories, such as a LIMS (Laboratory Information Management System), which will probably have its own data repository. Linking those repositories to the production process or database becomes a very important action, because an electronic link (interface) is created observing security parameters, such as encryption, within the requirements of data integrity.

With this, the data repository becomes a single source, available to everyone who needs to access, view, and report data.
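
As an illustration, the sketch below (Python) pulls laboratory results from a hypothetical LIMS endpoint over an encrypted HTTPS interface and writes them into a single central repository. The URL, field names, and table layout are assumptions made for the example, not a real LIMS API.

import json
import sqlite3
import urllib.request

LIMS_URL = "https://lims.example.local/api/batch-results"  # hypothetical endpoint

def fetch_lims_results(url: str) -> list:
    # urllib verifies the server's TLS certificate by default, so the link is encrypted
    with urllib.request.urlopen(url, timeout=30) as resp:
        return json.load(resp)

def consolidate(results: list, db_path: str = "central_repository.db") -> None:
    # Write every laboratory result into one central table so there is a single repository
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS lab_results ("
                "batch_id TEXT, test_name TEXT, value REAL, recorded_at TEXT)")
    con.executemany(
        "INSERT INTO lab_results VALUES (?, ?, ?, ?)",
        [(r["batch_id"], r["test_name"], r["value"], r["recorded_at"]) for r in results])
    con.commit()
    con.close()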


Is stored electronic data secure?

If someone accesses the database improperly and can view the data or file, whether it is a text file or a CSV file, it is likely that this user can tamper with it.


For example, if you can access an Excel file protected by an open password, that password can be broken in minutes using tools available on the internet.
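
One way to make such violations detectable is to record a cryptographic hash of each stored file when it is archived and compare it again before the file is trusted. A minimal sketch in Python, with an example file name:

import hashlib

def file_sha256(path: str) -> str:
    # Read the file in chunks and return its SHA-256 digest
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# At archiving time, store the digest alongside the record (for example, in the database)
expected = file_sha256("batch_0421_results.csv")

# Later, before trusting the file again, recompute the digest and compare
if file_sha256("batch_0421_results.csv") != expected:
    raise RuntimeError("Integrity check failed: the file was modified after archiving")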

A lot of data is entered into systems manually, and because it is a manual action, the more data entered manually, the more likely it is that some of that data will be incomplete. This leads to a longer system validation process and all the tasks related to that validation, as well as verification of the associated risks and manual mitigation actions to ensure integrity.
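
A simple way to reduce incomplete manual entries is to validate each record at the point of entry, before it is accepted by the system. A minimal sketch, with illustrative field names:

REQUIRED_FIELDS = {"batch_id", "operator", "timestamp", "weight_kg"}

def validate_manual_entry(record: dict) -> list:
    # Return a list of problems; an empty list means the entry can be accepted
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    if "weight_kg" in record:
        try:
            if float(record["weight_kg"]) <= 0:
                errors.append("weight_kg must be positive")
        except (TypeError, ValueError):
            errors.append("weight_kg must be numeric")
    return errors

entry = {"batch_id": "L-2024-017", "operator": "jsilva", "weight_kg": "12.5"}
print(validate_manual_entry(entry))  # -> ["missing fields: ['timestamp']"]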


Is it safe to acquire supervisory data?

Storing supervisory data (from a SCADA, Supervisory Control and Data Acquisition, system) without protecting data at the instrumentation level can lead to loss of data in the event of a network failure.

If data is lost at the instrumentation level while production is in the middle of a batch, that batch is compromised and must go through a series of mitigating actions, including opening a deviation investigation, before being released to the market. A form of redundancy must therefore be structured during design.
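
One common form of such redundancy is a store-and-forward buffer on the instrumentation side: readings are persisted locally first and only marked as sent after the central historian confirms receipt, so a network failure in the middle of a batch does not lose data. A minimal sketch, where send_to_historian stands in for whatever transport the plant actually uses:

import sqlite3
import time

def buffer_reading(db: sqlite3.Connection, tag: str, value: float) -> None:
    # Persist the reading locally before any attempt to transmit it
    db.execute("CREATE TABLE IF NOT EXISTS buffer "
               "(ts REAL, tag TEXT, value REAL, sent INTEGER DEFAULT 0)")
    db.execute("INSERT INTO buffer (ts, tag, value) VALUES (?, ?, ?)",
               (time.time(), tag, value))
    db.commit()

def flush_buffer(db: sqlite3.Connection, send_to_historian) -> None:
    # Forward buffered readings; unsent rows survive a network failure for the next attempt
    rows = db.execute("SELECT rowid, ts, tag, value FROM buffer WHERE sent = 0").fetchall()
    for rowid, ts, tag, value in rows:
        try:
            send_to_historian(ts, tag, value)  # may raise on network failure
            db.execute("UPDATE buffer SET sent = 1 WHERE rowid = ?", (rowid,))
            db.commit()
        except OSError:
            break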


Reports

An important topic is reporting. Reports must be able to access all the required data. For example, before releasing a batch of medicines, Quality Assurance must ensure that the release is based on production data, analyses, and all events related to that batch.

Reports should be built to make work easier, for example by reporting only the exceptions, with quick and simple access. Reports should therefore be flexible and adaptable. A system is more likely to meet this type of requirement if it is developed and implemented from scratch, that is, a new system.
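
As an illustration, an exception report can simply check each batch event against its acceptance limits and list only what fell outside them. A minimal sketch, with illustrative limits and event fields:

LIMITS = {"temperature_c": (18.0, 25.0), "pressure_bar": (0.9, 1.1)}

def exception_report(events: list) -> list:
    # Keep only the events whose value falls outside the configured limits
    exceptions = []
    for ev in events:
        limits = LIMITS.get(ev["parameter"])
        if limits is None:
            continue
        low, high = limits
        if not (low <= ev["value"] <= high):
            exceptions.append(ev)
    return exceptions

events = [
    {"parameter": "temperature_c", "value": 22.4, "timestamp": "2024-05-02T10:00"},
    {"parameter": "temperature_c", "value": 27.1, "timestamp": "2024-05-02T10:05"},
]
print(exception_report(events))  # only the 27.1 reading is listed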

An ideal automation system, from the point of view of Data Integrity

Standardized communication protocols

With regulations becoming more rigorous and covering more areas, the first step in designing an ideal automated system is to standardize communication protocols (the set of rules that allows information to be transmitted between two or more entities). When deciding which protocol to use, it is necessary to ensure that the data is encrypted and secure.
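
As a minimal illustration of carrying data over an encrypted channel, the sketch below opens a TLS connection (with certificate verification) before anything is transmitted; the host name, port, and payload are placeholders, not a specific plant protocol:

import socket
import ssl

context = ssl.create_default_context()  # verifies the server certificate
with socket.create_connection(("historian.example.local", 8883), timeout=10) as raw:
    with context.wrap_socket(raw, server_hostname="historian.example.local") as tls:
        # Everything sent through this socket is encrypted in transit
        tls.sendall(b'{"tag": "TT-101", "value": 22.4}\n')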

The data flow should be planned as efficiently as possible, including access control for the repository and a record of who accessed it and from where.
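
A minimal sketch of that kind of access record, logging who accessed the repository, from where, and what was done (the log layout is an illustrative choice):

import logging

logging.basicConfig(filename="repository_access.log",
                    format="%(asctime)s %(message)s", level=logging.INFO)
access_log = logging.getLogger("repository.access")

def log_access(user: str, source_ip: str, action: str, record_id: str) -> None:
    # One line per access: who, from where, what action, on which record
    access_log.info("user=%s ip=%s action=%s record=%s", user, source_ip, action, record_id)

log_access("mrossi", "10.20.4.17", "READ", "batch/L-2024-017")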


Integrated system covering all production

In addition, the system must cover all stages of production and be integrated. This way, different types of devices can be connected on a single platform, with a single visualization tool, enabled for Audit Trail and customized for the professional currently logged into the system.

If this professional is responsible for the company's packaging area, he has access to the information of his own work area; there is no need for him to have access to data about the raw material used in production, for example.
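
A minimal sketch of this kind of role-based visibility, with illustrative role names and area keys:

ROLE_AREAS = {
    "packaging_supervisor": {"packaging"},
    "production_manager": {"packaging", "granulation", "raw_materials"},
}

def visible_data(role: str, plant_data: dict) -> dict:
    # Return only the areas the logged-in role is allowed to see
    allowed = ROLE_AREAS.get(role, set())
    return {area: data for area, data in plant_data.items() if area in allowed}

plant_data = {"packaging": {"line_speed": 120}, "raw_materials": {"lot": "RM-88"}}
print(list(visible_data("packaging_supervisor", plant_data)))  # ['packaging']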


Possibility of identifying an error in the middle of production

If an error occurs in a particular phase, the system must make it possible to return to the exact point where the error occurred and identify it. If the problem is in distribution, the system should show how the product was stored, where, and at what temperature.

If the problem is in the lot, the system should be able to identify what went wrong and at what time, whether it was the raw material or inadequate training of the professional responsible, for example.
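
As an illustration, locating the point of failure can be as simple as walking the batch's event history in chronological order; the event structure below is an assumption made for the example:

def first_deviation(events: list):
    # Return the earliest event flagged as a deviation, or None if the batch is clean
    for ev in sorted(events, key=lambda e: e["timestamp"]):
        if ev.get("status") == "DEVIATION":
            return ev
    return None

events = [
    {"timestamp": "2024-05-02T09:40", "phase": "granulation", "status": "OK"},
    {"timestamp": "2024-05-02T10:05", "phase": "drying", "status": "DEVIATION",
     "cause": "raw material out of specification"},
]
print(first_deviation(events))  # the drying-phase event, with its time and cause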


Back to the real world

If possible, all data generated in production should be entered and accessed over the network, electronically, minimizing manual insertion of data. The problem, in this case, may be the number of devices installed in the plant, which in some cases can reach the hundreds.

This is an action that requires large investments and happens in very few cases. In some cases, the systems in use are old and have no ports for a networked or even wireless connection. In others, vendors do not commercially allow their devices to be accessed by customers for data extraction. Other providers do not offer connection ports on their devices, and there are still those that allow the connection but use different communication protocols, which makes capturing this data very difficult.

So what is the solution to these problems?

The first action is to split the project into smaller parts and start with a pilot, usually in a specific area of the plant that is not critical to the overall production process, and build the pilot project there.

Scalable software is then installed, capable of connecting different protocols and hardware to a database, not necessarily on a single server. It will probably be necessary to design redundancy, automatic backup, and some method of covering the entire system at all times, so that no data is lost in case of failure.
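
As an illustration of the automatic backup part, the sketch below uses sqlite3's online backup API as a stand-in for whatever database the plant actually runs; paths and file naming are illustrative:

import sqlite3
import time

def backup_repository(src_path: str = "central_repository.db") -> str:
    # Copy the live database to a timestamped backup file and return its path
    dest_path = f"backup_{time.strftime('%Y%m%d_%H%M%S')}.db"
    src = sqlite3.connect(src_path)
    dest = sqlite3.connect(dest_path)
    src.backup(dest)  # consistent copy even while the source is in use
    dest.close()
    src.close()
    return dest_path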

Even so, extra hardware may be required, because the PLC or controller available in the plant may be too old to reach a safe position in relation to the data generated in production. This may appear to require large investments, but it is certainly less costly than the risks of non-conformity in production.

Security first

The data security project should never cease to be a priority. From the first day of implementing a system, as part of the risk analysis, security should always be considered. Implementing a security strategy in a later phase of system development may mean having to go back to the beginning of the project, redo the strategy, and rethink the system engineering.

Because some parts of the system do not run antivirus software, additional actions are required to protect the database, such as a properly functioning login directory, removal of USB ports and other inputs that allow connection to the database, and keeping the 'public' connection to the server (where 'public' means within the IT infrastructure) separate from the connection to the system coming from the manufacturing plant.

Cloud Computing and Computer Systems

Currently, there is great investment in research and training in systems that operate with Cloud Computing. The FDA itself has demonstrated the importance the subject has been gaining by recently offering training on how to work with SaaS (Software as a Service) in a systems validation environment.

This is a very important point for computer system vendors, because it is the latest technology for providing solutions to customers. Until recently, it was impossible to imagine viewing or controlling an industrial alarm system from a smartphone; today, data is entered, reports are read, and certificates are viewed on a cell phone. Technology and new ways of working require the automation industry to be prepared for changes in the development of systems based on Cloud Computing, and this is the next great challenge.


System in full condition

What else should be considered for a system to be in agreement with the regulatory bodies from the point of view of Data Integrity?

Audit Trails!

Audit Trails need to be present in the system in a secure, accurate, and real-time way, whether the system is electronic, paper-based, or hybrid.
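
One common technique for keeping an electronic audit trail tamper-evident is to chain each entry to the hash of the previous one, so that any later alteration breaks the chain. A minimal sketch, with an illustrative entry layout:

import hashlib
import json
import time

def append_entry(trail: list, user: str, action: str, old, new) -> None:
    # Each entry records who did what and when, plus the hash of the previous entry
    prev_hash = trail[-1]["hash"] if trail else "0" * 64
    entry = {"timestamp": time.time(), "user": user, "action": action,
             "old_value": old, "new_value": new, "prev_hash": prev_hash}
    entry["hash"] = hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()
    trail.append(entry)

def verify(trail: list) -> bool:
    # Recompute every hash and check the chain links; any edit makes this return False
    for i, entry in enumerate(trail):
        body = {k: v for k, v in entry.items() if k != "hash"}
        if entry["hash"] != hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest():
            return False
        if i and entry["prev_hash"] != trail[i - 1]["hash"]:
            return False
    return True

trail = []
append_entry(trail, "jsilva", "UPDATE setpoint", 70.0, 72.5)
print(verify(trail))  # True while the trail is intact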

When a new system is built, the presence of IT professionals as well as Quality Assurance professionals is highly recommended, at the risk of having to redesign the entire project and creating risks for the company and for patients' health.

Minimizing the manual insertion of data, or making that action safer, ensures that the data is complete, since all critical data in the control systems, calculated data, metadata, and all corresponding information are kept in the history.

The data is also consistent, since it refers to the correct batch and remains accurate during its life cycle, having been inserted electronically, directly into the system, without the possibility of being accessed and changed.

Under these conditions, the best scenario for data integrity in automated systems is likely to have been achieved.
