Lessons from Three Mile Island

What can safety-critical industries learn from the Three Mile Island incident?

If you work in the mining, oil, gas, chemicals, railway, aviation or other safety-critical industries, you may wonder what you can learn from an incident that occurred at a nuclear power plant several decades ago.

We’ve recently passed the 45th anniversary of this event in Pennsylvania, USA. In essence, following the tripping of a turbine, operators inadvertently took action that led to the loss of cooling water in Reactor 2 and the partial uncovering of the reactor core.

I’ve summarised hundreds of pages of information in that one paragraph, and I apologise for the lack of detail, but the aim of this article is to focus on the lessons - the full details of what happened can be found elsewhere.

There are many references to "errors" by control room personnel in the incident reports.

However, operators were misled by false readings from several items of equipment whilst attempting to address a large number of alarms. Operators had to consider several data points to infer the status of some key equipment – they were not presented with the information that they needed in a timely manner, or in an easy-to-read format. For example, the key indicator of coolant level in the reactor was not directly available; the level had to be inferred from other instruments.

There was no alarm prioritisation, with an estimated 100 to 200 alarms sounding in the first few minutes. Operators were required to use their training to determine the importance of these alarms. The alarm printer could not keep up with the speed at which alarms were coming in (remember, this was 1979), and as the incident progressed, the backlog on this printer was several hours behind the alarms sounding. And so the alarm printer was not useful as a diagnostic aid.

The design of the displays and alarms in the control room did not set the operators up for success. Compounding this poor design, the training of operators and supervisors was insufficient.

The investigation focused on operator error in the control room. However, there were wider failures, such as the failure to learn from the operating experience of reactors elsewhere.

Substitute a different process for the one the operators were controlling and this incident could clearly have happened in other industries. The Cullen report into Piper Alpha echoes some of the findings from Three Mile Island, such as the need for more effective regulatory control and formal safety assessment.

Key lessons:

  1. When discussing people, we need to consider management behaviours and management systems as well as operator error.
  2. We cannot assume that operator intervention will be beneficial in an emergency if operators are not properly supported.
  3. People may respond differently from how they are expected to when under the stress of an abnormal situation.
  4. Training should include knowledge-based behaviours, as we cannot rely on rule-based behaviours, particularly in complex situations.

I organised a human factors conference many years ago, and invited a control room operator from Three Mile Island to speak. Admittedly, it was partly for selfish reasons – I wanted to hear directly from someone who was present at the time of the incident. His talk was humbling.

I feel saddened that many people still believe that the cause of this disaster was “human error”. What I understand from reading the reports is this: control room operators intervened, doing what they thought was right, with the limited information they had, under the most difficult of conditions.

Had things turned out differently and, supported by better designs, they had "saved the day", these same people would be considered heroes.

Martin Anderson is a Principal Consultant, working at the intersection of human factors and process safety.
