Scared Safe: The Importance of Human Error when Evaluating Research Operations for Safety

Hazard analysis and risk assessment (HARA) of research operations often does not adequately consider the potential for human error to create a hazard. The problem is that humans can do anything wrong, in any way imaginable (and some not readily imaginable), at any time. Unlike equipment, which has common failure modes, human failure modes are almost totally unpredictable and therefore difficult to determine with any certainty.

Consider these brief examples from my 46 years of experience and decide whether they raise your concern level. Please note that I have used the generic term “an operator” even though in some cases it was a degreed scientist or an experienced technician.

· An operator forgot to reinstall a high-temperature thermocouple after cleaning a reactor. As a result, the unit overheated, burst its rupture disk, and blew hydrocarbon vapor and liquid outside the hood, where it ignited and deflagrated. No one was injured as, providentially, the laboratory was empty at the time.

· An operator was using a mechanical pelletizer to make catalyst pellets. Noting that one was slightly misaligned, he tried to reseat it by hand while the unit was in operation. The device punctured his finger, and the operator had to wait hours for emergency personnel to disassemble the equipment to free it.

· An operator capped a vent line to generate enough back pressure to take a small, one-time gas sample. The pressure build-up caused the glass knockout to explode and broke the shield around the vessel. The operator was not injured, luckily being away from the hood at the time of the failure.

· An operator inadvertently used a stronger acid than intended to remove residue from a reactor. After pouring in the acid and stirring, he left the immediate area to answer a phone. An unexpected reaction detonated the reactor and blew off its head.

· An operator noticed over time that they had to open the high-pressure reactor’s manual vent valve further and further each run, but thought nothing of it. One day the unit would not vent and, despite an emergency shutdown, began to overpressurize: the vent line was totally clogged with polymer residue from the too-rapid manual venting. Thankfully, the reactor, although overpressurized and permanently damaged, did not fail before the unit cooled.

· An operator impatiently and repeatedly re-initiated a complex automatic sampling system, restarting the sequence with many of the valves in the wrong position. Hot, viscous liquid poured out of the reactor and blew apart a sample container and its shield. The operator was fortunately not injured. The HARA had never considered what would happen if the operator started the sequence in the middle.

· An operator poured wastes into a waste container and noted fumes pouring out. The operator capped the bottle and walked away to allow the fumes to dissipate in the hood before resuming the operation. The bottle exploded moments later, resulting in a fire that damaged the hood and adjacent parts of the laboratory.

· An operator removed some tubing from around a pilot plant compressor to clean some components. The operator reinstalled the tubing in what was thought to be the same configuration but did it incorrectly, inadvertently allowing a high-pressure backup cylinder to feed the compressor instead of the low-pressure house system. The resulting overpressure caused the compressor to fail promptly and begin leaking. The operator was able to shut down the unit before ignition.

· An operator leak-tested a reactor before heating it. Before reaching the desired final temperature, the reactor started to leak and a fire resulted. A review indicated the operator was using a smaller reactor, and the leak-test procedure, designed for a much larger reactor, was not stringent enough to identify the problem.

· An operator replaced a leaking plastic fitting on an air line with what were thought to be the same parts, found in a drawer. The components were from different vendors and not designed to work together. The fitting failed as soon as the line was re-pressurized, with the plastic tubing striking the operator’s safety glasses and cracking one lens.

· An operator removed the insulation around the control and alarm thermocouples to fix a fitting leak. The insulation was replaced poorly, the line overheated, and a gasket failed in a properly insulated part of the line, resulting in a fire.

One thing almost all these accidents have in common is human error. Were these incidents due to not following procedures, not using a management-of-change (MOC) procedure properly, lack of proper training, inattention, distraction, or rushing? Yes, but I believe that all were mostly due to the operators being human. A committed manager, dedicated safety professional, conscientious supervisor, or careful worker can honestly and adamantly proclaim whatever safety declaration they want to espouse, but they cannot prevent people from making mistakes. The first-in-class safety procedure fails when a person reads it wrong or selects the wrong valve. The most detailed checklist fails to prevent a problem when someone answers a question wrong or fails to see an incipient problem. The most comprehensive and thoroughly tested interlock system may not prevent an incident when an operator does something no one ever envisioned.

Does this mean that to prevent accidents we need to replace people with robots? Not really, since the robots would be designed by people and would probably also have accidents, just for different reasons. I think we need to pay more attention to identifying, addressing, and minimizing human error. Here are some thoughts that might help in that effort.

People easily get bored and complacent doing the same thing again and again. Hence, a very detailed checklist that must be filled out each time a more hazardous operation is performed may stop being effective over time. The first time you use the instrument or do the task, you read the checklist carefully; the twentieth or hundredth time, you do it from memory, or at least the way you remember it. And my experience says that once these procedures become rote, they will unfailingly start to be modified, shortened, “improved,” or customized, often with little if any review, so they are no longer exactly what was envisioned. This suggests that either the procedures need to be changed over time to relieve the complacency, or the operators need to be rotated so that they are learning, and doing, new procedures.

Job safety analyses, field safety reviews, and standard work permits are incredibly useful. However, when they cover the same task over and over again, the actual analysis they receive is often cursory and slipshod. Why? Because people see the same task with the same hazards and the same safeguards they saw yesterday or last week, and they become complacent. When was the last time you tugged on your seat belt to make sure it was holding? When was the last time you checked that your smoke alarm was working? Humans tend to assume things will perform as they last experienced. So the chances of these last-minute reviews identifying something uniquely different are usually much less than we plan on in the HARA. Hence, I think we need to look at other ways to really capture those items that are different enough to be an issue and to develop effective safeguards.

What portions of a task are most “desirable” to skip? Humans are inherently lazy. That is not a bad thing, as a colleague of mine never tired of reminding me, since many great ideas come from someone trying to avoid doing any more work than is necessary. I sometimes think being lazy is a survival trait: the less energy we expend on a task, the more we have available for something else. So, when a procedure calls for someone to walk across a room to confirm a valve is shut before starting a flow, for the system to stabilize for an hour before starting the next step in the process, or for the system to be purged 10 times before opening, humans will tend to forget, skip, or shortchange that step over time. If the step is safety critical, then you need some way to ensure it is done properly every time. Automating the purging, the valve shutoff, or the programmed delay time is probably prudent (see the sketch below). Conversely, in other cases, experience may indicate the delay time is excessive or no longer required, the purge count can safely be reduced, or the valve really does not need to be closed; then the procedure should be modified after review. Too often I find that this review is never performed because it is too “onerous” or “time consuming.”
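As a concrete illustration, here is a minimal sketch of what automating those safety-critical steps might look like. This is not any real control-system API: the valve callables, cycle count, and hold times are hypothetical placeholders that would have to come from the actual procedure and hardware, and a real installation would sit behind proper interlocks.

```python
import time

# Illustrative values only; the real numbers come from the written procedure.
PURGE_CYCLES = 10        # required number of purge cycles before opening
PURGE_HOLD_S = 30        # hold time per purge cycle, in seconds
STABILIZE_S = 3600       # the one-hour stabilization delay

def purge_and_stabilize(open_purge_valve, close_purge_valve, feed_valve_closed):
    """Enforce the purge count and delay so neither can be skipped or shortened.

    The three arguments are hypothetical callables supplied by the control
    system: two valve actuators and one valve-position check.
    """
    if not feed_valve_closed():
        # The walk-across-the-room check, performed by the system every time.
        raise RuntimeError("Feed valve not confirmed closed; sequence aborted.")
    for _ in range(PURGE_CYCLES):
        open_purge_valve()
        time.sleep(PURGE_HOLD_S)   # each purge cycle gets its full hold time
        close_purge_valve()
    time.sleep(STABILIZE_S)        # the enforced delay before the next step
```

The point is not the code itself but where the discipline lives: once the count and the delay are in the sequence rather than in the operator’s memory, boredom and impatience can no longer shorten them.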

Some tasks are so complicated that they are much more likely to be done wrong. A ten-stage sampling sequence fairly begs for an operator to be momentarily distracted and forget a step or do one in the wrong order. A multiple draining and purging operation is often just one step away from a release if performed in the wrong order or too quickly. Again, automating these steps may be prudent; one possible approach is sketched below.
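A simple sequencer that can only be started from a known idle state addresses both the ordering problem and the mid-sequence restart that caused the sampling incident described earlier. The structure here is an assumption for illustration; in practice, hardwired interlocks and valve-position feedback would back up anything done in software.

```python
class SamplingSequencer:
    """Runs an ordered list of steps exactly once, in order.

    A re-initiation request while a run is in progress is rejected rather
    than restarting the sequence with valves in mid-sequence positions.
    """

    def __init__(self, steps):
        self.steps = steps        # ordered list of (name, action) pairs
        self.running = False

    def start(self):
        if self.running:
            # The impatient second button press lands here, harmlessly.
            raise RuntimeError("Sequence already in progress; request refused.")
        self.running = True
        try:
            for name, action in self.steps:
                action()          # each step runs once, in the defined order
        finally:
            self.running = False  # only a finished or aborted run can restart
```

Software alone is not a sufficient safeguard, but even this much removes the specific failure mode of a sequence restarted from the middle.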

What tasks require an operator’s undivided attention? As any new parent quickly realizes, paying complete attention for any length of time, or at any high frequency, is difficult. Asking an operator to fill a small sample container that can easily overfill after 5 seconds of inattention is an accident waiting to happen. Telling an operator to be sure there is no forklift traffic in a busy area before moving a heavy drum or pallet of cylinders almost ensures that one day they will be too focused on a difficult task to pay enough attention. Perhaps larger sample cylinders (with more time before overflow) or an automatic or timed flow shutoff would be more prudent (a sketch follows below). Perhaps making the movement a two-person operation, so one person can focus on the task while the other focuses on the environment, is prudent. Distractions from emails, cell phones, radios, and nearby personnel are routine in research, yet most original hazard analyses and risk assessments ignore them completely. Worse still is a boring operation that is unlikely to routinely have a problem. How many times has an incident occurred because the operator only “looked away for a moment” that was probably several minutes long, or happened much more often than admitted? People get bored easily, and their attention drifts.
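For the timed shutoff, even a simple watchdog timer that closes the fill valve regardless of where the operator’s attention has wandered removes the overfill failure mode. The valve callables below are hypothetical, and the timeout would be set from the known fill rate and container volume.

```python
import threading

def fill_with_watchdog(open_fill_valve, close_fill_valve, max_fill_seconds):
    """Open the fill valve, but guarantee closure after max_fill_seconds.

    The timer fires whether or not the operator is still watching; an
    attentive operator can still close the valve earlier by hand.
    """
    watchdog = threading.Timer(max_fill_seconds, close_fill_valve)
    watchdog.start()        # arm the shutoff before any liquid flows
    open_fill_valve()
    return watchdog         # caller can cancel it after a normal manual close
```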

So, I encourage you to consider human error more carefully in your HARAs. Cold-eyes reviews and outside safety audits can often help identify areas of concern, but they can never eliminate the hazard altogether. (And my thanks to Bruce Bullough for the idea for the title of this article!)

Catherine Peltier

Health & Safety | Environment | Professional Engineer (non practising) | MBA

3y

Excellent and thought-provoking article. Sharing these examples widely helps us turn more "unknown-unknowns" to "known-unknowns" in risk assessment.

Alexandre Dunlop-Brière

Chemical innovation expert with key know-how on: external partnership negotiations, customer and supplier management, lab product development, formulation, scale-up, renewable feedstock, catalysis and polymers.

3y

Thanks for the read!
