3.6 Roentgen? Not great, not terrible

Most of you will have watched the HBO series "Chernobyl", a factually dubious but powerful telling of the worst nuclear disaster the world has ever seen. Those of you who know me well will know my connection to the disaster and why it matters so much to me.

That's not the point of this article, though. I want to talk about two issues that unfolded after 01:23 am on April 26th, 1986, the moment reactor 4 exploded.

1. Facts form the foundation of response

There were disaster plans and procedures at Chernobyl. The Soviets had opted for a "slimmed down" safety model for the RBMK reactors, not housing them in containment units and using graphite tips on the control rods (the reactor's brakes), amongst other things. But they did have plans. Detailed ones.

Clearly, if you were in charge of a nuclear power plant you would have a set of response protocols graded by the severity of any radiation leak. On the night of the explosion, several plant operators were sent into the building housing reactor 4 to measure the radiation levels. This was the obvious response. The operators grabbed their Geiger counters and in they went. It just so happened that those counters maxed out at 3.6 roentgens per hour. As the operators took their readings, the machines returned that maximum: 3.6. Perhaps in shock and panic and not thinking clearly, or perhaps through a lack of training, they reported back to the central committee (a sort of Soviet crisis management team) that the reading was 3.6 roentgens per hour.

Had it been true, that would not have been a horrific reading. The central committee, forgivably, took it as fact. They then activated the plans for small to moderate radiation leaks. They sent more staff into the building and limited each shift to a few hours, thinking they were limiting the risk. That would indeed have been the correct response to a leak of 3.6 roentgens per hour.

The issue was that the counters had maxed out. The actual level was around 20,000 roentgens per hour. Much, much higher. The response, which ran for hours, would have been totally different had the facts been correct, or checked.
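The same failure mode exists in any monitoring pipeline: a reading at an instrument's ceiling is a lower bound, not a measurement. A minimal sketch of this idea (the function names are illustrative; the 3.6 R/h ceiling and ~20,000 R/h estimate are the figures from the story above):

```python
def interpret_reading(value, instrument_max):
    """Treat a reading at the instrument's ceiling as a lower
    bound on the true value, not as a trusted measurement."""
    if value >= instrument_max:
        return {"at_least": value, "saturated": True}
    return {"value": value, "saturated": False}

def cumulative_dose(rate_r_per_h, hours):
    """Cumulative exposure in roentgens: dose = rate x time."""
    return rate_r_per_h * hours

# The Chernobyl counters maxed out at 3.6 R/h, so 3.6 was a
# floor, not a fact.
reading = interpret_reading(3.6, instrument_max=3.6)
print(reading["saturated"])  # True -> escalate; true rate unknown

# Why the distinction matters: a few-hour shift at the reported
# rate versus the actual rate.
print(cumulative_dose(3.6, 3))      # ~10.8 R: serious but survivable
print(cumulative_dose(20_000, 3))   # 60000 R: far beyond lethal
```

A saturated reading should route to the worst-case playbook, not the one matching the clipped number.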

2. Enter the bias

You may well be asking, "What about the personnel, firefighters and others who were clearly displaying symptoms of acute radiation sickness?"

When we, as humans, commit to a course of action we are highly likely to stick to it no matter what. In fact, consistency is often equated with honesty, and flip-flopping is seen as a sign of weakness.

So when they inevitably saw people displaying what would otherwise have been highly concerning symptoms, they dismissed it as a mere reaction to mild levels of radiation. They explained away the visible glow around the reactor (the Cherenkov effect) because "it can occur with very, very minimal radiation". They made these external warnings "fit the facts".

The "facts", or so they thought, had been established, and their response was now set in stone.

Great, what does this have to do with cyber?

It is a dramatic example of how vital it is to establish the facts during an incident. Our response will vary depending on those facts. It also highlights the importance of checking them. In a crisis, and at Chernobyl in particular, you can imagine the fear, panic and highly emotional state people would have been in. Emotional responses, as Christopher Hadnagy notes throughout his books, impair our cognitive processes and lead us to make mistakes.

Finally, it is important to keep an eye on the bigger picture in a crisis. Try to become personally comfortable with changing your mind on an issue you once appeared decided on. It is a far tougher task than you may think, but it serves a useful purpose, especially in a crisis.

Kristian Hawkes CMIIA CIA

A Unique Perspective on Risk Assurance, Personal/Professional Development and Mental/Financial Health | Chartered Internal Auditor | Blogger | Investor | Mentor | Social Mobility Advocate | #Kaizen | #IA4.0

3y

Great observation. Social media is a prime example - everyone reacting, few interrogating the facts!

Mohammad Reza ARDEHI

#knowledge_management #Precommissioning #Commissioning #startup #Maintenance #Oil&gas #powerplant #onshore #offshore

3y

An unbalanced mind, a selfish man who mistakes it for self-confidence, a savage person. Sometimes it should be said that a manager or leader needs interpersonal knowledge more than technical knowledge.

Simon Legg

Chief Information Security Officer (CISO) at Hastings Direct

3y

Top post! And then a blatant promotion of a complementary post (normal Legg stuff... a little read): https://www.dhirubhai.net/pulse/power-surge-cyber-risksblahh-blahh-simon-legg/ Learning from crisis and being alive to its emotions is always relevant. Great point Lisa Forte, well made.

Franciszek Ksawery PELLETIER

International Affairs / Diplomacy / National and International Security

3y

Emotions can be triggered by an attacker to provoke an irrational response from the target. But if well managed, they can also be used by defenders to react accordingly, and even to counter threats. What do you think?
