Cognitive Diversity and Psychological Safety Lessons from Space

Written by Candace Reading

On 1st February 2003, the Space Shuttle Columbia tragically disintegrated upon re-entry into the Earth’s atmosphere, resulting in the loss of all seven crew members. This catastrophe, later known as the Columbia disaster, was a pivotal moment in the history of space exploration. Beyond the technical and operational failures, a deeper issue related to psychological safety and homogeneous thinking emerged as a crucial lesson for all organisations.

What Went Wrong?

The disaster was triggered by damage sustained during launch when a piece of foam insulation struck the left wing of the shuttle. This damage allowed hot atmospheric gases to penetrate the wing upon re-entry, leading to the shuttle's destruction. Despite being aware of the foam strike, NASA managers dismissed engineers’ concerns about potential damage.

An Absence of Psychological Safety

In the aftermath, the Columbia Accident Investigation Board (CAIB) found that a culture lacking psychological safety was a significant factor in the disaster. The report identified a “serious flaw” in NASA’s decision-making processes. Engineers who voiced concerns were not taken seriously, and dissenting opinions were routinely suppressed. Employees felt intimidated and feared repercussions for speaking up, which stifled crucial communication and problem-solving.

The culture of innovation on which NASA was founded had become constrained by funding cuts and schedule pressures, which created an aversion to failure.

Suppression of Dissent

At the time, NASA’s organisational culture did not adequately value incorporating different perspectives, favouring a more bureaucratic style of leadership and communication. Engineers who raised concerns about potential damage were ignored and discouraged from pursuing their lines of inquiry.

After the foam strike was identified in launch footage, engineers at NASA's Langley Research Center expressed concerns about the potential severity of the damage.

One engineer, Rodney Rocha, suggested that the crew perform a spacewalk to inspect the damage, or that the shuttle's robotic arm be used to examine the wing. Additionally, there was a proposal to have the astronauts use cameras to inspect the affected area from the cabin. Despite these suggestions, NASA management decided against taking any of these actions (AIChE; Texas A&M Today).

The report into the cause of the disaster also cites several instances in which Marshall Space Flight Center management pressured and influenced subcontractors to approve launch decisions, despite concerns raised by engineers.

The Need for Cognitive Diversity

“You’re just as smart as I am, you have a whole different experience, you come from a different background, you have value to add…If I don’t listen to you, then what’s the use of you even being there? We need to take advantage of the whole team, bring people in.”

- Ronald Lee, former industrial engineering student and chief of NASA’s Office of Emergency Management at the Johnson Space Center

Would this have happened if decisions had been shared within a team representing people of different backgrounds?

NASA was originally founded on a culture of rigorous testing and scientific research. In-house technical ability had been instrumental to its success, but at the time of the Columbia disaster, engineers and astronauts were notably absent from upper management. The leadership team had become more fixated on budget constraints and schedule requirements than on flight safety (Guthrie & Shayo, 2005).

Nancy Currie-Gregg, professor of engineering practice, started her career with NASA in the 1980s and flew on four space missions, the last in 2002. As head of the Space Shuttle Program Safety and Mission Assurance office after the accident, she expressed regret at the lack of knowledge sharing with Russian cosmonauts.

“Throughout my flying career, even (American) astronauts thought 99% of the risk was in sitting on that pad in a fully fuelled rocket and during ascent. That’s called a cognitive bias because our only accidents had happened during those times. But if we were to ask our Russian colleagues, they would have had a completely different perspective because both of their accidents happened on entry. Perspectives were completely different.”

Cognitive diversity refers to the inclusion of different ways of thinking, perspectives, and problem-solving approaches within a team. In the case of Columbia, the absence of this diversity compounded the issues arising from the lack of psychological safety.

Cognitive Bias and Risk Normalisation

Previous missions had experienced foam strikes without catastrophic consequences, leading to a belief that such damage was not mission critical. This normalisation of risk led to downplaying the potential severity of the foam strike on Columbia (Texas A&M Today).

Even if the astronauts had observed the full extent of the damage, options were limited: the feasibility of both a rescue mission and on-orbit repairs was severely constrained by the technology and logistics of the time.

Had the engineers' concerns about the launch or the foam strike been thoroughly investigated and addressed, different actions might have followed, potentially preventing the disaster. We will never know, but perhaps this knowledge could have spared the astronauts' families the heartbreak of watching the spacecraft disintegrate in the sky instead of celebrating a safe homecoming.

The Dangers of Groupthink

Groupthink happens when the desire for harmony and conformity within a group results in irrational decision-making, as members avoid conflict to maintain a unified front.

Diverse perspectives, which might have challenged prevailing assumptions and led to different decisions, were notably absent. The lack of cognitive diversity and psychological safety were hard lessons for NASA following the Columbia disaster.

Along with improving its damage detection, in-orbit repair, and monitoring capabilities, NASA has strengthened its safety protocols and organisational culture since the CAIB report highlighted the need for better communication and a more robust safety culture.

NASA now places greater emphasis on listening to engineers' concerns and ensuring that all safety issues are addressed promptly. This includes fostering an environment where dissenting opinions are encouraged and valued, to prevent oversights caused by groupthink (Space.com).

A Lesson for Us All…

Organisations must not only promote psychological safety but also actively seek cognitive diversity. By valuing and incorporating a wide range of perspectives, organisations can enhance their problem-solving capabilities and resilience. Encouraging diverse thinking helps to avoid groupthink, promotes innovative solutions, and ensures that all potential risks are considered.

If you would like to learn more about how to Harness Cognitive Diversity to Achieve Psychological Safety, please join us on the 13th June 2024 at the CIPD Festival of Work for Dr Amanda Potter’s presentation on the EDI Insights Stage at 10:30 am.

If you’ve not yet seen the BBC Documentary “The Space Shuttle That Fell to Earth” you can watch it on the BBC iPlayer if you live in the UK.

References

BBC iPlayer - The Space Shuttle That Fell to Earth

Guthrie, R. & Shayo, C. (2005). The Columbia Disaster: Culture, Communication & Change. Journal of Cases on Information Technology, 7, 57-76. (researchgate.net)

Columbia Accident Investigation Board Report Excerpts | Space

Columbia Disaster: What happened, what NASA learned | Space

Twenty years after the Columbia disaster, a NASA official reflects on lessons learned (kcsm.org)

How The Columbia Shuttle Disaster Changed Space Travel - Texas A&M Today (tamu.edu)

