Abraham Wald and the importance of failure
I first heard of Abraham Wald from a friend about 20 years ago. My friend was pursuing a degree in advanced mathematics at the time and told me about this almost-obscure American mathematician, who made a great contribution to making planes safer during WW2 and then, tragically, died in a plane crash in Southern India in 1950. While the irony of the story was not lost on me, what piqued my interest was the story of what Wald had done a few years earlier.
1943, Columbia University, USA: Wald had come a long way, from being born into a poor Jewish family in the erstwhile Austro-Hungarian empire in 1902, to receiving his PhD in mathematics from the University of Vienna, to finally being driven out of the country by the rising wave of antisemitism in 1930s Austria. To aid the Allies' war effort, Wald found himself in the Statistical Research Group (SRG), an academic group at Columbia University tasked with addressing military problems. The SRG had been commissioned by the US Air Force to determine which parts of a plane needed to be armoured. You see, the Allies were losing a lot of bombers to the German air defences. The Air Force wanted to know where, and how much, armour was needed to safeguard a plane while keeping it light and manoeuvrable.
The sample data that the SRG team had to work with was collected from the planes that survived German anti-aircraft fire. Those planes showed that most of the bullet holes were in the fuselage and rarely near the engine. The Air Force officers were in favour of armouring the areas where the bullet holes were present, the rationale being that more bullet holes indicated where the enemy guns were hitting. But Wald saw something that the others had missed. He, or so the story goes, examined the assumption hidden in that reasoning, namely that more bullet holes translate into more weakness in the structure, and questioned whether the assumption itself might be wrong. He pointed out that the Achilles heel of the bomber was its engine. A bullet to the engine usually brought the plane down, while a bullet to the fuselage still allowed the plane to return to its base. Hence the sample, drawn only from surviving planes, showed more bullet holes in the fuselage than near the engine. Wald recommended that armour be added around the engine; the recommendation was accepted, and it apparently saved hundreds of bombers from being shot down.
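The selection effect Wald spotted is easy to reproduce numerically. The sketch below is a toy simulation, not a reconstruction of the SRG's actual data: all the numbers (hit locations split evenly, assumed survival rates of 15% for engine hits and 90% for fuselage hits) are illustrative assumptions. Even with enemy fire distributed evenly, the planes that make it home show far more fuselage holes than engine holes.

```python
import random

random.seed(42)

N = 10_000  # sorties simulated
fuselage_holes_seen = 0  # holes counted on returning planes only
engine_holes_seen = 0

for _ in range(N):
    # Assumption: fire is roughly uniform, so engine and fuselage
    # are equally likely to be hit.
    hit_engine = random.random() < 0.5
    # Assumed (illustrative) survival rates: engine hits are usually
    # fatal, fuselage hits rarely are.
    survives = random.random() < (0.15 if hit_engine else 0.90)
    if survives:  # we can only inspect planes that return to base
        if hit_engine:
            engine_holes_seen += 1
        else:
            fuselage_holes_seen += 1

# The observed sample is heavily skewed toward fuselage damage,
# even though the underlying fire was uniform.
print(fuselage_holes_seen, engine_holes_seen)
```

Conditioning on survival, not the distribution of enemy fire, produces the skew; that is the whole of survivorship bias in miniature.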
Whether this story is overly dramatized is open to debate; maybe Christopher Nolan should make a movie on Wald too. But it did bring to light a type of cognitive/statistical bias called “survivorship bias”. It is a selection bias in which we over-analyze success while discarding “failure”. Yet it is the lessons from “failure” that set up the path to success. In our corporate and financial world, we glamorize success, and for good reasons. But we also need to embrace failure and learn from it. SpaceX celebrated the first time its rocket managed to leave the ground, even though it exploded a minute later in a fiery ball of flame and was portrayed as a failure in the popular media.
Survivorship bias is particularly pertinent in the field of process discovery/process mining and process modelling. We build processes by looking at the majority of “happy path” scenarios and fine-tune them further to achieve the optimal outcome, while handling exception paths as aberrations. Maybe we need to acknowledge the survivorship bias at play there and start looking at the “unhappy” flows a bit differently?