Survivorship bias - or, how not to protect a B-17 Flying Fortress

I don't know about you, but I spent quite a bit of my Easter fighting in 1940 Western Europe. My teenage daughter, Zoe, playing the Axis powers, made quick work of France. England was standing alone as the German navy massed in the Channel. But even as all seemed lost, with Operation Sealion underway, the Royal Air Force swung into action. In a single turn the German battleship and cruisers were sunk, and a valuable life lesson on the importance of maintaining air superiority was instilled in my teenager's mind. (As valuable a life lesson, in my humble opinion, as never getting involved in a land war in Asia.)

(Above image: The current state of the battle - England stands alone; the Soviet Union and the US are still neutral.)

When you think of air power in the Second World War, there's at least one name worth remembering. Not a pilot or a commander, but a statistician: Abraham Wald. In a series of eight memoranda written while working at the Statistical Research Group, which did applied mathematics for the US military (yes, the military had an applied mathematics department), Wald worked out the secret to placing armour on bombers in a way that saved countless lives.

Here's the problem Wald was confronted with. The commanders wanted to place armour on their bombers, but clearly couldn't put it everywhere, because armour is heavy. So where is it best to armour a plane to maximise its chance of survival? Wald looked at all the planes that returned from missions and saw a pattern of bullet holes like this:

(Above image: Distribution of bullet holes in aircraft that returned to base after missions. Sketch by Wald, reproduced in Howard Wainer, "Visual Revelations", Lawrence Erlbaum Associates, 1997.)

So, where should you put the armour? Have a think before reading on.

The commanders saw it clearly. Put the armour where the most bullet holes are. That's where the planes are getting shot the most.

And, of course, that would have been a complete disaster. Wald showed that actually, you should put the armour where the bullet holes aren't.

Why? Well, the commanders had fallen for the classic fallacy of survivorship bias. They were only examining the aircraft that made it back to base. The survivors. The missing aircraft, and the locations of their bullet holes, were never seen by the commanders, and therefore never taken into account. Wald showed it was odds-on that those missing aircraft had holes in very different places, on average, from the surviving aircraft.

In short, what Wald's diagram showed was the places an aircraft could take hits and still get home. These were the places you didn't have to put armour on. The exact opposite of what the top brass wanted to do.
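
To make the bias concrete, here's a minimal simulation sketch in Python (not from the original story; the zone names and loss probabilities are made up purely for illustration). Planes take hits in random zones, hits to some zones are more often fatal, and we then tally bullet holes only on the aircraft that make it home:

import random

# Hypothetical zones and per-hit loss probabilities.
# These numbers are invented for illustration, not Wald's data.
ZONES = ["engine", "cockpit", "fuselage", "wings", "tail"]
LETHALITY = {"engine": 0.6, "cockpit": 0.5, "fuselage": 0.1, "wings": 0.05, "tail": 0.05}

random.seed(1)
survivor_holes = {z: 0 for z in ZONES}
lost_holes = {z: 0 for z in ZONES}

for _ in range(100_000):  # each iteration is one sortie
    hits = [random.choice(ZONES) for _ in range(random.randint(0, 5))]
    survived = all(random.random() > LETHALITY[z] for z in hits)
    tally = survivor_holes if survived else lost_holes
    for z in hits:
        tally[z] += 1

print("Holes on aircraft that returned:       ", survivor_holes)
print("Holes on aircraft that never came back:", lost_holes)

Even though every zone is hit equally often, the returning aircraft show far fewer engine and cockpit holes than the lost ones, because those hits tend to bring the plane down. Armouring where the survivors' holes are thickest would protect exactly the wrong places.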

The essence of survivorship bias is that you often don't see the failures. In business, in life, and in war. And sometimes it's the failures that have the most important lessons. Like the planes that didn't make it back. Wald's reasoning went on to save lives not only in World War Two, but also in Korea and Vietnam.

You can read a technical account of Wald's statistical reasoning here, and there's a great telling of this story in David McRaney's fantastic "You Are Not So Smart" podcast here.

As always, if you want to receive these weekly posts as an email, drop me a line at [email protected]. I'd also love to hear where you've seen survivorship bias in action in life or business. When you start looking for it, it's everywhere.
