Self-driving cars – just blame the victim
Ted Prince
CEO performance prediction, family succession, behaviorally-based investment, behavioral ratings for CEOs, company founder, thought leader, judge for Harvard Innovation Labs
So now several people have been killed in accidents in self-driving cars. What gives?
I’ve been driving for many years but I don’t regard myself as a good driver. For one thing, I've got poor eyesight. I get easily confused at night and sometimes in heavy traffic conditions. Here are three of the driving situations in which I feel most uncomfortable:
· Dark rainy night with lots of reflected lights from cars and neon signs
· Driving directly into sun at sunrise and sunset
· Driving on snowed-over road in whiteout conditions
I’m sure you’re a lot better than me, but nonetheless there are probably many drivers out there just like me.
I wonder how technology is going to make sure self-driving cars cope well in these conditions? I’m not sure how, but then I’m not an engineer and I know nothing about the tolerance, sensitivity and ability for fine distinctions in the relevant technologies. But so far that hasn’t stopped several people from dying.
The consensus out there seems to be that we will get it right shortly. I’m not so sure. I don’t think you can trust the auto and tech companies to give an honest opinion, for obvious reasons. They’ve got too much skin in the game.
I think the common perception is that if we can make planes self-flying, then we can do it with cars. But planes aren’t self-flying; they usually have at least two people in the cockpit, highly trained and regulated ones at that. And the situation in the air is not remotely as complicated as the situation on the ground in cities with lots of traffic, pedestrians, lights, obstacles and so on.
I do not think the technology is there yet, and I think it will have to go a lot further before it ever is, if ever. But I don’t think that even the technology is the main problem right now.
The main problem, IMHO, is human factors. You know humans: us, with our complex and often counter-intuitive reactions to technology.
The self-driving companies tell us that the accidents have happened because the drivers weren’t prepared and didn’t have their hands on the wheel. But isn’t that what most people would do in a self-driving car, that is, not have their hands on the wheel? And if something happens suddenly, would they ever have the time to put their hands on the wheel, twist it appropriately and perform other maneuvers, all in the blink of an eye? And if they can’t do that, isn’t that what the vendor is supposed to figure out?
It’s called human factors and it looks suspiciously like the self-driving car companies haven’t figured that out yet. Maybe because they are all engineers and not psychologists or whoever else you need to figure this out. If you can’t figure out the human factors, you’re not even at first base.
Most people seem to be confident that in the end the self-driving car companies will get the technology right. But both they and John Q. Public have drastically under-estimated the importance of the human factors issue. We may be years away from figuring that one out even if we fix the technology issue.
Right now if a “driver” can’t take over these emergency maneuvers, it’s said that they are at fault. But isn’t it unrealistic to expect a normal person to react in this way? Aren’t we blaming the victim?
And it’s not just emergency situations. What do you do about difficult but not emergency situations when the “driver” is playing a game on his smartphone? Or talking business to another colleague in the car while simultaneously looking at a spreadsheet?
What about a driver who is a suicide risk? Some people might say there’s nothing you can do about someone who wants to commit suicide by car. But why not? If I’m in the car with a wannabe suicide, I’d sure want to be assured that the vendor has my back. If not, sayonara self-driving car.
Now think about this. What if next year we’re going to have millions of self-driving cars on the roads? You know, those self-driving Jaguars from Waymo, not to mention from GM and Uber. How many fatal accidents would occur? Right now we’ve had a few fatalities with maybe just a few hundred self-driving cars on the road. How many fatalities if, say, we suddenly get 100 million such cars on the road?
And the legal and insurance issues haven’t been touched yet. Who is liable in the case of an accident with a self-driving car? The “driver” who was driving, the self-driving tech vendors, or the car company? What approach will the insurers take? What about accidents with multiple cars, some being self-driven, some not? And so on and on.
There’s been a lot of brouhaha out there about self-driving cars. I was one of the early believers too. But right now I think the hype has way out-run reality. The boosters have let enthusiasm overcome good sense. Time for the regulators to do their job properly instead of cozying up to the lobbyists.
The self-driving car meme has become too real. We are steering our way into a huge social problem. Time to get our hands back on the wheel before we have an even bigger accident on our collective hands.