The Good, the Bad ... and Tesla and Uber
In "Willy Wonka and the Chocolate Factory" circa 1971, Gene Wilder plays a vaguely misanthropic Willy Wonka who leads the young winners of his golden wrapper contest on a tour of the seven deadly sins within his candy factory and labs. (Who can forget Augustus Gloop?) At one point, Mike Teavee, a television-obsessed pre-teen, is so enamored of Wonka's experimental Wonka-vision that he insists on being transmitted - despite a half-hearted warning (see above) from Wonka himself.
Mike is thrilled when the device "transmits" him into a faux TV set, but when he steps out it is clear to everyone but him - and to the shock and horror of his mother - that he has been shrunk, likely irreversibly. Mike's glee is undimmed. Wonka gives a shrug.
Something similar appears to be playing out at Tesla Motors as the company rolls out Autopilot 2.5 and Tesla owners take more liberties than ever with the system. The Tesla was the first production vehicle with advanced Level 2 automated driving capabilities such as lane changing and passing, and drivers have been taking liberties with Autopilot-equipped Teslas since day one - with fatal results for at least one of them.
Tesla's equivalent of Gene Wilder's half-hearted warning is the admonition that the driver must keep his or her hands on the wheel at all times and pay attention to the road ahead. It's no surprise that drivers continue to ignore these warnings (suggestions?).
The results can be interesting, like the drunk Tesla driver who claimed he wasn't actually driving, or the heart attack victim who claimed Autopilot got him to the hospital. The latest episode of the Tesla follies is the driver who put his feet out the window during an "Inside Edition" interview and was subsequently pulled over by a police officer. The driver received a ticket for driving too slowly (25 miles per hour) in a 65-mile-per-hour zone - but the ticket was later dismissed. It seems the traffic code may need a rewrite to cope with semi-autonomy.
The real news, though, is the Autopilot 2.5 update. Tesla has been playing catch-up since the fatal crash in Florida two years ago, after which Mobileye (supplier of the camera system in the original Autopilot) parted company with the automaker.
Forced to rely on its own in-house algorithms, Tesla quickly down-shifted with a software update (downgrade?), instituting a geo-fenced version of Autopilot that worked only in certain driving environments and at certain speeds. Over time the geo-fence expanded and the speed restrictions were relaxed, and with the release of 2.5 Tesla may finally have matched or surpassed the Mobileye-enabled performance of the original Autopilot.
Given Musk's claimed plan to deliver full autonomy via Autopilot, this may be good news or bad. Is the Model S (or X or 3) really ready for, or capable of, full autonomy? And what exactly is full autonomy? Can a Tesla perform like a Waymo? Probably not for a while.
The concern is that Tesla throws the driving candy out to the sinners and more or less looks the other way ("Stop. Don't. Come back.") as the misbehavior unfolds. Try to pull Tesla-like shenanigans in a Cadillac with Super Cruise and the car shuts the feature down.
There's got to be more to corporate responsibility - enforced in real time in the vehicle, a la Cadillac - than a CEO Pied Piper crooning, Wonka-like, "Come with me..."
The issue is highlighted by the vehicle videos released by the Tempe, Ariz., police in connection with the fatal collision between an Uber autonomous test vehicle and a pedestrian walking a bicycle. The safety driver was looking down, distracted, but the vehicle's sensors ought to have detected the pedestrian despite the nighttime conditions. (Guess who Uber is going to blame.)
This is yet another case where the safety driver can and likely will be blamed - but one would also be correct in holding Uber responsible, because the video evidence suggests a failure of the system's hardware or software. Putting such a system on public roads for testing suggests a certain Wonka-like indifference. Automated-vehicle-friendly states like Arizona ought to consider sanctions for such failures to encourage a more responsible approach from testers. Without consequences there will be no progress.
https://tinyurl.com/yalzhgpc - What happened when a driver put his Tesla on Autopilot? - Inside Edition
https://tinyurl.com/ybzab85o - Uber driver looks down for seconds before fatal crash - Ars Technica
Comments

Technology Executive and Employee Communications at Salesforce (6 years ago): Amen, brother Roger!
Partnerships | Global Leadership | Entrepreneurship | P&L | Automotive | Connected Vehicle | Autonomous (6 years ago): Love the Willy Wonka reference. The question is who will take the social responsibility of putting an immature product on the road - the OEMs, the regulatory bodies, the consumers, a combination of the three, all three, or none?
Executive Director at Road Safety USA (6 years ago): Excellent, Roger. Is this post available on a regular website anywhere, or just on LinkedIn? If the former, I would like to link it into our own ongoing discussion.
Please Read & Review Jimi & Isaac books for kids. Solves problems. Invents Stuff. (6 years ago): Here is the Tesla driver-interlock being defeated with a beverage: https://www.youtube.com/watch?v=x08_ZyZFBzI. It says it's water, but it's probably not water. We can also see the auto lane change in play, although it's clear as can be that the sensor field doesn't safely support such operation. The Uber footage is worse than I thought it would be. The pedestrian definitely didn't pop out of the bushes into the road or magically appear out of the ether. This one is 100% on Uber. Somebody should go to jail.
Out of Office (6 years ago): Agreed, Roger. Far too much hype at this point, as the related technologies are nowhere near mature enough to suggest autonomous vehicles are ready for prime time.