Tesla Autopilot mode crash leaves a lot of digital debris
Tesla is in trouble again: a US federal agency is investigating a crash in California involving a 2022 Model S. The United States' National Highway Traffic Safety Administration (NHTSA) is examining whether the car was running Tesla's Autopilot mode, its advanced driver assistance system (ADAS) that steers, accelerates and brakes automatically. Reportedly, more than 30 Tesla cars have been involved in crashes with Autopilot engaged, and 14 deaths have been attributed to these crashes.
At Ikigai Law, we've been discussing how we should view electric vehicles (EVs). To put it bluntly, most current EVs are giant batteries attached to an on-board computer and some wheels. Every EV needs a battery management system (BMS) that monitors telemetry data from different sensors to optimise a ride. Tesla's cars take this a step further: an array of cameras feeds the car's software, which detects objects, vehicles, people and signs on the road and crunches the data in real time. This lets drivers engage Autopilot mode, take their hands off the wheel and let the car run on its own. Tesla's website says the Autopilot feature still requires active driver supervision and that its cars are not autonomous.
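To make the BMS idea concrete, here is a minimal, hypothetical Python sketch of what monitoring sensor telemetry against safety limits might look like. The sensor fields, voltage and temperature thresholds, and function names are all illustrative assumptions for this newsletter, not Tesla's, Ola's or any manufacturer's actual implementation.

from dataclasses import dataclass

# Hypothetical telemetry reading from one battery-pack sensor.
@dataclass
class CellReading:
    cell_id: int
    voltage_v: float      # cell voltage in volts
    temperature_c: float  # cell temperature in degrees Celsius

# Illustrative safety limits; real packs use manufacturer-specific values.
VOLTAGE_RANGE_V = (3.0, 4.2)
MAX_TEMPERATURE_C = 60.0

def check_pack(readings: list[CellReading]) -> list[str]:
    """Flag cells whose telemetry falls outside safe operating limits."""
    alerts = []
    low, high = VOLTAGE_RANGE_V
    for r in readings:
        if not (low <= r.voltage_v <= high):
            alerts.append(f"cell {r.cell_id}: voltage {r.voltage_v:.2f} V out of range")
        if r.temperature_c > MAX_TEMPERATURE_C:
            alerts.append(f"cell {r.cell_id}: overheating at {r.temperature_c:.1f} C")
    return alerts

if __name__ == "__main__":
    sample = [
        CellReading(1, 3.7, 35.0),
        CellReading(2, 4.4, 41.0),   # over-voltage
        CellReading(3, 3.6, 72.5),   # overheating
    ]
    for alert in check_pack(sample):
        print(alert)

Even this toy version shows why the data matters legally: the same telemetry a BMS logs to keep a ride safe becomes digital evidence after a crash or a fire.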
Curiously, Tesla is vehemently opposed to installing Light Detection and Ranging (LIDAR) systems on its vehicles, a standard among its competitors in the autonomous vehicle space such as Uber, Google's Waymo and Toyota. Tesla CEO (and aspiring God-Emperor of Mars) Elon Musk believes that Tesla's artificial intelligence (AI) and machine learning (ML) systems are superior. The jury is still out on that, but these choices put the automotive and software worlds on a collision course. As governments grapple with the proliferation of EVs and frame rules for them, there is a need to monitor the software these machines run.
From a legal perspective, this raises a number of interesting questions about who can be held liable when a car crashes in autonomous mode: the driver, the software developer, or the car company that uses the software. In Uber's case, the operator who was testing the autonomous driving feature was held liable for the death of a pedestrian crossing the road, not Uber as a corporation.
Meanwhile in India, mobility company Ola has been putting out many fires (both literal and regulatory), as there have been several reports of its electric scooters overheating and bursting into flames. What makes things worse, customers have been complaining about software glitches on Ola's scooters that put their lives in danger on the road. A Guwahati-based lawyer, Balwant Singh, is embroiled in a series of legal exchanges with Ola after the company published telemetry data from his son's scooter, which was involved in a crash.
Here's a report from a local newspaper detailing the accident:
Watch this video of Tesla's Autonomy Day, where Musk decries the use of LIDAR:
Medianama has a great report on the ongoing tussle between Ola and Balwant Singh:
https://www.medianama.com/2022/05/223-ola-legal-notice-customer-social-media-posts/
Ikigai Law analyses the policy opportunities for EVs in this article:
https://www.ikigailaw.com/driving-electric-vehicles-to-the-future-new-technology-and-policy-opportunities/