The Future of Self-Driving Vehicles
Daniel Burrus
Technology Futurist Keynote Speaker, Business Strategist and Disruptive Innovation Expert
Several recent news articles illustrate why my predictions for autonomous vehicles have been correct so far and will continue to be accurate — namely, that “the future of self-driving vehicles will be to use semi-autonomous vehicle technology to lower and/or eliminate accidents, not to eliminate the driver on public roads.”
One of these news articles highlighted the recent fatal Tesla crash near Mountain View, California. At this early stage of the investigation, it’s unclear whether the car was in semi-autonomous mode when the incident (the second Tesla crash to draw regulatory scrutiny) occurred. In the first fatal collision, in 2016, Tesla’s Autopilot system was found partially to blame. Of course, it’s important to note that Tesla is an innovator in the field of semi-autonomous vehicles, and that the thousands of fatalities that occur annually in other modern car models that contain advanced safety features simply do not make the same kind of headlines.
The latest Tesla investigation comes shortly after a self-driving Uber vehicle struck and killed a pedestrian in Tempe, Arizona. As a result of that incident, the governor of Arizona has suspended testing by the company on its public roads, and Uber, in turn, has halted its own tests of the technology in North America.
In a related news story, the artificial intelligence chipmaker Nvidia has suspended global testing of self-driving technology “to learn more from the Uber incident.”
In hundreds of keynote speeches and many articles, I have for some time predicted that the future of semi-autonomous vehicles on public roads will focus on reducing and eliminating accidents rather than replacing human drivers outright. The real threat that fully autonomous cars might be hacked, along with the clearly devastating potential for technology errors to lead to tragic results, as we have all recently seen, illustrates my point.
I’ve personally owned a Tesla Model X for almost two years, and I have had no problems with the car or its Autopilot feature. I don’t personally use the Autopilot feature too often, because I enjoy driving, and the Model X is a great car to drive — but when I do use the Autopilot, even though it works great, I keep my eyes on the road. Remember, when a pilot of an aircraft turns on autopilot, he or she doesn’t leave the cockpit unattended, or start reading a magazine or watching a movie. Whether in a car or an aircraft, autopilot is designed to supplement the driver, not to eliminate him or her.
Ideal places do exist for testing and implementing full-vehicle autonomy. Private roads, large campuses, and large business complexes where vehicles make routine rounds through a confined area are ideal settings for experimentation, and this sort of experimentation is already taking place to some extent.
But when it comes to fully autonomous vehicles, ensuring passenger and public safety means that opening the public roads to cars with no driver at the wheel — or no wheel at all — will have to wait until we can be sure the vehicle in question can’t be hacked, and the tech won’t fail. And that won’t be anytime soon. For the immediate future, at least where public roads are concerned, the emphasis will be on using semi-autonomous vehicles to reduce or eliminate accidents rather than to reduce or eliminate human drivers.
If you enjoyed this article, I would like to give you a copy of my latest book, The Anticipatory Organization.
“If you’re in business and you’re not thinking about disruption, you’re not paying attention. And if you haven’t read The Anticipatory Organization, you haven’t learned how to think about—and get ahead of—the disruption that’s headed your way. Read this book!”
Alan M. Webber, Co-founder, Fast Company Magazine
Full autonomous driving goes one step forward and two steps back! Why? Not because of accidents caused by the vehicle -- the proportionate number of accidents caused by humans is far higher. The key reason is that trust is still missing. Humans trust things as long as they are in their control; as things move outside their control, they start getting uncomfortable. As you mentioned, Nvidia has suspended its program, and Uber has suspended its tests as well. A lot of techies are curious to know exactly what happened in this case. Maybe we will see a similar trend in other areas where robots try to take over from human control.
Mechanic at Anthony auto repair
The government needs to stop this bullshit technology. People never had an issue driving their own cars.
Agile Coach and Entrepreneur, passionate about driving organizational transformation. Coaching teams and leaders to adopt agile practices that foster collaboration, innovation, and sustainable growth.
Well stated, Bob. While self-driving vehicles and other new technologies improve efficiencies, they also pose a threat to the economy and to the vast human labor force they replace. Think of all the commercial drivers (truck drivers, bus drivers, cab drivers...). This has been an ever-growing development since the Industrial Revolution: the ever downward spiral of finding cheaper labor while making greater profit. Companies like Uber would benefit tremendously.
Wild Card - draw me for a winning hand | Creative Problem Solver in Many Roles | Manual Software QA | Project Management | Business Analysis | Auditing | Accounting |
6 年"“the future of self-driving vehicles will be to use semi-autonomous vehicle technology to lower and/or eliminate accidents, not to eliminate the driver on public roads.”" False. Self-driving cars are new paradigm technologies. I wrote about this. https://www.dhirubhai.net/pulse/paradigm-shift-technology-how-affect-your-future-bob/ New paradigm technologies are based on a dehumanizing philosophy that devalues work and workers. The purpose of self-driving cars is to put drivers out of work, because the creators of new paradigm technologies do not value people.