As if consumers were not already feeling anxious enough about the state of the world and their safety within it, a new concern has reached the starting line: In contrast with earlier official safety reports, recent news coverage calls into question the safety of Tesla’s Autopilot software—the “advanced driver assistance system that enhances safety and convenience behind the wheel,” in Tesla’s words. Do consumers now need to start worrying about robo-cars crashing into them every time they leave the house?
According to the National Highway Traffic Safety Administration (NHTSA), Teslas and other cars with advanced driver-assistance technologies were involved in nearly 400 crashes over the previous 10 months. Teslas accounted for 273 of those crashes, followed by Honda with 90, Subaru with 10, and other automakers with 5 crashes or fewer each. Beyond crashes, Teslas using Autopilot systems have been spotted rolling through stop signs and spontaneously bursting into flames. The latter event is frequent enough that a search for “Tesla spontaneous combustion” brings up some 131,000 Google results. One recent, nightmarish video showed a Tesla catching fire, then locking itself down and trapping its hapless driver inside. “The doors wouldn’t open. The windows wouldn’t go down,” the driver later recalled, describing being forced to kick out a window to escape.
Driver assistance systems are supposed to make driving safer, with features designed to minimize human error. These features include capabilities like automatic emergency braking, which deploys if the car is about to crash, and lane departure warnings, which alert a driver whose car is drifting out of its lane.
Although Tesla touts its vehicles’ “full self-driving” capability, that name is misleading. The car does not actually drive itself. Instead, it includes and relies on Tesla’s full suite of advanced driver assistance features, including Traffic-Aware Cruise Control, which matches the car’s speed to that of the vehicle ahead and can even stop and start without driver input, and an Autopark function that, per Tesla’s website, “helps parallel or perpendicular park your car, with a single touch.”
Even as car companies continue to promise that these technologies make driving safer, little publicly available, reliable data measures their safety. As one report put it, “American drivers—whether using these systems or sharing the road with them—are effectively guinea pigs in an experiment whose results have not yet been revealed.”
And in this experiment, clear results have yet to emerge; it is not entirely certain why the crashes are happening. Human error may be one reason. Drivers are not supposed to cede complete control to the car, even when using the most advanced features, but they may be lulled into a false sense of security or complacency by so much automation. The marketing of that automation arguably encourages such laissez-faire attitudes: some current messages seem to imply that the cars will soon reach the point that drivers can relax and even take a nap while commuting to work.
But human error might not account for all of the crashes, and that uncertainty is central to the overall problem. Without well-established, comprehensive, objective data about how the automated systems perform, consumers naturally lack confidence that the systems meet safety standards, and they remain skeptical of claims about enhanced safety benefits. Meanwhile, in May, the NHTSA delivered more bad news: traffic fatalities in 2021 reached a 16-year high.
So what are companies, regulators, and drivers supposed to do with this information, or the lack thereof? If only there were an easy answer. Companies and regulators, though not necessarily in cooperation, are trying to gather better data on how these technologies affect safety. But those results are not likely to be available anytime soon.
As for drivers? Whatever car you’re driving, stay alert, and keep your hands on the wheel. At least until self-driving cars are actually driving themselves.
Discussion Questions:
- Why might driver assistance technologies cause crashes instead of preventing them?
- What should car companies do to make customers feel more assured that these vehicles are safe? What should they do to make the cars actually safer?
- What data about crashes would you want to see to help you understand why they are happening?
Source: Cade Metz, “How Safe Are Systems Like Tesla’s Autopilot? No One Knows,” The New York Times, June 8, 2022; Braden Carlson, “Are Self-Driving Cars Safe? Most Americans Don’t Think So,” MotorBiscuit, March 18, 2022; “Newly Released Estimates Show Traffic Fatalities Reached a 16-Year High in 2021,” http://www.nhtsa.gov, May 17, 2022; Neal E. Boudette, Cade Metz, and Jack Ewing, “Tesla Autopilot and Other Driver-Assist Systems Linked to Hundreds of Crashes,” The New York Times, June 15, 2022; Faiz Siddiqui, Rachel Lerman, and Jeremy B. Merrill, “Teslas Running Autopilot Involved in 273 Crashes Reported Since Last Year,” The Washington Post, June 15, 2022; Faiz Siddiqui and Reed Albergotti, “‘Full Self-Driving’ Clips Show Owners of Teslas Fighting for Control, and Experts See Deep Flaws,” The Washington Post, February 10, 2022; Steven Loveday, “Tesla Model Y Catches Fire While Driving, Driver Struggled to Escape,” InsideEVs, May 23, 2022.