Tesla Accidents on Autopilot

Tesla Accidents from Self-Driving Vehicles Prompt Recall

It’s become common to read news stories about Tesla accidents. Automakers claim semi-autonomous vehicles are safer than human-driven vehicles because they operate through a sophisticated technological system of computers, cameras, sensors, and radar, thereby eliminating the unpredictable, error-prone, and distracted driving of humans. Self-driving technology is marketed as a safety feature that improves roadway safety by reducing the number of crashes and deaths caused by human error. But three recent crashes involving self-driving vehicles have automakers, regulators, and consumers re-evaluating the safety of self-driving technology, and whether semi-autonomous vehicles are ready for the open road.

“Tesla has recalled over 40,000 Model S sedans and Model X SUVs from the 2017 through 2021 model years because the electronic power steering may suddenly stop working on rough roads or after hitting a pothole.” – Consumer Reports

Tesla Accidents and Fatality Statistics

Tesladeaths.com has compiled the following totals, updated as of 12/20/22:

  • Tesla Deaths: 335 
  • Fatalities Involving Tesla Autopilot: 19
  • Tesla Fires: 143
  • Fatalities Involving Tesla Fires: 44

“Automakers reported nearly 400 crashes of vehicles with partially automated driver-assist systems, including 273 involving Teslas, according to statistics released Wednesday by U.S. safety regulators.” – NPR

Elon Musk’s Beta Defense of Autopilot

The three crashes, including one fatality, all involve Tesla electric vehicles operating in autopilot mode. Autopilot was released on October 14, 2015. Autopilot mode is an “assist feature that requires you to keep your hands on the steering wheel at all times,” according to a Tesla statement. Before engaging the autopilot assist system, Tesla vehicles caution the driver that the technology is in beta mode and require “explicit acknowledgment that the system is new technology” from the driver before the vehicle will operate in autopilot mode. Tesla CEO Elon Musk tweeted of the autopilot assist system, “[P]oint of calling it ‘beta’ was to emphasize to those who chose to use it that it wasn’t perfect.”

A tweet from Elon Musk citing one death per 320 million miles, a figure that includes occupants, other vehicles, cyclists, and pedestrians.

Autopilot Mode Fatalities

The first crash involved a 2015 Tesla Model S sedan operating in autopilot mode in Williston, Florida, on May 7, 2016, when the vehicle failed to detect a tractor-trailer turning in front of it against a bright sky. The driver, Tesla enthusiast Joshua Brown, was killed when his vehicle drove under the 18-wheeler after neither Brown nor the vehicle’s autopilot system detected the semi turning in front of the vehicle, and neither applied the brakes.

The driver of the tractor-trailer contends Brown was distracted watching a Harry Potter movie, saying he heard the movie still playing in the Tesla after the crash. Within two weeks of Brown’s fatal May crash, two more crashes occurred involving Tesla vehicles allegedly operating in autopilot mode.

Autopilot Mode Cited as Careless Driving

A Michigan man and his son-in-law escaped serious injury on the Pennsylvania Turnpike on July 1, 2016, after the 2016 Tesla Model X SUV they were riding in struck a guardrail and hit a concrete embankment before coming to rest on its roof. The driver claims the Tesla autopilot system was in use at the time of the crash, but Pennsylvania State Police cited him for careless driving.

Electronic Power Assist Steering (EPAS) Issues

The third crash occurred on July 10, 2016, in Whitehall, Montana. A 2016 Tesla Model X SUV was operating in autopilot mode, traveling approximately 60 mph in a 55-mph zone, when it suddenly jerked to the right and crashed into a guardrail. The two occupants of the vehicle sustained only minor injuries despite significant body damage to the right side of the Tesla. The driver blames the autopilot system for failing to detect an object on the road.

NHTSA Tesla Safety and Recall Information

  • Tesla Model S – 13 Recalls, 3 Investigations, 87 Complaints
  • Tesla Model 3 – 13 Recalls, 4 Investigations, 406 Complaints
  • Tesla Model X – 7 Recalls, 9 Investigations, 344 Complaints
  • Tesla Model Y – 14 Recalls, 4 Investigations, 620 Complaints

NHTSA is currently investigating the first two crashes to assess the safety of the autopilot system and to determine whether any defect in the system contributed to the crashes. In an unusual move, the NTSB (National Transportation Safety Board) opened its own investigation into the fatal May accident. Tesla is also investigating whether the autopilot feature was, in fact, engaged in all of the crashes as claimed and whether the autopilot technology is safe. Musk told Bloomberg News, “We’re going to be quite clear with customers that the responsibility remains with the driver. [W]e’re not asserting that the car is capable of driving in the absence of driver oversight.”

These three crashes come at a time when the NHTSA (National Highway Traffic Safety Administration) is set to release new rules and regulations for test-driving semi-autonomous vehicles on public roadways. NHTSA Administrator Mark Rosekind believes semi-autonomous vehicles need to be “at least twice as safe as human drivers” to significantly reduce the number of accidents and roadway deaths due to human error. Automakers counter that self-driving technology will eliminate the human error that is a factor in 94% of traffic deaths, and they are racing to get their versions of a self-driving vehicle to market. Currently, there are over 830,000 Tesla vehicles equipped with autopilot features on roads worldwide.

How Safe is Self-Driving Technology?

This quick succession of crashes involving the autopilot assist system raises questions about the safety of self-driving technology, and whether we are ready for its use in the real world. Do you embrace this new technology? Do you believe self-driving vehicles will be more or less safe than vehicles operated by human beings? Do you expect self-driving vehicles to operate accident-free? Do you think people have unrealistic expectations about the safety features and operating capabilities of semi-autonomous vehicles? Do you believe operators of these vehicles have more or less responsibility when in autopilot mode? Who should be responsible if a crash occurs while the vehicle is operating in autopilot mode: the driver or the automaker?

Tesla Accidents Raise Safety Questions About Self-Driving Vehicles

There are also questions surrounding the ethics of operating semi-autonomous vehicles, and whether a computer, versus a human being, can make ethical distinctions in “split-second, life or death driving decisions on the highway,” according to a related story in the New York Times.

We’re still in the dark ages when it comes to self-driving vehicles, and many questions remain about the use of semi-autonomous vehicles, their safety, and their place on American roadways. Federal and state rules and regulations are still being developed to guide their safe operation on public roads where most vehicles are driven by error-prone, distracted humans. Even so, we are not far from a time when self-driving vehicles will become commonplace on American roadways, and maybe even in your own driveway.

Injuries caused by defective vehicles can range from minor harm to serious bodily injury and even death. Anyone who is the victim of an unsafe vehicle may be entitled to recover damages from the manufacturer, designer, or seller of that vehicle. Contact us for a free consultation.