Tesla says driver ignored warnings from Autopilot in fatal California crash

NTSB: Tesla driver killed in crash did not have his hands on the wheel


On Friday, March 23, a Tesla Model X crashed into a concrete divider on U.S. Highway 101 in Mountain View, California. The driver died as a result of the crash, which Tesla said occurred despite numerous warnings from the vehicle.

Now, the National Transportation Safety Board has released its own preliminary report on the crash, noting that the driver did not have his hands on the steering wheel for the six seconds before the vehicle struck the barricade. The NTSB also concluded that the Model X had its Autopilot function engaged at the time of the crash.


The federal agency’s report does not differ from Tesla’s own investigation into the crash, though the NTSB has not yet reached a conclusion about what ultimately caused the fatal accident. In the report, investigators note that all “aspects of the crash remain under investigation as the NTSB determines the probable cause, with the intent of issuing safety recommendations to prevent similar crashes.”

New details from the NTSB report include that the Autopilot feature was set to maintain a driving speed of 75 mph. We also now know that about eight seconds before the crash, the car was following another vehicle moving at 65 mph, which likely caused the Model X to slow slightly. Four seconds before the collision, however, the Tesla was no longer following that vehicle, and it accelerated just before hitting the barricade.

Prior to the crash, the Model X sent the driver two visual alerts and one auditory alert to take the wheel. “These alerts were made more than 15 minutes prior to the crash,” the NTSB noted.

In Tesla’s initial report following the accident, issued before the company retrieved the car’s computer logs, Tesla included photos of the crash attenuator at the accident site. One photo, taken on an unspecified date, shows the safety device in proper condition. The second, captured on March 22 by a dash cam in the car of a witness to the accident, shows the same barrier already crushed from an earlier crash.

According to Tesla, “the reason this crash was so severe is that the crash attenuator, a highway safety barrier which is designed to reduce the impact into a concrete lane divider, had either been removed or crushed in a prior accident without being replaced.”

In a follow-up report after Tesla retrieved the car’s logs, the company stated, “In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum. The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.”

Reacting to criticism that it lacks empathy for crash victims when it cites the relative statistical safety of driving in Tesla vehicles, the company stated, “Nothing could be further from the truth. We care deeply for and feel indebted to those who chose to put their trust in us. However, we must also care about people now and in the future whose lives may be saved if they know that Autopilot improves safety. None of this changes how devastating an event like this is or how much we feel for our customer’s family and friends. We are incredibly sorry for their loss.”

Tesla’s Autopilot warnings, and the company’s admonitions about not using the system without keeping hands on the wheel and eyes on the road, do not mean the system is safe from misuse. Following the first Tesla Autopilot fatality in 2016, the National Transportation Safety Board found the driver was at fault for not paying attention and for “overreliance on vehicle automation.” The NTSB also reported at the time that Tesla “could have taken further steps to prevent the system’s misuse,” Reuters reported.

The National Highway Traffic Safety Administration (NHTSA) is also investigating the accident.

Human error is widely considered at least partially responsible for more than 90 percent of fatal accidents each year, as documented in a study published by Stanford Law School. Tesla states that, across all vehicles, there is one automobile fatality for every 86 million miles driven, but only one fatality for every 320 million miles driven in Teslas equipped with Autopilot hardware. By Tesla’s figures, drivers of Autopilot-equipped Teslas are therefore about 3.7 times less likely to be involved in a fatal accident.
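That 3.7 figure follows directly from the two mileage numbers Tesla cites. The short Python sketch below simply reproduces the arithmetic, assuming only the per-fatality mileage figures quoted above; it is not based on any additional Tesla data.

```python
# Reproduces the ratio behind Tesla's "3.7 times less likely" claim,
# using only the per-fatality mileage figures quoted in this article.
miles_per_fatality_all_vehicles = 86_000_000    # one fatality per ~86 million miles, all vehicles
miles_per_fatality_autopilot_hw = 320_000_000   # one fatality per ~320 million miles, Autopilot-equipped Teslas

ratio = miles_per_fatality_autopilot_hw / miles_per_fatality_all_vehicles
print(f"Fatality rate per mile is roughly {ratio:.1f} times lower")  # ~3.7
```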

Updated on June 8 to include findings from the NTSB’s investigation. 

Bruce Brown, Contributing Editor