Tesla Blames Driver in Latest Fatal Model X Crash

April 11, 2018, 10:27 PM UTC

Tesla says a Model X owner who died in a crash last month near San Francisco is at fault, not the semi-autonomous Autopilot system that was engaged when the SUV slammed into a highway divider.

The electric carmaker issued the statement blaming the driver, Walter Huang, after his family hired a law firm to explore legal options. Huang, 38, died when his 2017 Model X drove into the unprotected edge of a concrete highway median that was missing its crash guard.

“According to the family, Mr. Huang was well aware that Autopilot was not perfect and, specifically, he told them it was not reliable in that exact location, yet he nonetheless engaged Autopilot at that location,” Tesla said in an emailed statement. “The crash happened on a clear day with several hundred feet of visibility ahead, which means that the only way for this accident to have occurred is if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so.”

This is the third public statement by Tesla since the March 23 crash—and its strongest language yet in an effort to distance itself from the fatality. Tesla previously said Huang’s hands were not on the steering wheel for six seconds prior to the collision.

Tesla’s complete statement:

We are very sorry for the family’s loss.

According to the family, Mr. Huang was well aware that Autopilot was not perfect and, specifically, he told them it was not reliable in that exact location, yet he nonetheless engaged Autopilot at that location. The crash happened on a clear day with several hundred feet of visibility ahead, which means that the only way for this accident to have occurred is if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so.

The fundamental premise of both moral and legal liability is a broken promise, and there was none here. Tesla is extremely clear that Autopilot requires the driver to be alert and have hands on the wheel. This reminder is made every single time Autopilot is engaged. If the system detects that hands are not on, it provides visual and auditory alerts. This happened several times on Mr. Huang’s drive that day.

We empathize with Mr. Huang’s family, who are understandably facing loss and grief, but the false impression that Autopilot is unsafe will cause harm to others on the road. NHTSA found that even the early version of Tesla Autopilot resulted in 40% fewer crashes and it has improved substantially since then. The reason that other families are not on TV is because their loved ones are still alive.

A Tesla spokesman would not provide additional information about the crash, including how many times the vehicle alerted Huang during the drive or how many warnings are typically issued before the Autopilot system disengages. Autopilot issues visual and audio alerts when the driver’s hands are off the steering wheel for a period of time, and the system is supposed to disengage after several warnings.
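
For readers who want a concrete picture of the escalation pattern described above, here is a minimal Python sketch of a hands-off monitor that moves from visual alert to audio alert to disengagement after repeated warnings. Every name and threshold in it (HandsOffMonitor, the timing values, the warning count) is an illustrative assumption; Tesla has not disclosed its actual parameters or logic.

```python
from dataclasses import dataclass


@dataclass
class HandsOffMonitor:
    # All values below are illustrative assumptions, not Tesla's parameters.
    visual_alert_after_s: float = 15.0   # hands-off time before a visual alert
    audio_alert_after_s: float = 30.0    # hands-off time before an audio alert
    max_warnings: int = 3                # audio warnings before disengagement
    hands_off_s: float = 0.0
    warnings_issued: int = 0
    engaged: bool = True

    def tick(self, dt: float, hands_on_wheel: bool) -> str:
        """Advance the monitor by dt seconds and report its status."""
        if not self.engaged:
            return "disengaged"
        if hands_on_wheel:
            self.hands_off_s = 0.0       # driver responded; reset the timer
            return "ok"
        self.hands_off_s += dt
        if self.hands_off_s >= self.audio_alert_after_s:
            self.warnings_issued += 1
            self.hands_off_s = 0.0
            if self.warnings_issued >= self.max_warnings:
                self.engaged = False     # hand control back to the driver
                return "disengaged"
            return "audio_alert"
        if self.hands_off_s >= self.visual_alert_after_s:
            return "visual_alert"
        return "ok"


# Simulate 90 seconds with hands off the wheel: three audio warnings
# accumulate, and the monitor disengages on the third.
monitor = HandsOffMonitor()
for _ in range(90):
    status = monitor.tick(dt=1.0, hands_on_wheel=False)
print(status)  # -> disengaged
```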

The law firm representing the Huang family said that a preliminary review has uncovered complaints by other Tesla drivers of navigational errors by the Autopilot feature. “The firm believes Tesla’s Autopilot feature is defective and likely caused Huang’s death, despite Tesla’s apparent attempt to blame the victim of this terrible tragedy,” according to the law firm, Minami Tamaki.

Tesla markets Autopilot as a semi-autonomous driving system, not a fully autonomous one that can handle all aspects of driving in certain conditions without the driver’s involvement. Autopilot includes several features: an automatic steering function that helps keep the car within its lane, adaptive cruise control that adjusts the car’s speed to surrounding traffic, and an auto lane change feature that moves the vehicle into an adjacent lane when the turn signal is activated, but only when the system judges the maneuver safe.
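
To make the “only when it’s safe” condition concrete, the toy Python function below sketches how an automatic lane change could be gated on both an explicit driver request and a clearance check. The function name, inputs, and thresholds are hypothetical and are not drawn from Tesla’s software.

```python
def lane_change_permitted(turn_signal_on: bool,
                          adjacent_lane_gap_m: float,
                          closing_speed_mps: float,
                          min_gap_m: float = 20.0,
                          max_closing_mps: float = 2.0) -> bool:
    """Hypothetical gate for an automatic lane change.

    The maneuver requires an explicit driver request (the turn signal)
    AND a clear adjacent lane; neither condition alone is enough.
    The gap and closing-speed thresholds are assumed values.
    """
    driver_requested = turn_signal_on
    lane_is_clear = (adjacent_lane_gap_m >= min_gap_m
                     and closing_speed_mps <= max_closing_mps)
    return driver_requested and lane_is_clear


# A turn signal with a 35 m gap and slow closing traffic permits the
# change; the same gap without a signal does not.
print(lane_change_permitted(True, 35.0, 0.5))   # -> True
print(lane_change_permitted(False, 35.0, 0.5))  # -> False
```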

Tesla’s system does warn drivers to keep their hands on the wheel when Autopilot is activated. However, not all drivers heed those warnings, and Tesla has been criticized for failing to protect against misuse of the system. A fatal May 2016 crash in Florida was caused in part by the driver’s overreliance on Autopilot, according to a National Transportation Safety Board investigation.

At the time, the NTSB said that “operational limits,” such as Tesla’s inability to ensure that drivers are paying attention when a car travels at high speed, played a major role in the 2016 fatal crash. The agency recommended that Tesla and other automakers take steps to ensure semi-autonomous systems are not misused. For instance, GM’s semi-autonomous Super Cruise system, an option on the Cadillac CT6, uses a driver-facing camera to verify that the driver is looking at the road ahead and disengages if the driver is not.

The National Highway Traffic Safety Administration, which also investigated the 2016 crash, found no defects with Autopilot.