Tesla has reached a settlement with the family of a driver who died in a 2018 crash involving the company’s driver-assistance technology Autopilot, days before attorneys were poised to deliver opening statements.
Terms of the settlement weren’t disclosed in court records.
The wrongful-death suit stemmed from a crash involving 38-year-old Apple engineer Walter Huang, who died on Highway 101 in California after his Model X sport-utility vehicle crashed into a highway barrier while Autopilot was engaged.
Huang’s family, which brought the lawsuit, had pursued a distinctive line of argument: that the automaker oversold Autopilot’s capabilities and failed to take sufficient steps to prevent customers from misusing the technology.
Government investigators and the Huangs agree that Huang was distracted in the moments leading up to the crash; Tesla said he was playing a videogame.
The suit was going to be a test of Tesla’s position that drivers, not the automaker, are ultimately responsible for crashes that involve the technology.
Tesla said in a court brief that it agreed to settle the case to end years of litigation. The company asked the judge overseeing the trial in San Jose, Calif., to seal the settlement amount it agreed to as part of the negotiation.
Tesla and the Huang family reached an amicable resolution and the terms are confidential, said Andrew McDevitt, an attorney for the plaintiffs.
Before the settlement, the case was expected to be closely followed in legal circles. The automaker is facing other disputes involving Autopilot.
“Every plaintiff’s lawyer that has one of these cases will be watching,” Matthew Wansley, associate professor at Yeshiva University’s Cardozo School of Law, had said. Wansley has researched automated-driving systems and criticized Tesla’s marketing of the technology.
Elon Musk, Tesla’s chief executive, has demonstrated an appetite for legal risk-taking in the past. He described the carmaker’s approach to litigation in a 2021 tweet: “Tesla policy is never to give in to false claims, even if we would lose, and never to fight true claims, even if we would win.”
A history of scrutiny
Several agencies have been investigating Autopilot, including the Justice Department and Securities and Exchange Commission, which have launched separate probes examining whether Tesla misled customers and investors about how Autopilot performs.
The National Highway Traffic Safety Administration has also been examining Autopilot and the automaker’s more expansive tech called “Full Self-Driving Capability” for years, raising concerns that not enough guardrails are built in to ensure drivers use the systems appropriately. The regulator has launched more than 40 investigations into accidents suspected to be tied to Tesla’s Autopilot that resulted in 23 deaths.
Autopilot is available on all new Teslas and is designed to help with driving tasks such as steering and lane changes, typically on highways. The Full Self-Driving upgrade adds navigation on city streets.
Tesla sells subscriptions to enhanced versions of Autopilot as well as to Full Self-Driving. Musk has said such sales could be significant profit drivers for the company.
The automaker says the Autopilot software isn’t designed for fully autonomous driving and allows drivers to take control when the technology is engaged. Tesla says its website and user manuals make clear that the software requires active driver supervision.
The system deploys a series of warnings to alert drivers if they aren’t paying attention to the road. In December, Tesla issued a safety recall that updated the software underpinning Autopilot, adding more warnings for drivers to ensure they “adhere to their continuous driving responsibility,” the company wrote in a regulatory filing.
Tesla said it made the changes to resolve an investigation by regulators.
The automaker had prevailed in the last two trials, with jurors in the most recent one finding the company wasn’t responsible for the crash because they found no manufacturing defect with Autopilot.
Huang family suit
In the Huang case, the family alleged Tesla drivers were sold on the idea that Autopilot was safer than a human-operated car, and the automaker knew the technology had serious flaws that customers wouldn’t expect to encounter based on how Autopilot was marketed.
On the morning of the March 2018 crash, Huang was making his commute to work after dropping his son off at preschool. With Autopilot engaged while on the highway, Huang’s Model X approached a dividing area that sits between travel lanes of the highway and an exit ramp.
The Autopilot system moved the vehicle off the highway and into the dividing area, where it struck a barrier at about 70 miles an hour. Huang died of blunt-force trauma injuries sustained in the crash.
In court filings citing statements and advertising by Musk and the carmaker, his family’s attorneys said Tesla was to blame for the incident because reasonable drivers believed Autopilot was safe and could navigate highway roads.
Among the family’s evidence was an email from former Tesla President Jon McNeill. Two years before Huang’s crash, McNeill emailed the company’s head of Autopilot and Musk, saying he had driven several hundred miles in a Model X with the technology activated.
“I got so comfortable under Autopilot, that I ended up blowing by exits because I was immersed in emails or calls (I know, I know, not a recommended use),” McNeill wrote in a March 25 email that year.
One of the Huang family’s attorneys read the email during a deposition, according to a transcript reviewed by The Wall Street Journal. The Journal couldn’t obtain the full text of his message.
Prove your case
The Huangs would have had to demonstrate that Tesla could have improved its warning system for drivers who used Autopilot, which by then had been in use for a little more than two years, said Richard Cupp, law professor at Pepperdine’s Caruso School of Law.
Tesla has said Huang’s hands weren’t detected on the wheel for six seconds before the crash, and that he knew from earlier drives that his vehicle had trouble using Autopilot at this particular spot on the highway, citing testimony from Huang’s wife and text messages. The company also says Huang was playing a videogame on his phone at the moment of impact.
“The sole cause of this crash was his highly extraordinary misuse of his vehicle and its Autopilot features so that he could play a videogame,” Tesla said in a court brief.
Tesla had strong support for its argument that Huang was misusing Autopilot in the seconds before the crash, Wansley said prior to the settlement.
But the case highlighted the issue of drivers becoming too complacent with partially automated technology, Wansley said.
“These crashes are happening because misuse happens all the time,” Wansley said.
Write to Ryan Felton at ryan.felton@wsj.com and Rebecca Elliott at rebecca.elliott@wsj.com