Autonomous Vehicle Litigation

Autonomous vehicle tests in major cities in the U.S. and abroad have sparked our imaginations about a future where driverless cars are the norm.

By Carolyn Casey, J.D.

Today, some autonomous driving capabilities are available on the consumer market, and driverless car tests have taken place in several U.S. cities.

Predictably, there have been problems – crashes, injuries, and fatalities.

This blog provides background on autonomous vehicles, summaries of lawsuits involving Tesla’s Autopilot and Full Self-Driving systems, and comments on emerging liability theories.

What Are Autonomous Vehicles?

There are two categories of autonomous vehicles (AVs). The first is the fleet-operated robotaxi that serves a specific geographic area, such as Waymo’s autonomous ride-hailing services in Phoenix, Los Angeles, and San Francisco. The other is the personal vehicle with “hands-off” systems that include various safeguards and still require some driver engagement.

The Society of Automotive Engineers (SAE) defines six levels of driving automation, from Level 0 through Level 5.

  • Level 0 to Level 2: Humans must drive these cars and constantly supervise any automated support features, including warning systems, braking and acceleration assistance, and steering assistance.
  • Level 3: These cars can drive themselves, but the system may request that the driver take over at any time. Level 3 systems can operate only in specific conditions, such as traffic jams. This is the highest level of automation available to consumers today.
  • Level 4 (L4): L4 cars function without a driver ready to take over the controls, though only within a defined operating area. They are currently being tested, developed, and deployed; driverless taxis are Level 4 vehicles.
  • Level 5: These are fully autonomous cars that can operate in any condition and environment – the final frontier.

Understanding Autonomous Vehicle Liability

As autopilot-style systems have become available to consumers, problems have occurred and lawsuits have been filed. Driverless car litigation is so new that no clear legal standards or rules exist today. Emerging case law, however, is beginning to shape key AV legal concepts and potential liability theories.

Product Liability

Traditional product liability claims generally focus on product manufacturing, product design, or the warnings, warranties, or guarantees a manufacturer offers. Design defects often turn on the availability of a “reasonable alternative design” (RAD).

  • Early AV cases have involved product liability claims.
  • Plaintiffs’ lawyers who bring product liability claims in AV cases will need experts who can speak to a RAD. Finding these expert witnesses today may be challenging. As AV cases increase, engineers working on driverless car designs may emerge as expert witnesses.

Strict Liability

In product liability cases, it will be interesting to see whether AV manufacturers can be held strictly liable for defective products. Many states recognize strict liability for manufacturing defects.

  • Plaintiffs would have to show that the autonomous vehicle had a defect, that the defect actually and proximately caused the plaintiff’s injury, and that the defect made the vehicle unreasonably dangerous.
  • There have been no findings of strict liability in existing self-driving car lawsuits.

Negligence

The vast majority of litigants in traditional automobile accidents file negligence lawsuits, claiming driver error. Plaintiffs in early AV cases with fatalities and injuries have argued that the manufacturer’s negligence caused the crashes.

Early litigants focused on Tesla’s Autopilot and its Full Self-Driving (FSD) capabilities.

Recurring arguments in early negligence AV cases include false advertising claims against Tesla:

  • Despite the name, drivers must still pay attention and be ready to intervene when FSD is engaged.
  • Tesla advertised the FSD features in a way that led consumers to believe no driver intervention was needed.
  • Tesla advertised FSD this way despite knowing its limitations.

Comparative Fault

Many states, such as California, have comparative negligence laws that apportion fault among the parties. Under California’s pure comparative negligence rule, for example, a plaintiff found 30% at fault would recover only 70% of their damages. Given the technical nature of autonomous cars, proving how much of the fault lies with the technology and how much with the driver may prove difficult.

A Future Tilt Towards Product Defect Claims?

A Stanford law professor conjectures that in the future with autonomous vehicles, there will be far fewer claims of driver negligence and a much higher proportion of product defect claims. He adds that technical risk/utility issues will dominate over the more typical questions of due care.

Early Tesla Autonomous Vehicle Lawsuits Give Glimpses into Emerging Case Law

Tesla Settles Lawsuit with Apple Engineer’s Family

In early April 2024, Tesla settled a wrongful death lawsuit with the family of Walter Huang, who died after his Tesla Model X SUV crashed into a highway barrier while Autopilot was engaged. Fear of harm to Tesla’s reputation likely prompted the settlement days before the trial. Tesla asked the court to seal the case records.

Plaintiff Claims

In the lawsuit, Huang’s family alleged, in part, safety and design defects in Tesla’s driver assistance systems. Huang’s attorneys also focused on Tesla’s social media and marketing, and on CEO Elon Musk’s messaging, which led consumers to believe Autopilot was safe to use without the driver’s hands on the wheel.

Court filings referred to internal Tesla emails discussing how executives and engineers using Autopilot had become complacent while driving, even reading emails and checking their phones while the system was engaged.

Defense

Tesla contended that Huang was a distracted driver who was not using Autopilot properly because he was playing a video game just before the collision.

NTSB Investigation

The National Transportation Safety Board’s (NTSB) 2020 investigation likely influenced Tesla’s decision to settle for an undisclosed amount days before the trial.
The NTSB found that Tesla’s technology was at least partly to blame for the collision. The agency also said Huang possibly bore some fault, as it appeared he was playing a video game on his phone before his Tesla SUV crashed. Road markings and the barriers along Highway 101 may have also played a role.

The federal agency specifically determined that:

  • Tesla’s forward collision warning system did not provide an alert.
  • Its automatic emergency braking system did not activate as Huang’s Model X, with Autopilot engaged, accelerated into a highway barrier.

The case was Sz Huang et al. v. Tesla Inc. et al., California Superior Court, Santa Clara County.

CA Jury Says Tesla Not at Fault in Fatal Crash

In October 2023, a Riverside, CA jury found Tesla was not at fault in a lawsuit over a crash in which a car with Autopilot engaged swerved off the road and burst into flames, killing the driver, Micah Lee, and injuring two passengers. The two surviving passengers sought $400 million in damages for their serious physical injuries, mental anguish, and the loss of the driver’s life.

Product Liability Claim

The attorney for the survivors argued that the car sharply swerved off the road due to a manufacturing defect in Tesla’s Autopilot mode.

The plaintiffs’ lawyer told the jury that:

  • Tesla’s driver-assistance system malfunctioned and gave an “excessive steering wheel angle command.”
  • A 2017 internal Tesla safety analysis of Autopilot identified a defect whereby vehicles could unexpectedly move into adjacent lanes or off the road.
  • A driver couldn’t make the car swerve so suddenly.
  • “We know Autopilot went crazy. We know this is a manufacturing defect.”

Defense

Tesla maintained that Mr. Lee was at fault, saying his alcohol consumption impaired his ability to drive the Tesla. The company also told the jury that there was no evidence Lee had even turned Autopilot on before the collision.

Tesla’s lawyer stated:

“This was classic human error.”

“The only way that this car steers to 43 degrees in this time frame is that Mr. Lee or someone else in the car played a role in turning that steering wheel.”

The jury in this first fatal-injury Tesla case sided with Tesla. The case is Molander v. Tesla Inc., RIC2002469, California Superior Court, Riverside County.

Judge in Florida Rules Plaintiff Can Seek Punitive Damages

A Florida judge ruled that there is “reasonable evidence” that Elon Musk and other managers knew the Autopilot system was defective and still allowed consumers to drive Tesla cars unsafely.

In this wrongful death case, Steven Banner died when the roof of his Tesla was sheared off after his car drove under an 18-wheeler’s trailer as the truck turned onto the road he was traveling on.

The November 2023 ruling allows Banner’s wife to proceed to trial and bring punitive damages claims against Tesla for intentional misconduct and gross negligence.

In a positive ruling for all plaintiffs seeking compensation from Tesla, the Florida judge:

  • found evidence that Tesla “engaged in a marketing strategy that painted the products as autonomous” and that Musk’s public statements about the technology “had a significant effect on the belief about the capabilities of the products.”
  • allowed the plaintiff to argue to jurors that Tesla’s manuals and clickwrap agreement warnings were inadequate.
  • said, “It would be reasonable to conclude that the [D]efendant Tesla through its CEO and engineers was acutely aware of the problem with the ‘Autopilot’ failing to detect cross traffic.”
  • commented that Mr. Banner’s situation was not dissimilar to one shown in a 2016 Autopilot marketing video in which Tesla says, “The car is driving itself.”

Tesla Wins a Product Liability Case

In what appears to be the first Autopilot case to go to trial, a Los Angeles Superior Court jury found in April 2023 that Tesla’s Autopilot feature did not fail, nor did the company fail to disclose facts.

Justine Hsu sued Tesla in 2020, claiming that while on Autopilot her Tesla Model S swerved into a curb, causing a violent airbag deployment that fractured her jaw, dislodged teeth, and damaged facial nerves.

Product Liability Claim

Hsu alleged Autopilot and airbag design defects and asked for more than $3 million in damages.

Tesla Defense

Tesla argued it was not liable, saying Hsu failed to heed the manual’s warning against using Autopilot on city streets.

Clear Tesla Warnings

After the verdict, jurors said they believed Tesla’s warnings that Autopilot was not a self-piloted system were clear and that driver distraction was to blame.

Class Action Lawsuit Shut Down

In a September 2022 class action lawsuit, five plaintiffs alleged that Tesla had made “misleading and deceptive” statements about its Autopilot and Full Self-Driving capabilities.

Plaintiffs asserted that:

  • Tesla’s statements incorrectly characterized Autopilot and Full Self-Driving capabilities.
  • The plaintiffs’ injury was the extra money Tesla charged for these add-on features, which failed to work as promised.

Arbitration and Statute of Limitations Stop Class Action

A U.S. District Judge shut down the case with two September 2023 rulings. First, the judge ruled that four of the five plaintiffs could not proceed because they had signed arbitration agreements when they purchased their cars online. Second, the remaining plaintiff’s claim could not go forward because it was filed after the statute of limitations had expired.

  • Tesla purchasers who opted out of the arbitration clause could still file a class action lawsuit.

Where Is All This Going?

It’s too early to draw any airtight conclusions about the liability standards emerging in these early AV lawsuits. However, there are a few noteworthy items so far:

  • Arbitration clauses in online purchase agreements are a problem for plaintiffs.
  • Distracted-driver arguments have also led to good outcomes for Tesla.
  • Two plaintiff product liability cases failed, but the recent Huang case settlement signals this avenue could be fruitful for plaintiffs.
  • The Florida court’s finding of reasonable evidence that Tesla knew of defects in the Autopilot system gives some hope to plaintiffs and their lawyers considering product liability lawsuits.

About the author

Carolyn Casey, J.D.

Carolyn Casey is a seasoned professional with extensive experience in legal tech, e-discovery, and legal content creation. As Principal of WritMarketing, she combines her decade of Big Law experience with two decades in software leadership to provide strategic consulting in product strategy, content, and messaging for legal tech clients. Previously, Carolyn served as Legal Content Writer for Expert Institute, Sr. Director of Industry Relations at AccessData, and Director of Product Marketing at Zapproved, focusing on industry trends in forensic investigations, compliance, privacy, and e-discovery. Her career also includes roles at Iron Mountain as Head of Legal Product Management and Sr. Product Marketing Manager, where she led product and marketing strategies for legal services, and at Fios Inc as Sr. Marketing Manager, specializing in eDiscovery solutions.

Her early legal expertise was honed at Brobeck, Phleger & Harrison, where she developed legal strategies for mergers, acquisitions, and international finance matters. Carolyn's education includes a J.D. from American University Washington College of Law, where she was a Senior Editor for the International Law Journal and participated in a pioneering China Summer Law Program. She also holds an AB in Political Science with a minor in art history from Stanford University. Her diverse skill set encompasses research, creative writing, copy editing, and a deep understanding of legal product marketing and international legal trends.
