Tesla’s Self-Driving Car Drives Itself Into A Firetruck: Where Liability Truly Lies

When one Utah woman took her Tesla out for a spin this past May, she switched it to Autopilot mode, trusting the claims of Tesla sales representatives that the car would brake automatically. As long as she occasionally touched the steering wheel, the Tesla Inc. Model S was supposed to use its array of cameras and sensors to steer within a lane, change lanes, manage the car’s speed, and brake while driving on the highway. However, Heather P. Lommatzsch’s Tesla didn’t brake when it should have. Instead, it careened into stopped traffic and slammed into a fire truck at 60 mph.
Lommatzsch’s accident is just one of several that have brought scrutiny upon Tesla’s widely acclaimed Autopilot feature. In May 2016, Joshua Brown was killed when his Tesla Model S, with its self-driving mode activated, crashed into a tractor-trailer. In January 2017, Gao Yaning was killed after engaging Autopilot on his Tesla Model S and crashing into a street sweeper. In March 2018, Wei Huang was killed when his Tesla Model X, with Autopilot turned on, collided with a concrete lane divider and burst into flames. In May 2018, two teens were killed when their Tesla slammed into a wall while on Autopilot. Such a pattern of accidents and tragic fatalities raises several questions: why haven’t these cars been recalled yet? Have the families of the deceased brought lawsuits against Tesla? Whose fault is it, really?

After her non-fatal accident, Lommatzsch did sue Tesla, claiming that sales representatives had told her the vehicle would brake automatically if something was in its path as long as she occasionally touched the steering wheel. In a statement about the lawsuit, Tesla spokesman David Arnold said the company “has always been clear that Autopilot doesn’t make the car impervious to all accidents.” While Autopilot is activated, Arnold said, “drivers are continuously reminded of the responsibility to keep their hands on the wheel and maintain control of the vehicle at all times.” Car data showed that Lommatzsch hadn’t touched the steering wheel for the 80 seconds before the crash and that she was looking at her phone when the collision occurred.
In May, Tesla agreed to pay Model S and Model X drivers $5.4 million to settle class action claims that the company delayed safety features and an expensive software upgrade to its Autopilot system, instead rolling out cars with defective traffic-awareness features. Those features, including collision avoidance and automatic emergency braking, were supposed to be implemented by a December 2016 deadline. The proposed class action alleged that Tesla had not only missed the deadline, but that the software powering the features “was nowhere near ready for the vital tasks for which it was sold.” In the months following the deadline, Tesla debuted only a “dangerously defective” traffic-aware cruise control system and a “limited front collision warning.” The remaining standard safety features were rolled out later or remained nonexistent when an amended complaint was filed in July. The suit said that the Autopilot program Tesla installed after the deadline included “half-baked software that renders Tesla vehicles dangerous if engaged.”
The question of whether Lommatzsch will win her suit remains undetermined, as autonomous vehicle litigation presents uncharted territory, with most of the legal landscape limited to settlements and investigations clearing the manufacturers of fault. However, because there are no precedential court decisions involving an Autopilot accident yet, Tesla could use this case as an opportunity to avoid settlement and establish precedent in its favor. For example, the court could rule that Lommatzsch was negligent in her operation of the autonomous vehicle by not keeping her hands on the wheel, clearing Tesla of liability.
Most likely, however, fault will be largely determined by the findings of the National Transportation Safety Board (“NTSB”), which has begun an investigation into Lommatzsch’s accident. It is possible that the NTSB will find that the Autopilot system provided Lommatzsch with the necessary visual and auditory cues to place her hands on the wheel before the accident occurred.
For instance, regarding the March 2018 accident, the NTSB preliminary report said that the vehicle “provided two visual alerts and one auditory alert for the driver to place his hands on the steering wheel…more than 15 minutes before the crash.” The report also said that just one minute before the crash occurred, “the driver’s hands were detected on the steering wheel on three separate occasions for a total of 34 seconds; for the last 6 seconds prior to the crash, the vehicle did not detect the driver’s hands on the steering wheel.” While that investigation is still ongoing, in the earlier May 2016 accident, the NTSB ultimately concluded that the Autopilot system was a contributing factor because it gave “far too much leeway to the driver to divert his attention to something other than driving.” In the wake of that crash, however, Tesla said it changed Autopilot so that drivers can no longer ignore repeated warnings to keep their hands on the wheel.
As a result, a liability determination in Lommatzsch’s suit will have to weigh a number of factors: was Lommatzsch negligent in her operation of the vehicle? If so, was that negligence a consequence of her reliance on the representations made by Tesla sales representatives, or did the car provide enough auditory and visual cues to alert Lommatzsch to put her hands on the wheel? Depending on how the court rules, this lawsuit could prove highly significant in shaping the landscape of autonomous driving litigation.