Lawsuit blaming Tesla’s Autopilot for driver’s death can go to trial, judge rules
A jury should decide whether Tesla and Elon Musk oversold the capabilities of the electric car company’s Autopilot system and caused the fatal crash of a software engineer who engaged it, took his hands off the steering wheel and seconds later slammed into a truck, a Florida judge has ruled.
Circuit Judge Reid Scott rejected Tesla’s motion to summarily dismiss Kim Banner’s lawsuit accusing the company of causing her husband Jeremy Banner’s death in 2019. In a 23-page ruling, Scott found that Kim Banner’s attorneys presented sufficient evidence to let the case proceed to trial sometime next year. Scott also found that Banner can seek punitive damages from the company that, if awarded, could reach millions of dollars.
Scott, citing other fatal crashes involving Autopilot, wrote last week that there is a “genuine dispute” over whether Tesla “created a foreseeable zone of risk that posed a general threat of harm to others.” Autopilot is supposed to automatically steer and brake the car when engaged.
The judge had ordered his rulings sealed, but they were mistakenly made available Wednesday on the Palm Beach County court clerk’s website. They were taken down shortly after The Associated Press retrieved them.
Tesla attorney Whitney Cruz declined to comment Wednesday, and the company did not respond to an email. Musk eliminated Tesla’s media and public relations department in 2020.
Banner attorney Trey Lytal said in a Wednesday statement that Scott’s ruling “shows how Tesla’s conduct was not just negligent, but involved intentional and reckless decisions that led to the death of customers, including Jeremy Banner.” He believes Scott will soon fully release his decision.
“The public is entitled to know these findings and we feel strongly that will happen in the next few weeks,” Lytal said.
Scott, in rejecting Tesla’s motion, focused on the company’s marketing and Musk’s comments about Autopilot, and noted other deaths that have occurred during its use. The company says in court documents that it warns drivers that its cars are not fully self-driving, that they still must pay attention to the road and that they are ultimately responsible for steering and braking.
But Scott concluded that Banner’s attorneys had provided enough evidence for the case to proceed. They have argued that by naming the system Autopilot, Musk and Tesla implied that the cars are self-driving and don’t require the driver’s full attention. They also cite numerous comments Musk made in the years before 50-year-old Jeremy Banner’s crash saying that Autopilot was already better than human drivers and would soon be autonomous.
The attorneys also point to a 2016 marketing video for Autopilot that is still on the company’s website. It begins with a statement reading, “The person in the driver’s seat is only there for legal reasons. He is not doing anything. The car is driving itself.”
The Tesla then maneuvers through a town on winding roads in traffic. It halts at traffic lights and stop signs, avoids other cars, pedestrians and bicyclists, and speeds up and slows down as appropriate. It then parallel parks itself. The camera is positioned to show that the man in the driver’s seat never touches the steering wheel or pedals.
Under questioning by Banner’s attorneys, Tesla employees revealed that the car in the ad was programmed with mapping software not available to the public and “still performed poorly and even ran into a fence while filming.” The video required several takes and was heavily edited, the attorneys say.
Scott wrote that after reviewing the evidence, he could not “imagine how some ordinary consumers would not have some belief that the Tesla vehicles were capable of driving themselves hands free.”
And that’s what Jeremy Banner did.
Just before dawn on March 1, 2019, he was heading to work on a semi-rural Florida highway in his 2018 Tesla Model 3, which he had purchased months earlier.
While traveling almost 70 mph, Banner activated Autopilot and took his hands off the wheel. To his right, a tractor-trailer leaving a farm moved into his path. The Tesla didn’t detect it, and neither it nor Banner braked or swerved. Ten seconds after Autopilot was activated, the car drove underneath the trailer, shearing off the roof and killing Banner instantly.
The National Transportation Safety Board, which investigated the crash, said the truck driver was primarily to blame for pulling into traffic but also said Banner and Tesla were at fault.
“An attentive driver would have seen the truck in time to take evasive action,” the NTSB said of Banner. The board said Tesla’s Autopilot should have safeguards that don’t allow the system’s use on highways that have cross-traffic. The car should also make certain that drivers using Autopilot remain engaged with their hands on the wheel.
“The NTSB and researchers have found that drivers are poor at monitoring automation and do not perform well on tasks requiring passive vigilance,” the 2020 report said.
The trucking company has already reached a confidential settlement with Kim Banner and is no longer part of the lawsuit.