Alarmed by Tesla’s public self-driving test, state legislators demand answers from DMV
Tesla is developing driverless cars on California’s public roadways using its own customers as test drivers and shrugging off test-reporting requirements — and, so far, the state’s Department of Motor Vehicles has been largely content to look the other way.
But as drivers participating in the “beta test” post videos of their cars making potentially disastrous mistakes, state legislators are growing concerned about the danger that the DMV’s stance poses to other drivers, pedestrians, cyclists and scooter riders.
On Tuesday, the chair of the California Senate’s Transportation Committee, Lena Gonzalez (D-Long Beach), sent a letter to DMV Director Steve Gordon to find out what’s up between the agency and Tesla. The DMV has served as the state’s chief autonomous-driving regulator since the Legislature gave it that power in 2012.
In the letter, Gonzalez cited the apparent poor performance of what Tesla calls Full Self-Driving beta, a $10,000 option that gives owners advanced automated driving features and allows them to test cutting-edge autonomous technologies on public roads. (Beta is a software term for a product that is still being tested and is not yet ready for general release.)
Tesla Chief Executive Elon Musk has repeatedly stated his intention to sell fully autonomous vehicles that owners can rent out as robot taxis.
YouTube videos, whose veracity has not been challenged by Tesla, have shown FSD beta cars crossing double yellow lines and heading toward oncoming cars, failing to stop for road construction barriers and cars crossing the street, and steering toward metal posts and other common objects. One YouTuber described FSD’s errant steering as an “assassination attempt.” After FSD aimed the Tesla at another car, the driver said, “FSD tried to murder us.”
Gonzalez told Gordon in her letter: “I have seen a number of videos of Tesla vehicles operating with FSD engaged where it appears that serious driving errors were made and collisions were avoided only because of swift action by the driver.”
She noted she lacks data on FSD beta safety but wrote that the DMV “has the knowledge to assess these situations,” and she requested answers to several questions:
- “What is your assessment of the FSD beta trials?”
- “Is there a danger to the public?”
- “If the DMV finds the beta program unsafe, how does the DMV plan to address any potential concerns?”
The DMV said it was reviewing Gonzalez’s letter. Gordon has not spoken publicly about the DMV and self-driving car regulation, despite repeated requests from The Times for an interview over many months.
Gonzalez is not the only California policymaker concerned about Tesla and the DMV. State Sen. Josh Newman (D-Fullerton) told The Times that on Nov. 3, a Tesla Model Y equipped with FSD beta crashed in his district. “I found that a bit unnerving,” he said. According to a complaint filed with the National Highway Traffic Safety Administration, the car’s experimental robot technology steered it into another vehicle.
“The FSD beta mode didn’t work as intended,” Newman said. “Nobody was injured, thankfully. This gets into Tesla advertising something called ‘Full Self-Driving’ when it’s really something else. It creates a high level of risk and puts people in harm’s way.”
Newman and other legislators wonder why the DMV treats Tesla differently from the more than 50 other autonomous-vehicle developers in California that adhere to DMV regulations requiring companies to report crashes and “disengagements,” events in which the robot technology turns full control of the vehicle over to a trained test driver.
Waymo, Zoox, Argo AI, Cruise and Motional are among the companies that follow the reporting requirements. They all use trained test drivers, often two to a vehicle. Unlike the others, Tesla does not report crashes or disengagements to the DMV.
Because that crash information is not reported to the DMV, it’s unclear how the agency will be able to provide safety data to the Senate Transportation Committee.
“This is definitely a concern,” said Assemblyman Ash Kalra (D-San Jose). “It’s critical the DMV take this next generation of motor vehicles incredibly seriously.”
Driverless-car developers “need to provide data on crashes and other issues,” he said. “If Tesla is not, we need to know why they are not or why the DMV feels they don’t need to.”
DMV spokeswoman Anita Gore told The Times in a prepared statement that Tesla need not report FSD beta crashes because Tesla informed the agency that Full Self-Driving is a “Level 2” system that requires driver attention.
The levels system was created as a guide for engineers by the Society of Automotive Engineers. SAE Level 0 is no automation at all. Level 1 covers a single automated feature, such as adaptive cruise control. Level 2 combines automated steering and cruise control while requiring the driver’s constant attention; it includes Tesla’s Autopilot and similar systems from Ford, GM and others. Level 3 describes a system that can fully drive the car in certain situations, with the driver ready to take back control whenever the system asks.
Levels 4 and 5 describe fully driverless cars: a Level 4 car can operate without a human driver only under defined conditions, such as certain weather or geographic zones, while a Level 5 car can operate anywhere a normal car can drive.
The levels were never intended to serve as legal definitions or be encoded into law, said Phil Koopman, an engineering professor at Carnegie Mellon University, one of the world’s top driverless-vehicle research centers.
He also suggested that regulators study the full SAE document that describes the levels. It contains a line that Koopman calls crucial: “The level of a driving automation system feature corresponds to the feature’s production design intent.”
In other words, if you are testing a car with the intent to develop it into a Level 4 robotaxi, then it’s a Level 4 system, according to Koopman. “Intent is key to categorizing the autonomy level for Tesla Full Self-Driving,” he added.
The legislation that granted the DMV regulatory control, written when driverless-car technology was in the toddler stage, may need changing, Sen. Newman said.
“The Legislature has an oversight responsibility,” he said. “It may be time for a relevant committee or joint hearing to explore where we are today.”
State Sen. Ben Allen (D-Santa Monica) agrees: “If Tesla is really operating within the boundaries of state law with this technology, then we likely need to change the law to protect public safety.”
No one has been reported killed or seriously injured by errant FSD beta technology, but Jennifer Homendy, head of the National Transportation Safety Board, the federal government’s crash investigator, said, “It shouldn’t require a fatality for regulators and politicians to take action” on Tesla’s FSD deployment.