The (near) future of driving: Cars that watch you watch them steer
It’s 2025 and you’re cruising down the highway late at night. It’s been a long day and your eyelids feel heavy. All of a sudden, you hear three beeps, lights flash, your car slows down, and it pulls itself safely to the side of the road.
This scenario is closer to becoming reality than you may think, and although autonomous vehicles get all the headlines, most drivers will experience something like it long before they can buy a car that drives itself.
Full self-driving cars are taking longer to arrive than techno-optimists predicted a few years ago. In fact, in a financial filing Wednesday, Tesla acknowledged it may never be able to deliver a full self-driving car at all.
But new cars already come loaded with driver-assist features such as adaptive cruise control, steering assist and automated highway lane changes. As those features proliferate, the task of the human driver is beginning to shift from operating the vehicle to supervising the systems that do so.
That development carries promise and peril. Decades of research make clear that humans aren’t good at paying attention in that way. The auto industry’s answer: systems that monitor us to make sure we’re monitoring the car.
Such systems, usually relying on a driver-facing camera that monitors eye and head movements, already have been deployed in tens of thousands of long-haul trucks, mining trucks and heavy construction vehicles, mainly to recognize drowsiness, alcohol or drug use, and general distraction.
Some new automobile models can already be purchased with option packages that include monitoring systems, usually bundled with driver-assist features such as lane keeping and adaptive cruise control. They include models from General Motors, Ford, Toyota, Tesla, Subaru, Nissan and Volvo.
One reason for the sudden rush: European regulators plan to require such systems be installed on every new car sold there by mid-decade.
The top U.S. car industry lobby, recently renamed the Alliance for Automotive Innovation, told a Senate panel Tuesday that it welcomes regulation that would require driver-monitoring systems in all new cars sold with driver-assist technologies. The National Transportation Safety Board, after several fatal Tesla Autopilot crashes, has recommended that safety regulators require more robust systems than the one Tesla uses to keep drivers engaged.
So-called advanced driver-assist systems serve as a bridge as companies work to develop safe, fully self-driving cars, which are beginning to appear in very limited locations. Most driverless car developers put tight restrictions on how they can be used and where they can go.
“We’re in an in-between phase at the moment,” said Colin Barnden, a market analyst at Semicast Research.
On the plus side, such technologies can reduce driving stress and, if deployed responsibly, improve safety. At the same time, the less input a car needs from a human driver, the harder it is for that driver to remain vigilant. Humans aren’t good at “monitoring things, waiting for something to go wrong. We just aren’t wired to do that,” Barnden said.
Driver-monitoring systems come in two basic types: eye trackers and steering wheel sensors. In either case, if the system detects that the driver isn’t paying attention, it issues warnings through lights, sounds or both; if the driver still doesn’t reengage, the car pulls itself to the roadside and stops.
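To make that escalation concrete, here is a minimal sketch of the kind of logic such a system might follow. It is illustrative only: the thresholds, names and structure are assumptions for this article, not any manufacturer’s actual implementation.

```python
from enum import Enum, auto

class AlertLevel(Enum):
    NONE = auto()       # driver attentive; no action
    WARN = auto()       # flash lights, sound chimes
    PULL_OVER = auto()  # assisted stop at the roadside

# Hypothetical thresholds; a real system would tune these to speed and context.
WARN_AFTER_S = 3.0        # seconds of inattention before warning
PULL_OVER_AFTER_S = 12.0  # seconds of ignored warnings before stopping

def escalate(seconds_inattentive: float) -> AlertLevel:
    """Map how long the driver has been inattentive to a response level."""
    if seconds_inattentive >= PULL_OVER_AFTER_S:
        return AlertLevel.PULL_OVER
    if seconds_inattentive >= WARN_AFTER_S:
        return AlertLevel.WARN
    return AlertLevel.NONE
```

The two systems differ only in how they decide the driver is inattentive; the escalating response on the other side of that decision is broadly similar.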
Tesla uses the steering sensor. Practically everybody else uses eye trackers.
Although no system is perfect, a camera-based driver-monitoring system “is far superior” to one that works through the steering wheel, said Missy Cummings, a former Navy fighter pilot and director of Duke University’s Humans and Autonomy Laboratory.
A recent video from Consumer Reports shows why. On a closed test track with well-paved roads and clear lane markings, a test driver rigged up a way to cheat Tesla’s steering wheel monitoring system: he hung a weight on the wheel, climbed into the passenger seat and sent the car down the track with the driver’s seat empty.
“I think monitoring through the steering wheel should be banned,” Cummings said.
Others in the car industry aren’t so blunt but recognize that ineffective monitoring systems are a problem that, if not addressed, risks inviting heavier-handed regulation from lawmakers. In an apparent jab at Tesla, the car industry lobby group recommended Tuesday that “the potential for driver misuse or abuse of a system should be evaluated as part of the design process for driver monitoring systems.”
Eye-tracking got its start about 20 years ago as computer vision scientists and engineers sought a way to monitor drivers for fatigue and distraction. The first applications were commercial.
The eye-tracking market leader, Seeing Machines in Australia, started testing its system in giant mining trucks. A driver who falls asleep and crashes can cost the mining company as much as $15,000 an hour as the truck awaits repair.
The heavy equipment company Caterpillar later licensed Seeing Machines technology for its own vehicles. Today, Caterpillar runs an extensive safety network for customers who want to reduce the crashes and liability costs associated with driver fatigue and distraction, which, according to the company, cause more crashes than alcohol and other drugs combined.
If the system detects a driver looking drowsy, the information is sent to the safety section at Caterpillar’s fleet monitoring center in Peoria, Ill., and supervisors are informed. Caterpillar is also rolling out a wristband for drivers, the Cat Smartband, which tracks sleep and wake patterns to predict drowsiness before a driver even gets behind the wheel. The company monitors its workforce on a “fatigue risk dashboard.”
Seeing Machines is also aggressively pursuing the long-haul truck market, with its system installed in more than 30,000 trucks, according to the company. The data are sent back to Seeing Machines and used to improve the system.
The possibilities for eye-tracking monitors in automobiles took off around the same time companies such as Google began developing driverless cars. The same factors fueled both: dramatic advancements in machine learning, a type of artificial intelligence, combined with more powerful computers and new chip designs.
The Seeing Machines camera uses infrared light to detect head pose, eye position and eyelid movement. The infrared sensor works at night and through most sunglasses. If a person looks drunk, high, distracted or drowsy, the system can warn the driver, and in serious cases, an advanced driver-assist system can pull the car to the side of the road.
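One widely cited measure behind drowsiness warnings of this kind is PERCLOS: the fraction of recent time during which the eyes are mostly closed. The sketch below shows the idea; the frame rate and thresholds are illustrative assumptions, not Seeing Machines’ actual parameters.

```python
from collections import deque

class PerclosMonitor:
    """Flag drowsiness using PERCLOS: the fraction of recent video frames
    in which the eyes are mostly closed. Sustained high values correlate
    with falling asleep at the wheel."""

    def __init__(self,
                 window_frames: int = 1800,        # ~60 seconds at an assumed 30 frames/sec
                 closed_threshold: float = 0.2,    # eyelid openness below 20% counts as closed
                 drowsy_threshold: float = 0.15):  # warn if eyes closed >15% of the window
        self.closed_flags = deque(maxlen=window_frames)
        self.closed_threshold = closed_threshold
        self.drowsy_threshold = drowsy_threshold

    def update(self, eyelid_openness: float) -> bool:
        """Feed one frame's eyelid openness (0 = fully shut, 1 = wide open).
        Returns True when the driver currently looks drowsy."""
        self.closed_flags.append(eyelid_openness < self.closed_threshold)
        if len(self.closed_flags) < self.closed_flags.maxlen:
            return False  # not enough history to judge yet
        perclos = sum(self.closed_flags) / len(self.closed_flags)
        return perclos >= self.drowsy_threshold
```

A measure averaged over a window like this is harder to fool than a single blink detector, which is part of why camera systems hold up better than steering sensors.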
Such systems are not foolproof. “You can have your eyes on the road and not be paying attention at all,” Cummings, the Duke professor, said. “We all get lost in our thoughts.”
David Zuby, head of research at the Insurance Institute for Highway Safety, said “someone might be watching the road with his eyes, but if he’s got a hamburger in one hand and a coffee cup in the other, he may not be able to make the decision that’s necessary in time.”
Semicast’s Barnden said some of the limitations will be addressed as machine-learning programs get better. For example, the system could determine whether someone was eating not by looking for evidence of food but by detecting subtle eye movements.
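As a toy illustration of that idea, a classifier might infer driver activity from gaze behavior alone. Everything below, including the features, data and labels, is invented for this article; a production system would learn from large volumes of labeled camera footage.

```python
# Toy illustration: inferring driver activity from gaze features alone.
from sklearn.ensemble import RandomForestClassifier

# Each row: [avg fixation duration (s), saccade rate (per s), downward-gaze fraction]
X_train = [
    [0.8, 1.2, 0.05],  # steady scanning: attentive highway driving
    [0.3, 3.5, 0.40],  # repeated downward glances: eating or phone use
    [0.9, 1.0, 0.02],
    [0.2, 4.0, 0.55],
]
y_train = ["attentive", "distracted", "attentive", "distracted"]

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)
print(clf.predict([[0.25, 3.8, 0.50]]))  # -> ['distracted']
```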
All this raises privacy concerns, of course. Amazon uses sensors inside its delivery vans to surveil workers; drivers may object, but their only options are to submit or to quit.
For consumers, the trade-offs are different and, for the time being, more hypothetical. Right now, companies such as GM and Ford say their eye-tracker data won’t be uploaded to the cloud or stored for more than a few minutes on chips inside the car.
How long will that hold true? As with a smartphone, “you have to trust your manufacturer to not spy on you,” said Martin Krantz, chief executive of Smart Eye, a Swedish maker of driver-monitoring systems.
The lure of marginal revenue could push manufacturers in a different direction as eye-tracking vendors find new ways to monetize drivers’ data.
“That’s the wondrous thing about software,” said Seeing Machines CEO Paul McGlone. Theoretically, after warning a drowsy driver, the system might put the nearest Starbucks on a map and a coupon on the driver’s phone. “The feature list from the [auto manufacturers] is exploding at a ridiculous rate,” he said.
Those possibilities will multiply as truly driverless cars hit the road in greater numbers, said Tal Krzypow, vice president of product at Cipia, an Israeli company developing monitoring systems with Mobileye, Intel’s Israeli subsidiary.
As the focus of automakers shifts “from the driving experience to the riding experience,” he said, monitoring systems will move beyond keeping drivers alert to analyzing their moods and expressions in order to “customize and personalize” that experience — by recommending the right movie or advertisement, perhaps.
In other words, the tech that exists now to make sure you’re paying attention to the road could ultimately be harnessed to provide you with ever more immersive distractions.