Tesla Bears Some Blame for Self-Driving Crash Death, Feds Say

It’s been nearly a year and a half since Joshua Brown became the first person to die in a car driving itself. In May 2016, Brown was on a Florida highway in his Tesla Model S using Autopilot, the semi-autonomous driver assist feature that handles steering and speed during highway driving.

Tesla has always warned drivers that Autopilot isn’t perfect. According to the car’s manual and the disclaimer drivers accept before they can engage it, the system should only have been used on highways with clear lane markings, a dividing median, and exit and entrance ramps. So when a tractor trailer turning left crossed the Model S’s path, the system did not recognize it, and the car crashed into its side, killing Brown instantly.

Since then, Tesla has updated the Autopilot system that was controlling the car at the time. The National Highway Traffic Safety Administration, the government’s vehicle safety watchdog, concluded in January that because Brown was supposed to be monitoring the car’s driving, human error—not Tesla tech—caused the crash. And several automakers have introduced their own versions of Autopilot-like software, which purports to help drivers navigate highways safely while leaving some of the work to the machines.

The Human Factor

But now there’s a new wrinkle to the story. Tuesday morning, the National Transportation Safety Board, an independent federal body that investigates plane, train, and vehicle crashes, concluded its investigation into the incident. It declared that whether or not Brown paid heed to Tesla’s warnings, the carmaker bears some of the blame for selling a system that allowed that kind of misuse.

“The combined effects of human error and the lack of sufficient system controls resulted in a fatal collision that should not have happened,” said Robert Sumwalt, the chairman of the NTSB.

The NTSB’s report is the most substantive rebuke yet of not only Tesla, but an industry eager to offer drivers automated features that can be easily abused—with deadly consequences.

Automation

The automotive industry promises that fully driverless cars will sharply reduce the roughly 35,000 American road deaths per year, 94 percent of which result from human error. But while roboticists wrangle with the complex problems that stand in the way of full self-driving, carmakers are rolling out semi-autonomous features that help drivers with some driving tasks.

Systems like Tesla’s Autopilot, General Motors’ Super Cruise, and Audi’s Traffic Jam Pilot already make driving safer, according to preliminary research. The Insurance Institute for Highway Safety has found that vehicles with front-crash prevention are less likely to rear-end other vehicles and cause injuries. NHTSA’s investigation of the Brown crash found that Tesla cars with self-driving capabilities crashed 40 percent less frequently than those without.

“We are inherently imperfect beings, and automated systems can help compensate for that,” says David Friedman, who ran NHTSA for part of the Obama administration and now directs cars and product policy at Consumers Union. “In this case, though, there was a glaring human error and the system made no attempt to compensate for that other than to warn the driver.”

Indeed, these systems are hardly perfect. They mostly just follow lane lines and stay clear of other vehicles, and rely on the human driver to take control if anything goes wrong, like if the lines disappear or the weather gets dicey. Problem is, there’s no consensus on how to make sure that person remains alert. And for the NTSB, technology that improves on the status quo but brings with it the problem of recapturing the flighty human attention span isn’t good enough.

Before the Florida crash, Tesla’s Autopilot system was programmed to give repeated auditory and visual warnings when a driver went a few minutes without touching the steering wheel, but that’s where its powers of persuasion stopped. After Brown’s death, Tesla used an over-the-air software update to improve that process. Now, if the driver doesn’t take the wheel after three warnings, the car will slow to a stop. Other systems, including Mercedes’ Drive Pilot, do the same. GM’s Super Cruise, coming soon in some Cadillac sedans, takes a different tack, using a gumdrop-sized infrared camera to track the position of the driver’s head.
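
To make that escalation logic concrete, here is a minimal sketch of the pattern described above, written in Python. It is not Tesla’s or any automaker’s actual code; the class name, thresholds, and interface are assumptions chosen purely for illustration: alerts repeat while the driver’s hands stay off the wheel, and after three ignored warnings the system brings the car to a stop.

```python
# Minimal sketch of the hands-off escalation pattern described above.
# Not Tesla's implementation: the class, thresholds, and actions are
# invented for illustration only.
from enum import Enum, auto


class Action(Enum):
    NONE = auto()          # keep driving, no alert needed
    WARN = auto()          # sound a chime and flash a visual alert
    SLOW_TO_STOP = auto()  # driver unresponsive: bring the car to a halt


class HandsOffMonitor:
    def __init__(self, warning_interval_s=60.0, max_warnings=3):
        self.warning_interval_s = warning_interval_s  # assumed gap between alerts
        self.max_warnings = max_warnings              # three ignored warnings, then stop
        self._hands_off_elapsed = 0.0
        self._warnings_issued = 0

    def update(self, dt_s, hands_on_wheel):
        """Called every control cycle; returns the action the car should take."""
        if hands_on_wheel:
            # Driver responded: reset the escalation state.
            self._hands_off_elapsed = 0.0
            self._warnings_issued = 0
            return Action.NONE

        self._hands_off_elapsed += dt_s
        if self._hands_off_elapsed < self.warning_interval_s:
            return Action.NONE

        self._hands_off_elapsed = 0.0
        self._warnings_issued += 1
        if self._warnings_issued > self.max_warnings:
            return Action.SLOW_TO_STOP
        return Action.WARN


# A driver who ignores every alert: three warnings, then a controlled stop.
monitor = HandsOffMonitor()
for _ in range(4):
    print(monitor.update(dt_s=60.0, hands_on_wheel=False))
# Action.WARN, Action.WARN, Action.WARN, Action.SLOW_TO_STOP
```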

Even that might not be enough. Touching the steering wheel doesn’t mean you’re looking at the road. Keeping your head straight doesn’t mean you’re paying attention. And what combination of audio, visual, and haptic alerts is best for bringing someone back to attention is anyone’s guess.

Recommendations

Which is why the government should play a bigger role in monitoring the design of these systems, the NTSB (which has no rule-making power) said in a set of recommendations to NHTSA. The federal government, for its part, isn’t quite sure how to regulate all this. In fact, an updated policy guideline released by NHTSA Tuesday made no mention of semi-autonomous features like Autopilot.

“We appreciate the NTSB’s analysis of last year’s tragic accident and we will evaluate their recommendations as we continue to evolve our technology,” says Tesla spokesperson Keely Sulprizio. “We will also continue to be extremely clear with current and potential customers that Autopilot is not a fully self-driving technology and drivers need to remain attentive at all times.”

When NHTSA concluded its investigation in January, it found that Autopilot hadn’t malfunctioned: Because it was designed for driving on highways with exit and entrance ramps, it wasn’t expected to detect a truck turning across the car’s path. The NTSB agrees that Autopilot worked as intended in the Florida crash, but recommended that Tesla and other automakers program their systems so they’re used only where appropriate. Super Cruise, for example, will only engage when its mapping system confirms the car is on a divided highway. Tesla did not respond to questions about how it might restrict its cars’ use of Autopilot.
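
The NTSB’s suggestion amounts to gating these systems by location. As a rough illustration of that idea, here is a hedged sketch, again in Python, of the kind of map-based check the Super Cruise example implies; the road attributes and function name are assumptions, not any automaker’s real interface.

```python
# Rough sketch of map-based gating: the system engages only on roads that
# match its design domain. Attribute names are assumptions for illustration.
def may_engage(road_segment):
    """Return True only if map data says the road fits the system's design domain."""
    return (
        road_segment.get("divided_highway", False)     # physical median present
        and road_segment.get("limited_access", False)  # entrance and exit ramps only
        and road_segment.get("clear_lane_markings", False)
    )


# Example: a divided interstate qualifies; a rural road with cross traffic does not.
interstate = {"divided_highway": True, "limited_access": True, "clear_lane_markings": True}
rural_road = {"divided_highway": False, "limited_access": False, "clear_lane_markings": True}
print(may_engage(interstate))  # True
print(may_engage(rural_road))  # False
```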

“The companies developing these systems have to carefully consider how to keep the human in the loop if they want to rely on the human operator as a ‘backup.’”

But the bigger question remains: How do you design a system that does the driving but ensures humans remain ready to jump in when needed?

The newest research out of the Massachusetts Institute of Technology indicates that what we call “distraction” comes down to situational awareness. It’s not enough to occasionally get drivers’ eyes on the road or hands on the wheel, researchers argue; you need them to pay attention long enough to absorb and make sense of what’s going on. Automakers could install cameras inside their vehicles to monitor driver attention, but it’s too late for Tesla to do that for the tens of thousands of Model S sedans and Model X SUVs already on the road. The new Model 3 does have an interior camera, though the company has not revealed how it might be used.

Still, driver-facing cameras may be hard to pull off. “There has been a reluctance across the industry to deploy cameras in the vehicle because at the end of the day there are individual privacy concerns,” says Bryan Reimer, an engineer who studies driver behavior at MIT.

This report may be a black eye for Tesla, but it’s also a helpful reminder of where emerging autonomous tech still needs work. “It highlights that with the introduction of ever smarter technologies, the companies developing such systems have to carefully consider how to keep the human in the loop if they want to rely on the human operator as a ‘backup,’” says Bart Selman, an AI researcher at Cornell University.

That’s why the safety board called upon NHTSA, the government body in charge of this kind of regulation, to step up and keep a steady eye on the development of semi-autonomous tech. Just a few hours later, Transportation Secretary Elaine Chao announced a new, much more hands-off approach to automated vehicle guidelines.