Should cars drive like humans or robots? Tesla forces the question

Many drivers misunderstand the limits of technology already on the road today. The public is confused about what “self-driving” means, for example, as driver-assistance systems become more common and more sophisticated. In a survey last year by the analyst firm J.D. Power, only 37 percent of respondents picked the correct definition of self-driving cars. 

Neither Tesla nor any other company is selling a self-driving, or autonomous, vehicle capable of driving itself in a wide array of locations and circumstances without a human ready to take over. 

Nonetheless, Tesla markets its driver-assistance systems in the U.S. with names that regulators and safety experts say are misleading: Autopilot for the standard package and Full Self-Driving for the premium package. 

At the same time, Tesla warns drivers in its owners’ manuals that it is their responsibility to use the features safely and that they must be prepared to take over the driving task at any moment, with eyes on the road and hands on the wheel.

The difficulty of navigating an unpredictable environment is one reason truly self-driving cars haven’t happened yet. 

“An autonomous vehicle has to be better and more nimble than the driver it is replacing, not worse,” said William S. Lerner, a transportation safety expert and delegate to the International Organization for Standardization, a group that sets global industrial standards. 

“I wish we were there yet, but we are not, barring straight highways with typical entrance and exit ramps that have been mapped,” he said. 

‘Caught in the cookie jar’

Tesla’s rolling-stop feature was around for months before it drew much notice. Chris, who chronicles the good and the bad of Tesla’s latest features on YouTube under the name DirtyTesla, said his Tesla did automatic rolling stops for over a year before Tesla disabled the feature. He agreed to be interviewed on the condition that only his first name be used due to privacy concerns.

Scrutiny picked up this year. Regulators at the National Highway Traffic Safety Administration asked Tesla about the feature, and in January, the automaker initiated an “over-the-air” software update to disable it. NHTSA classified the software update as an official safety recall.

Critics were taken aback not only by the choice to design the software that way but also by Tesla’s decision to test the features on customers rather than professional test drivers.

Safety advocates said they didn’t know of any U.S. jurisdiction where rolling stops are lawful, and they couldn’t determine any safety justification for allowing them. 

“They’re very transparently violating the letter of the law, and that is completely corrosive of the trust that they’re trying to get from the public,” said William Widen, a law professor at the University of Miami who has written about autonomous vehicle regulation. 

“I would be upfront about it,” Widen said, “as opposed to getting their hand caught in the cookie jar.” 

Safety advocates also questioned two entertainment features unrelated to autonomous driving that they said sidestepped safety laws. One, called Passenger Play, allowed drivers to play video games while moving. Another, called Boombox, let drivers blast music or other audio out of their cars while in motion, a possible danger for pedestrians, including blind people.

Tesla recently pushed software updates to restrict both of those features, and NHTSA opened an investigation into Passenger Play. 

Tesla, the top-selling electric vehicle maker, has not called the features a mistake or acknowledged that they may have created safety risks. Instead, Musk denied that rolling stops could be unsafe and called federal automotive safety officials “the fun police” for objecting to Boombox. 

Separately, NHTSA is investigating Tesla for possible safety defects in Autopilot, its standard driver-assistance system, after a string of crashes in which Tesla vehicles with the systems engaged struck stationary first-responder vehicles. Tesla has faced lawsuits and accusations that Autopilot is unsafe because it can’t always detect other vehicles or obstacles in the road. Tesla has generally denied the claims made in lawsuits, including in a case in Florida where it said in court papers that the driver was at fault for a pedestrian death. 

NHTSA declined an interview request. 

It’s not clear what state or local regulators may do to adjust to the reality that Tesla is trying to create. 

“All vehicles operated on California’s public roads are expected to comply with the California Vehicle Code and local traffic laws,” the California Department of Motor Vehicles said in a statement. 

The agency added that automated vehicle technology should be deployed in a manner that both “encourages innovation” and “addresses public safety” — two goals that may be in conflict if innovation means purposely breaking traffic laws. Officials there declined an interview request. 

Musk, like most proponents of self-driving technology, has focused on the number of deaths that result from current human-operated vehicles. He has said his priority is to bring about a self-driving future as quickly as possible in a theoretical bid to reduce the 1.35 million annual traffic deaths worldwide. However, there’s no way to measure how safe a truly self-driving vehicle would be, and even comparing Teslas to other vehicles is difficult because of factors such as different vehicle ages. 

Industry pledges 

At least one other company has faced an allegation of purposefully violating traffic laws, but with a different result from Tesla. 

Last year, San Francisco city officials expressed concern that Cruise, which is majority-owned by General Motors, had programmed its vehicles to stop in travel lanes in violation of the California vehicle code. Cruise’s developmental driverless vehicles are used in a robotaxi service that picks up and drops off passengers with no driver behind the wheel. 

Cruise responded with something that Tesla hasn’t yet offered: a pledge to obey the law. 

“Our vehicles are programmed to follow all traffic laws and regulations,” Cruise spokesperson Aaron Mclear said in a statement. 

Another company pursuing self-driving technology, Waymo, has programmed its cars to break traffic laws only when they’re in conflict with each other, such as crossing a double yellow line to give more space to a cyclist, Waymo spokesperson Julianne McGoldrick said. 

“We prioritize safety and compliance with traffic laws over how familiar a behavior might be for other drivers. For example, we do not program the vehicle to exceed the speed limit because that is familiar to other drivers,” she said in a statement. 

A third company, Mercedes, said it was willing to be held liable for accidents that occur while its driver-assistance system, Drive Pilot, is operating in situations where the company has promised it will be safe and adhere to traffic laws.

Tesla did not respond to a request for information about its approach to automated vehicles and whether they should ever skirt traffic laws.

Safety experts aren’t ready to give Tesla or anyone else a pass to break the law. 

“At a time when pedestrian deaths are at a 40-year high, we should not be loosening the rules,” said Leah Shahum, director of the Vision Zero Network, an organization trying to eliminate traffic deaths in the U.S. 

“We need to be thinking about higher goals — not to have a system that’s no worse than today. It should be dramatically better,” Shahum said. 

source: nbcnews.com