Elon Musk admits Tesla’s new self-driving software ‘not great’

Tesla CEO Elon Musk admitted Monday that a pilot version of the company’s experimental driver-assistance software is “actually not great” — just a week after federal regulators launched a formal investigation into Tesla’s so-called Autopilot system.

“FSD Beta 9.2 is actually not great [in my opinion], but Autopilot/AI team is rallying to improve as fast as possible. We’re trying to have a single stack for both highway & city streets, but it requires massive NN retraining,” Musk said in a tweet Monday.

But at about 1:30 a.m. ET on Tuesday, Musk followed up that tweet, saying, “Just drove FSD Beta 9.3 from Pasadena to LAX. Much improved!”

Tesla’s FSD software is a premium version of the company’s Autopilot system.

Autopilot, which comes standard on every new Tesla, provides traffic-aware cruise control and autosteering, though the company says a driver must still be attentive behind the wheel.

The FSD package, which sells for $10,000 or $199 per month in the US, offers more features such as auto lane change and smart summon.

Still, the company says FSD features “require active driver supervision and do not make the vehicle autonomous.”

FSD Beta, which offers cutting-edge updates to the full self-driving software, is only available to some drivers and Tesla employees. 

Critics have previously decried Tesla’s real-time testing of its FSD Beta software on public roads as reckless, but there’s scant regulation in the field of autonomous driving software.

[Photo: Interior of the driver’s cab in a Tesla Model 3. Tesla is under investigation over its Autopilot system and allegations that it overstated the capabilities of its self-driving software. Zhang Peng/LightRocket via Getty Images]

Musk, for his part, has repeatedly defended the company’s self-driving tech, and his acknowledgment Monday of the latest update’s shortcomings came just days after Tesla’s driver-assistance features drew fresh scrutiny.

Last week, the National Highway Traffic Safety Administration announced a formal investigation into Tesla’s Autopilot system after a series of crashes with parked emergency vehicles.

The agency said it had identified 11 crashes since 2018 in which Teslas on Autopilot or Traffic Aware Cruise Control have hit vehicles with flashing lights, flares, an illuminated arrow board or cones warning of hazards.

The investigation covers 765,000 vehicles — or nearly every car that Tesla has sold in the US since the start of the 2014 model year, including the Models Y, X, S and 3, the agency said.

Within days, two Democratic senators called on the Federal Trade Commission to open an investigation and take “appropriate enforcement action” against Tesla for allegedly misleading consumers and overstating the capabilities of the company’s self-driving software.

“Tesla’s marketing has repeatedly overstated the capabilities of its vehicles, and these statements increasingly pose a threat to motorists and other users of the road,” the senators wrote in a letter last week to FTC chair Lina Khan. “Their claims put Tesla drivers – and all of the traveling public – at risk of serious injury or death.” 

Representatives for Tesla did not return The Post’s request for comment.

source: nypost.com