Will feds frustrate Elon Musk’s plan to make Teslas self-driving?


Tesla is not alone among technology developers who bristle at regulations that could slow them down, said Koopman, who advocates for industrywide standards to protect the public. But Tesla is pushing the envelope by using its customers as testers rather than relying on trained safety drivers, as its competitors do.

“It’s clear these vehicles still need very frequent control by their human drivers,” Koopman said of Tesla’s beta software. “This is pretty janky, immature software. Why would you trust it when it takes so many unsafe actions?”

Rolling-stop issue

The late-January recall of Tesla's FSD software removed a function that allowed vehicles to roll through stop signs at low speed. Although Tesla accepted the recall voluntarily, Musk took to Twitter to argue the automaker's case.

“There were no safety issues,” Musk wrote on Twitter. “The cars simply slowed to ~2 mph & continued forward if clear view with no cars or pedestrians.”

NHTSA’s recall notice said the “rolling stops” were faster than Musk stated and were unsafe.

“A software functionality referred to as a ‘rolling stop’ allows the vehicle to travel through all-way-stop intersections at up to 5.6 miles per hour,” the recall notice said. “Entering an all-way-stop intersection without coming to a complete stop may increase the risk of collision.”

Tesla does not have a public relations department and does not respond to media inquiries. Musk is the de facto spokesman through his Twitter account, and the automaker sometimes issues press releases and blog posts.

Tesla is not aware of any collisions, injuries or fatalities resulting from the “rolling stop” feature, which has been present since October 2020, when the beta software was in limited release, according to NHTSA.

In California, Tesla has told regulators that the “self-driving” beta is driver-assistance software not subject to rules for autonomous vehicles, such as background checks for drivers. But the Los Angeles Times reported in January that the California Department of Motor Vehicles is taking a fresh look in light of Tesla’s recent safety issues.

Those issues include an ongoing NHTSA probe into Teslas crashing into emergency vehicles while on Autopilot, as well as consumer complaints of sudden “phantom braking” under certain circumstances.

Tesla’s previous recall of its “self-driving” software, in October, involved a version that could generate a false forward-collision warning and trigger hard braking, according to the recall notice.

A Washington Post report this month said NHTSA had received more than 100 reports of phantom braking in three months.
