DETROIT (AP) – Tesla is recalling nearly every vehicle it has sold in the U.S., more than 2 million, to update software and fix a flawed system that is supposed to make sure drivers are paying attention when using Autopilot.
Documents released Wednesday by U.S. safety regulators say the update will increase warnings and alerts to drivers and limit the areas where basic versions of Autopilot can be used.
The recall follows a two-year investigation by the National Highway Traffic Safety Administration into a series of crashes that occurred while Autopilot was in use, some of them fatal.
The agency said its investigation found that Autopilot's method of ensuring drivers are paying attention can be inadequate and can lead to "misuse of the system."
The additional controls and alerts are intended to encourage drivers to keep meeting their responsibility to drive, the documents said.
But safety experts say that while the recall is a good step, it still leaves responsibility with the driver and does not address the underlying problem that Tesla's automated systems have trouble spotting and stopping for obstacles in their path.
The recall covers Models Y, S, 3 and X produced between Oct. 5, 2012, and Dec. 7 of this year. The update is to be sent to some affected vehicles on Tuesday, with the rest receiving it later.
Shares of Tesla were up more than 3% Wednesday.
Autopilot includes features called Autosteer and Traffic Aware Cruise Control, with Autosteer intended for use on limited-access highways when it is not operating with a more sophisticated feature called Autosteer on City Streets.
The software update will limit where Autosteer can be used. If the driver attempts to engage Autosteer when the conditions for engagement are not met, the driver will be alerted through visual and audible indicators that it is unavailable, and Autosteer will not engage, the recall documents said.
Depending on a Tesla's hardware, the additional controls include making visual alerts more prominent, simplifying how Autosteer is turned on and off, and extra checks on whether Autosteer is being used outside of controlled-access roads and when approaching traffic control devices. Drivers can eventually be suspended from using Autosteer if they repeatedly fail to demonstrate continuous and sustained driving responsibility, the documents say.
According to the recall documents, agency investigators met with Tesla starting in October to explain their findings about the monitoring system. Tesla did not agree with NHTSA's analysis but consented to the recall on Dec. 5 in an effort to resolve the investigation.
Safety advocates have for years raised concerns about Tesla's driver monitoring system, which mainly detects whether the driver's hands are on the steering wheel. They have called for cameras to make sure a driver is paying attention, as other automakers with similar systems use.
Philip Koopman, a professor of electrical and computer engineering at Carnegie Mellon University who studies autonomous vehicle safety, called the software update a compromise that does not address the lack of night-vision cameras to watch drivers' eyes, or Teslas' failure to spot and stop for obstacles.
"The compromise is disappointing because it does not fix the problem that older cars do not have adequate hardware for driver monitoring," Koopman said.
Koopman and Michael Brooks, executive director of the Center for Auto Safety, argue that crashes into emergency vehicles remain an unaddressed safety risk. "It's not digging at the root of what the investigation is looking at," Brooks said. "It doesn't answer the question of why Teslas on Autopilot aren't detecting and responding to emergency activity."
According to Koopman, NHTSA apparently decided that the software change was the most it could get from the company, and that the benefits of doing so outweighed the cost of spending another year arguing with Tesla.
In its statement Wednesday, NHTSA said the investigation remains open “as we monitor the effectiveness of Tesla’s remedies and continue to work with the automaker to ensure the highest level of safety.”
Autopilot can steer, accelerate and brake automatically in its lane, but despite its name it is a driver-assist system and cannot drive itself. Independent tests have found that the monitoring system is easy to fool, and drivers have been caught driving drunk or even sitting in the back seat.
In its defect report filed with the safety agency, Tesla said Autopilot's controls "may not be sufficient to prevent driver misuse."
A message was left early Wednesday seeking further comment from the Austin, Texas, company.
Tesla says on its website that Autopilot and the more advanced Full Self Driving system are intended to assist drivers, who must be ready to intervene at all times. Full Self Driving is being tested by Tesla owners on public roads.
In a statement posted Monday on X, formerly Twitter, Tesla said safety is stronger when Autopilot is engaged.
NHTSA has dispatched investigators to 35 Tesla crashes since 2016 in which the agency suspects the vehicles were operating on an automated system. At least 17 people have died.
The crash investigations are part of a larger NHTSA probe into multiple instances of Teslas on Autopilot striking parked emergency vehicles. NHTSA has grown more aggressive in pursuing safety problems with Teslas, including a recall of the Full Self Driving software.
In May, Transportation Secretary Pete Buttigieg, whose department includes NHTSA, said Tesla should not call the system Autopilot because it cannot drive itself.