The US government’s road safety agency is once again investigating Tesla’s “full self-driving” system, this time after receiving reports of crashes in low visibility conditions, including one that killed a pedestrian.
The National Highway Traffic Safety Administration says in documents that it opened the investigation on October 17, with the company reporting four crashes after Tesla vehicles entered low-visibility areas, including sun glare, fog and airborne dust.
In addition to the pedestrian’s death, another accident occurred that injured a pedestrian, the agency said.
Investigators will look at whether “full self-driving” can “detect and respond appropriately to low-visibility conditions on the road, and if so, what conditions contributed to these crashes.”
The investigation covers nearly 2.4 million Tesla vehicles from 2016 through 2024.
A message was left early on October 18 seeking comment from Tesla, which has repeatedly said the system cannot drive itself and human drivers must be ready to intervene at all times.
Last week, Tesla held an event at a Hollywood studio to unveil a fully self-driving robotaxi with no steering wheel or pedals. Elon Musk, the company’s CEO, said Tesla plans to have fully self-driving cars operating without human drivers next year, with automated taxis available in 2026.
The agency also said it will look into whether any other similar incidents involving “full self-driving” have occurred in low-visibility conditions, and will request information from the company about whether any updates affected the system’s performance in those conditions.
“In particular, this review will evaluate the timing, purpose and capabilities of any such updates, as well as Tesla’s assessment of their impact on safety,” the documents said.
Tesla has twice recalled its “full self-driving” system under pressure from the agency, which in July requested information from law enforcement and the company after a Tesla vehicle using the system struck and killed a motorcyclist near Seattle.
The recalls were issued because the system was programmed to run stop signs at slow speeds and because it did not comply with other traffic laws. Both issues were to be fixed with over-the-air software updates.
Critics said Tesla’s system, which only uses cameras to detect hazards, does not have adequate sensors for fully autonomous driving. Almost all other companies working on self-driving vehicles use radar and laser sensors as well as cameras to see better in darkness or poor visibility conditions.
The “full self-driving” recalls came after a three-year investigation into Tesla’s less-sophisticated Autopilot system crashing into emergency vehicles and other vehicles parked on highways, many with warning lights flashing.
That investigation was closed last April after the agency pressured Tesla to recall its cars to enhance the weak system that ensures drivers pay attention. A few weeks after the recall, NHTSA began investigating whether the recall was successful.
The investigation enters new territory for NHTSA, which previously viewed Tesla’s systems as assisting drivers rather than driving the car themselves. With the new probe, the agency is scrutinizing the capabilities of “full self-driving” itself rather than just making sure drivers are paying attention.
This story was reported by the Associated Press.