The U.S. government's highway safety agency is again investigating Tesla's "Full Self-Driving" system, this time after receiving reports of crashes in low-visibility conditions, including one in which a pedestrian was killed.
The National Highway Traffic Safety Administration said in documents that it opened the investigation Oct. 17 after the company reported four crashes in which Teslas entered areas of reduced roadway visibility, including sun glare, fog and airborne dust.
In addition to the pedestrian's death, another of the crashes involved an injury, the agency said.
Investigators will examine the ability of "Full Self-Driving" to "detect and respond appropriately to reduced roadway visibility conditions and, if so, the contributing circumstances for these crashes."
The investigation covers approximately 2.4 million Tesla vehicles from model years 2016 to 2024.
A message seeking comment was left with Tesla early on the morning of Oct. 18. The company has repeatedly said the system cannot drive itself and that human drivers must be ready to intervene at all times.
Last week, Tesla held an event at a Hollywood studio to unveil a fully autonomous robotaxi with no steering wheel or pedals. CEO Elon Musk said the company plans to have fully autonomous vehicles operating without human drivers next year, with robotaxis expected to become available in 2026.
The agency also said it will review whether any other similar crashes involving "Full Self-Driving" have occurred in low-visibility conditions, and it will seek information from the company on whether any updates have affected the system's performance in those conditions.
"Specifically, this review will evaluate the timing, purpose and capabilities of any such updates, as well as Tesla's assessment of their safety impact," the documents said.
Tesla has twice recalled "Full Self-Driving" under pressure from the agency, which in July requested information from law enforcement and the company after a Tesla using the system struck and killed a motorcyclist near Seattle.
Both recalls were issued because the system was programmed to run stop signs at slow speeds and because it disobeyed other traffic laws. Both problems were to be fixed with online software updates.
Critics have said that Tesla's system, which relies only on cameras to spot hazards, lacks the sensors needed to be fully self-driving. Nearly all other companies working on autonomous vehicles use radar and laser sensors in addition to cameras so their vehicles can see better in the dark or in low-visibility conditions.
The "Full Self-Driving" recall came after a three-year investigation into Tesla's less-sophisticated Autopilot system crashing into emergency vehicles and other vehicles parked on highways, many of them with warning lights flashing.
That investigation was closed last April after the agency pressured Tesla into recalling its vehicles to bolster a weak system for making sure drivers pay attention. A few weeks after the recall, NHTSA began investigating whether the fix was working.
The investigation opened Oct. 17 takes NHTSA into new territory: the agency has previously treated Tesla's systems as driver assistance rather than self-driving. With the new probe, it is focusing on the capabilities of "Full Self-Driving" itself rather than just on keeping drivers attentive.
This story was reported by the Associated Press.