U.S. Investigates Tesla’s Full Self-Driving System Following Fatal Crashes

The U.S. government’s road safety agency has launched an investigation into Tesla’s “Full Self-Driving” system after receiving reports of crashes occurring in low-visibility conditions, including one incident that resulted in the death of a pedestrian.

The National Highway Traffic Safety Administration (NHTSA) stated in documents that it opened the probe on Thursday after Tesla reported four crashes involving its vehicles encountering sun glare, fog, and airborne dust. Among these incidents, one resulted in a pedestrian fatality, while another involved injuries.

Investigators will assess whether the “Full Self-Driving” system can effectively detect and respond to reduced visibility conditions and will analyze the contributing circumstances of these crashes. The investigation encompasses approximately 2.4 million Teslas manufactured between the 2016 and 2024 model years.

A request for comment was submitted to Tesla on Friday. The company has consistently stated that the system is not capable of driving itself and that human drivers must remain ready to intervene at all times.

Last week, Tesla held an event at a Hollywood studio to unveil a fully autonomous robotaxi lacking a steering wheel or pedals. CEO Elon Musk, who has promised self-driving Teslas before, said the company expects autonomous Models Y and 3 to operate without human drivers next year. He indicated that robotaxis without steering wheels would be available in California and Texas starting in 2026.

The investigation’s implications for Tesla’s self-driving ambitions remain uncertain. NHTSA would need to approve any robotaxi design that lacks pedals or a steering wheel, a decision unlikely to come before the investigation concludes. If the company instead tries to deploy autonomous vehicles in its existing models, that would largely fall under state regulation; there are currently no federal regulations specifically addressing autonomous vehicles, though they must comply with broader safety standards.

NHTSA also indicated that it will examine whether similar crashes involving “Full Self-Driving” occurred under low-visibility conditions and will seek information from Tesla regarding any software updates that might have affected the system’s performance in those situations. This review will focus on the timing, purpose, and capabilities of any updates and Tesla’s assessment of their safety impacts.

Tesla reported the four crashes to NHTSA under an agency order that applies to all automakers. An agency database shows the pedestrian was killed in Rimrock, Arizona, in November 2023 after being struck by a 2021 Tesla Model Y. The crash occurred on Interstate 17 shortly after 5 p.m. on November 27, when two other vehicles collided and blocked the left lane. A Toyota 4Runner stopped, and two people got out to help direct traffic; a red Tesla Model Y then struck the 4Runner and one of them. A 71-year-old woman from Mesa, Arizona, was pronounced dead at the scene.

Raul Garcia, public information officer for the Arizona Department of Public Safety, said sun glare impaired the Tesla driver’s visibility, so the driver was not charged. Sun glare was also cited as a contributing factor in the initial collision that blocked the lane.

Tesla has twice recalled the “Full Self-Driving” system under pressure from NHTSA, which sought information from the company in July 2024 after a Tesla using the system struck and killed a motorcyclist near Seattle in April 2024. The recalls addressed the system running stop signs at low speeds and disobeying other traffic laws; both problems were to be fixed through online software updates.

Critics argue that Tesla’s system, which relies solely on cameras to detect hazards, lacks the sensors needed to achieve full self-driving capability. In contrast, nearly all other companies developing autonomous vehicles use radar and laser sensors alongside cameras to see better in darkness or poor visibility. Musk has defended the approach, arguing that because humans drive using only eyesight, cars should be able to navigate with cameras alone. He has described lidar (light detection and ranging), which uses lasers to identify objects, as a “fool’s errand.”

The “Full Self-Driving” recalls followed a three-year investigation into Tesla’s less sophisticated Autopilot system, which had been involved in collisions with parked emergency vehicles, many of which had flashing warning lights. That investigation was closed last April after NHTSA pressured Tesla into recalling its vehicles to strengthen the system that makes sure drivers are paying attention. A few weeks later, NHTSA opened a new inquiry into whether that recall fix was working.

NHTSA began its Autopilot crash investigation in 2021 after receiving 11 reports of Teslas using Autopilot colliding with parked emergency vehicles. In the documentation explaining the investigation’s closure, NHTSA revealed that it identified 467 crashes involving Autopilot, resulting in 54 injuries and 14 fatalities. Autopilot serves as a sophisticated version of cruise control, while “Full Self-Driving” is marketed by Musk as capable of operating without human intervention.

The recent investigation marks a new phase for NHTSA, which had previously viewed Tesla’s systems as assistive rather than fully autonomous. The current inquiry focuses on the capabilities of “Full Self-Driving,” shifting the emphasis from driver attention to the system’s ability to detect safety hazards.

Michael Brooks, executive director of the nonprofit Center for Auto Safety, noted that the previous investigation did not consider why Teslas failed to recognize and stop for emergency vehicles. “Previously, they were kind of putting the onus on the driver rather than the car,” he explained. “Now they’re asserting that these systems are incapable of appropriately detecting safety hazards, regardless of whether the drivers are paying attention.”
