After a day spent fishing in Key Largo, Florida, Dillon Angulo and Naibel Benavides Leon stopped their Chevy Tahoe on the side of the road to gaze at the stars. Then, as Angulo later described it, the “whole world just fell down.”
A Tesla operating on Autopilot sped through a T intersection at about 70 mph, propelling the young couple into the air. Benavides Leon was killed; Angulo was seriously injured. Police body-camera footage obtained by The Washington Post captured the shaken driver saying he was “driving on cruise” and had taken his eyes off the road after dropping his phone.
The 2019 crash reflects more than driver inattention; it happened on a rural road where Tesla’s Autopilot technology was not designed to be used. Exclusive dash-cam footage from the Tesla, obtained by The Post, shows the car blowing past a stop sign, a blinking light, and five yellow signs warning that the road ends and drivers must turn.
This crash is just one of at least eight fatal or serious accidents involving Tesla Autopilot on roads where the driver-assistance software could not be relied on to operate as intended. The Post identified the incidents through an analysis of federal data, legal records, and other public documents. The first occurred in 2016, when a Tesla slammed into a semi-truck crossing a U.S. route in Florida, killing the Tesla’s driver.
Tesla acknowledges in user manuals, legal documents, and communications with regulators that Autosteer, the key feature of Autopilot, is “intended for use on controlled-access highways” under certain conditions. Yet the company has done little to restrict where drivers can turn it on: although Tesla has the technical ability to limit Autopilot’s availability by geography, it has taken minimal steps to do so.
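Geofencing of this kind is conceptually simple. As a rough illustration only, the sketch below assumes a vehicle that can look up the classification of the road it is currently on in its onboard map data and refuse to engage the feature elsewhere; the names, types, and logic are hypothetical and do not describe Tesla’s actual software.

```python
# Hypothetical sketch: gating a driver-assistance feature by road class.
# All names here are illustrative assumptions, not Tesla's software; they
# presume a vehicle that can query map data for its current road segment.

from dataclasses import dataclass
from enum import Enum, auto


class RoadClass(Enum):
    CONTROLLED_ACCESS_HIGHWAY = auto()  # divided, no cross traffic
    RURAL_ROAD = auto()                 # e.g., a two-lane road ending in a T
    CITY_STREET = auto()


@dataclass
class RoadSegment:
    road_class: RoadClass
    has_cross_traffic: bool


def may_engage_autosteer(segment: RoadSegment) -> bool:
    """Allow engagement only on the road type the feature was designed for."""
    return (
        segment.road_class is RoadClass.CONTROLLED_ACCESS_HIGHWAY
        and not segment.has_cross_traffic
    )


# A rural road with a T intersection would be refused:
rural = RoadSegment(RoadClass.RURAL_ROAD, has_cross_traffic=True)
assert may_engage_autosteer(rural) is False
```

The hard part, as regulators note elsewhere in this story, is not the check itself but verifying map data and road conditions reliably enough to enforce it.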
Federal regulators have also failed to act decisively. After the 2016 crash, the National Transportation Safety Board (NTSB) called for limits on where driver-assistance technology such as Tesla’s Autopilot could be activated. But the NTSB has no regulatory authority over Tesla. The National Highway Traffic Safety Administration (NHTSA), part of the Department of Transportation, does have that power but has not taken meaningful action, straining the relationship between the two agencies.
Jennifer Homendy, the NTSB’s chair, said the 2016 crash should have prompted NHTSA to establish enforceable rules about where Tesla’s technology can be activated. She called the inaction a “real failure of the system” and questioned whether NHTSA is prioritizing safety.
In response, NHTSA said it is committed to safety but argued that verifying whether technology like Tesla Autopilot is being used under appropriate conditions would be complex and resource-intensive. Homendy was skeptical of that explanation, saying agencies and industries often insist a request is impossible until further tragedies force action.
Tesla, in court cases and public statements, has consistently argued that it is not liable for Autopilot-related crashes because the driver bears ultimate responsibility for the car’s trajectory. The string of Autopilot crashes underscores the risks of allowing a rapidly advancing technology to operate on public roads without stringent government oversight.
The approach contrasts with federal regulation of planes and railroads, where problems with new technology or equipment can prompt swift action. Unlike aircraft, which must pass a rigorous certification process, passenger car models are not prescreened; they need only comply with Federal Motor Vehicle Safety Standards. Critics argue that NHTSA’s reactive posture has allowed flawed technology to endanger Tesla drivers and the people around them.
The federal response also differs from how states, localities, and some companies have reacted to incidents involving autonomous vehicles. Uber temporarily halted its driverless program after a fatal crash in 2018, and the California Department of Motor Vehicles suspended Cruise’s permits after one of its driverless cars dragged a pedestrian.
While the NTSB has repeatedly called for restrictions on Autopilot, both Tesla and federal regulators have been slow to make substantial changes. The board’s appeals for “sensible safeguards, protocols, and minimum performance standards” have not produced decisive action. Enforceable rules on where Tesla’s technology can be activated remain a point of contention, with NHTSA focused instead on ensuring that drivers stay fully engaged while using advanced driver-assistance systems.
The series of fatal crashes involving Tesla Autopilot highlights the challenges of balancing innovation with safety and the need for comprehensive regulatory frameworks to govern evolving technologies on our roads.