Tesla's 'Full Self-Driving' Beta Appears to Have Caused Its First Major Crash

An NHTSA complaint alleges a Tesla Model Y with FSD engaged was "severely damaged" after mistakenly turning into the wrong lane and colliding with another car.

Photo: Spencer Platt (Getty Images)

A new complaint filed with the National Highway Traffic Safety Administration (NHTSA) details what appears to be the first major crash involving a Tesla using the Full Self-Driving (FSD) beta. The crash report, viewed by The Verge, comes just one week after the company was forced to recall 11,704 vehicles over an FSD-related glitch.

According to the report, a Model Y in FSD mode allegedly turned into the wrong lane while driving in Brea, California, on November 3 and was struck by another vehicle, leaving the Tesla “severely damaged” on the driver’s side. Nobody was injured in the crash.

In a statement sent to Gizmodo, an NHTSA spokesperson said the agency is aware of the complaint and is in talks with Tesla to gather more information. Tesla did not respond to Gizmodo’s request for comment.

Car safety experts and regulators have been sounding alarm bells all year over safety concerns related to Full Self-Driving. In July, Consumer Reports warned that FSD lacked adequate safeguards, leading Tesla cars to miss turns, scrape against bushes, and, in some cases, hurl themselves toward parked cars. The group warned that FSD’s safety lapses posed a danger not only to Tesla drivers themselves, but also to pedestrians, cyclists, and other motorists.

Just a few months later, U.S. National Transportation Safety Board Chairwoman Jennifer Homendy criticized the company for letting drivers request access to the service before it had overcome what the agency viewed as “design shortcomings.” Homendy doubled down on those criticisms in an interview with CNBC in late October, arguing that Tesla’s description of FSD as “self-driving” was “misleading” and potentially encouraged users to use the service irresponsibly, i.e. to, as the name suggests, let the car drive itself.

As a reminder, Tesla has admitted its FSD only achieves Level 2 autonomy on the SAE scale, which runs from 0 to 5. On that point, Democratic Sens. Richard Blumenthal and Edward Markey have urged the FTC to investigate whether Tesla’s marketing of FSD amounts to false advertising.

All of this is to say it was simply a matter of time before there was a major crash involving FSD, something Musk himself acknowledged in a tweet earlier this year in which he told drivers to “please be paranoid,” warning that the FSD beta would bring “unknown issues.”

The big question now is what comes next. All signs have pointed toward an increased appetite for regulating driver-assistance and autonomous-vehicle technology this year, but so far nothing concrete has materialized. It’s almost guaranteed we will see more crashes like the one detailed here as the FSD beta expands to even larger audiences. Any injuries resulting from those crashes could be enough to force regulators to act.

Update 7:30 a.m. ET: Added comment from NHTSA.
