Tesla's Full Self-Driving Beta Appears to Have Caused Its First Major Crash


Photo: Spencer Platt (Getty Images)

A new complaint filed with the National Highway Traffic Safety Administration (NHTSA) details what appears to be the first major crash involving a Tesla using the Full Self-Driving (FSD) beta. The crash report, viewed by The Verge, comes just one week after the company was forced to recall 11,704 vehicles over an FSD-related glitch.

According to the report, the crash allegedly involved a Model Y in FSD mode that crashed in Brea, California, on November 3 after mistakenly turning into the wrong lane. The Tesla was then struck by another vehicle, leaving the car “severely damaged” on the driver’s side. Nobody was injured in the crash, according to the report.

Gizmodo reached out to both Tesla and the NHTSA for more details on the report but has not heard back.

Car safety experts and regulators have been sounding alarm bells over Full Self-Driving all year. In July, Consumer Reports warned that FSD lacked adequate safeguards and was leading Tesla cars to miss turns, scrape against bushes, and, in some cases, hurl themselves towards parked cars. Consumer Reports cautioned that FSD’s safety lapses posed a danger not only to Tesla drivers themselves, but also to pedestrians, cyclists, and other motorists.

Just a few months later, U.S. National Transportation Safety Board Chairwoman Jennifer Homendy criticized the company for letting drivers request access to the service before it had overcome what the agency viewed as “design shortcomings.” Homendy doubled down on those criticisms in an interview with CNBC in late October, claiming Tesla’s description of FSD as “self-driving” was “misleading” and potentially encouraged users to use the service irresponsibly; that is, to let the car, as the name suggests, drive itself.

As a reminder, Tesla has admitted its FSD only achieves Level 2 autonomy on the SAE’s six-level scale, which runs from Level 0 (no automation) to Level 5 (full automation). On that point, Democratic Sens. Richard Blumenthal and Edward Markey have urged the FTC to investigate whether Tesla’s “Full Self-Driving” branding amounts to false advertising.

All of this is to say it was simply a matter of time before a major crash involving FSD occurred, something Musk himself acknowledged in a tweet earlier this year when he told drivers to “please be paranoid” and warned that the FSD beta would bring “unknown issues.”

The big question now is what comes next. All signs have pointed toward an increased appetite for driver-assistance and autonomous-vehicle regulation this year, but so far, nothing concrete has materialized. It’s almost guaranteed we will see more FSD crashes like the one detailed here as the beta expands to even larger audiences. Any injuries that result could be enough to force regulators to act.
