Tesla faces allegations of covering up accident records related to its self-driving technology
Amid controversies over its Autopilot advertising
Tesla is accused of covering up records of accidents that occurred while Autopilot was in use. The Wall Street Journal (WSJ) reported that Tesla and the National Highway Traffic Safety Administration (NHTSA) withheld key information about reported crashes, such as specific details and the dates of the incidents.
A WSJ analysis of more than 200 recorded crashes involving Tesla's Autopilot reveals a previously unknown pattern: in some cases, Autopilot failed to avoid obstacles on the road or even veered off the road entirely.
Tesla quickly responds, claiming accident records are confidential
In 2016, Tesla CEO Elon Musk promoted the autonomous driving capability of Tesla vehicles, saying the cars could drive on city streets, merge onto highways, and then find parking spaces. With the launch of FSD (Full Self-Driving) scheduled for 2022, the company drew criticism for misleading people into believing autonomous driving was flawless, announcing that "The car will be able to take you from your home to your work, your friend's house, the grocery store without you touching the wheel."
As this controversy arose amid criticism that Tesla over-advertised its Autopilot, Tesla quickly stepped forward to explain itself. Tesla claims it did not intentionally hide the data, saying it is classified as confidential business information. NHTSA also noted that personal information must be protected under US federal law.
Says it will provide accident records if necessary
But access remains difficult
Tesla says it controls access to crash data but will provide the information if needed for police investigations or litigation. However, the WSJ's position is that verifying Tesla's claims, such as determining which cameras are used for autonomous driving recordings and how the recorded footage is processed, would be difficult without a vehicle computer expert.
It's unclear whether Tesla deliberately covered up its accident history, but the investigation has fueled criticism that Autopilot isn't as reliable as Tesla has advertised. Since early 2024, Tesla has been offering FSD free for one month to Tesla owners in the US, but a series of accidents has occurred while the function was in use.
Accidents Continue to Happen While Using FSD
Negative Impact on Tesla’s Core Business
A Tesla owner in the US posted a video of an accident and a damaged wheel, saying his vehicle scraped a curb during a right turn while the FSD function was on. There has also been continued criticism that drivers had to intervene more often than expected to avoid narrow roads or parked vehicles.
Tesla continues to develop autonomous driving technology as one of its core businesses and plans to introduce Level 5 fully autonomous driving, which would allow the car to operate without any driver intervention. As the controversy over its autonomous driving function is likely to negatively impact this core business, attention is now focused on how Tesla will respond.