An investigation entitled “The Tesla Files,” picked up by The Guardian, has raised serious concerns about the safety of Tesla vehicles. Two German journalists collected thousands of documents and testimonies to investigate the numerous technical anomalies involving the company’s driver-assistance system.
Despite Elon Musk’s stated obsession with design and innovation, his cars have been involved in hundreds of accidents, some of them fatal. Drivers and passengers have been trapped in burning vehicles or subjected to sudden braking at high speed.
One of the most emblematic cases is that of Stefan Meier, who died in 2018 while driving a Tesla Model S in Switzerland. The car went out of control, knocked down road signs, crashed into a guardrail, and flipped over several times, catching fire in midair. Rescuers were unable to open the doors to save the driver. His wife, Rita Meier, still does not know the cause of the accident.
The heart of the investigation concerns Tesla’s autopilot system, advertised as “Full Self-Driving” (FSD). Despite the name, it is a driver assistance system that requires constant driver supervision. Leaked internal documents show that Tesla has received more than 2,400 complaints about sudden acceleration; around 1,500 reports of braking problems, including 383 cases of “phantom braking” triggered by false collision warnings; more than 1,000 documented accidents; and more than 3,000 reports from customers concerned about the safety of the system.
Most of these reports come from the United States, but there are reports from Europe and Asia as well. Customers recount disturbing incidents of cars accelerating or braking on their own, systems failing to respond, and accidents narrowly avoided.
An internal report also reveals that a British engineer in charge of safety had flagged the need for clearer protocols. But U.S. leadership reportedly objected, fearing legal repercussions. Some documents suggest that Tesla deliberately avoided putting certain issues in writing precisely to reduce the risk of subpoenas.
In 2024, the National Highway Traffic Safety Administration confirmed that Tesla was not adequately monitoring whether drivers remained alert while using autopilot.
Despite this, Musk continues to promise that Teslas will soon drive themselves. But between accidents, troubling data, and questionable internal management, that futuristic vision now seems more marketing than concrete reality.