The horrifying revelations in The Tesla Files


Tesla is very secretive about the cars it produces. In particular, it is very reluctant to release information about their safety record. While the company blocks attempts to obtain its crash records, a whistleblower has released what are being called the Tesla Files, based on internal records. On 24 July, the Guardian published an edited extract from The Tesla Files by Sönke Iwersen and Michael Verfürden that reveals horrifying details about the kinds of crashes that Teslas have been involved in and how that information is suppressed. What the files reveal, the article suggests, is that it is perfectly reasonable to be mortally afraid of these cars.

The most disturbing thing that I read was that Tesla collects vast amounts of real-time data from its cars all over the world, but the existence of this data has been kept secret from regulators. When any of the cars crash, this data would be invaluable to investigators looking for the cause, but many do not ask for it, presumably because they are unaware that the data exists. Even when they do ask, the company stonewalls, leaving the victims of the crashes and their families frustrated.

One of the controversial features that Elon Musk likes to boast about is that he is producing self-driving cars, but they do not live up to the hype.

Autonomous driving is the core promise around which Elon Musk has built his company. Tesla has never delivered a truly self-driving vehicle, yet the richest person in the world keeps repeating the claim that his cars will soon drive entirely without human help. Is Tesla’s autopilot really as advanced as he says?

The Tesla Files suggest otherwise. They contain more than 2,400 customer complaints about unintended acceleration and more than 1,500 braking issues – 139 involving emergency braking without cause, and 383 phantom braking events triggered by false collision warnings. More than 1,000 crashes are documented. A separate spreadsheet on driver-assistance incidents where customers raised safety concerns lists more than 3,000 entries. The oldest date from 2015, the most recent from March 2022. In that time, Tesla delivered roughly 2.6m vehicles with autopilot software. Most incidents occurred in the US, but there have also been complaints from Europe and Asia.

Customers described their cars suddenly accelerating or braking hard. Some escaped with a scare; others ended up in ditches, crashing into walls or colliding with oncoming vehicles. “After dropping my son off in his school parking lot, as I go to make a right-hand exit it lurches forward suddenly,” one complaint read. Another said, “My autopilot failed/malfunctioned this morning (car didn’t brake) and I almost rear-ended somebody at 65mph.” A third reported, “Today, while my wife was driving with our baby in the car, it suddenly accelerated out of nowhere.”

Braking for no reason caused just as much distress. “Our car just stopped on the highway. That was terrifying,” a Tesla driver wrote. Another complained, “Frequent phantom braking on two-lane highways. Makes the autopilot almost unusable.” Some report their car “jumped lanes unexpectedly”, causing them to hit a concrete barrier, or veered into oncoming traffic.

A really disturbing revelation is what happens when the cars catch fire after crashing, as lithium batteries are prone to do. The doors cannot be opened, and firefighters can only watch helplessly as the people inside burn to death. This is due to a basic design flaw involving the retractable door handles that Musk insisted upon.

Elon Musk is a perfectionist with a tendency towards micromanagement. At Tesla, his whims seem to override every argument – even in matters of life and death. During our reporting, we came across the issue of door handles. On Teslas, they retract into the doors while the cars are being driven. The system depends on battery power. If an airbag deploys, the doors are supposed to unlock automatically and the handles extend – at least, that’s what the Model S manual says.

The idea for the sleek, futuristic design stems from Musk himself. He insisted on retractable handles, despite repeated warnings from engineers. Since 2018, they have been linked to at least four fatal accidents in Europe and the US, in which five people died.

In February 2024, we reported on a particularly tragic case: a fatal crash on a country road near Dobbrikow, in Brandenburg, Germany. Two 18-year-olds were killed when the Tesla they were in slammed into a tree and caught fire. First responders couldn’t open the doors because the handles were retracted. The teenagers burned to death in the back seat.

This led to an extraordinary recommendation.

Germany’s largest automobile club, ADAC, issued a public recommendation that Tesla drivers should carry emergency window hammers. In a statement, ADAC warned that retractable door handles could seriously hinder rescue efforts. Even trained emergency responders, it said, may struggle to reach trapped passengers. Tesla shows no intention of changing the design.

The idea that people should routinely carry hammers big enough to break windows to escape is mind-boggling. Even with one, some people may not have the strength to break the glass, given how little room there is inside a car to take a big swing. And then there is the added danger from the shattered glass.


That’s Musk. He prefers the sleek look of Teslas without handles, so he accepts the risk to his customers. His thinking, it seems, goes something like this: at some point, the engineers will figure out a technical fix. The same logic applies to his grander vision of autonomous driving: because Musk wants to be first, he lets customers test his unfinished Autopilot system on public roads. It’s a principle borrowed from the software world, where releasing apps in beta has long been standard practice. The more users, the more feedback and, over time – often years – something stable emerges. Revenue and market share arrive much earlier. The motto: if you wait, you lose.

What is worse is that, due to Tesla’s secrecy, the families of victims are deprived of knowledge of the cause of the crashes that killed their loved ones, being given only very superficial information. The article describes one woman whose husband died in such a fire.

It was a Monday afternoon in June 2023 when Rita Meier, 45, joined us for a video call. Meier told us about the last time she said goodbye to her husband, Stefan, five years earlier. He had been leaving their home near Lake Constance, Germany, heading for a trade fair in Milan.

Meier recalled how he hesitated between taking his Tesla Model S or her BMW. He had never driven the Tesla that far before. He checked the route for charging stations along the way and ultimately decided to try it. Rita had a bad feeling. She stayed home with their three children, the youngest less than a year old.

At 3.18pm on 10 May 2018, Stefan Meier lost control of his Model S on the A2 highway near the Monte Ceneri tunnel. Travelling at about 100kmh (62mph), he ploughed through several warning markers and traffic signs before crashing into a slanted guardrail. “The collision with the guardrail launches the vehicle into the air, where it flips several times before landing,” investigators would write later.

The car came to rest more than 70 metres away, on the opposite side of the road, leaving a trail of wreckage. According to witnesses, the Model S burst into flames while still airborne. Several passersby tried to open the doors and rescue the driver, but they couldn’t unlock the car. When they heard explosions and saw flames through the windows, they retreated. Even the firefighters, who arrived 20 minutes later, could do nothing but watch the Tesla burn.

Meier’s account was structured and precise. Only once did the toll become visible – when she described how her husband’s body burned in full view of the firefighters. Her eyes filled with tears and her voice cracked. She apologised, turning away.

To this day, Meier still doesn’t know why her husband died. She has kept everything the police gave her after their inconclusive investigation. The charred wreck of the Model S sits in a garage Meier rents specifically for that purpose. The scorched phone – which she had forensically analysed at her own expense, to no avail – sits in a drawer at home. Maybe someday all this will be needed again, she says. She hasn’t given up hope of uncovering the truth.

Rita Meier wasn’t the only widow to approach us. Disappointed customers, current and former employees, analysts and lawyers were sharing links to our reporting. Many of them contacted us. More than once, someone wrote that it was about time someone stood up to Tesla – and to Elon Musk.

Meier, too, shared our articles and the callout form with others in her network – including people who, like her, lost loved ones in Tesla crashes. One of them was Anke Schuster. Like Meier, she had lost her husband in a Tesla crash that defies explanation and had spent years chasing answers. And, like Meier, she had found her husband’s Model X listed in the Tesla Files. Once again, the incident was marked as resolved – with no indication of what that actually meant.

“My husband died in an unexplained and inexplicable accident,” Schuster wrote in her first email. Her dealings with police, prosecutors and insurance companies, she said, had been “hell”. No one seemed to understand how a Tesla works. “I lost my husband. His four daughters lost their father. And no one ever cared.”

Her husband, Oliver, was a tech enthusiast, fascinated by Musk. A hotelier by trade, he owned no fewer than four Teslas. He loved the cars. She hated them – especially the autopilot. The way the software seemed to make decisions on its own never sat right with her. Now, she felt as if her instincts had been confirmed in the worst way.

We searched for other crashes. One involved Hans von Ohain, a 33-year-old Tesla employee from Evergreen, Colorado. On 16 May 2022, he crashed into a tree on his way home from a golf outing and the car burst into flames. Von Ohain died at the scene. His passenger survived and told police that von Ohain, who had been drinking, had activated Full Self-Driving. Tesla, however, said it couldn’t confirm whether the system was engaged – because no vehicle data was transmitted for the incident.

Then, in February 2024, Musk himself stepped in. The Tesla CEO claimed von Ohain had never downloaded the latest version of the software – so it couldn’t have caused the crash. Friends of von Ohain, however, told US media he had shown them the system. His passenger that day, who barely escaped with his life, told reporters that hours earlier the car had already driven erratically by itself. “The first time it happened, I was like, ‘Is that normal?’” he recalled asking von Ohain. “And he was like, ‘Yeah, that happens every now and then.’”

His account was bolstered by von Ohain’s widow, who explained to the media how overjoyed her husband had been at working for Tesla. Reportedly, von Ohain received the Full Self-Driving system as a perk. His widow explained how he would use the system almost every time he got behind the wheel: “It was jerky, but we were like, that comes with the territory of new technology. We knew the technology had to learn, and we were willing to be part of that.”

The Colorado State Patrol investigated but closed the case without blaming Tesla. It reported that no usable data was recovered.

There are other examples of what Tesla’s data collection makes possible. We found the case of David and Sheila Brown, who died in August 2020 when their Model 3 ran a red light at 114mph in Saratoga, California. Investigators managed to reconstruct every detail, thanks to Tesla’s vehicle data. It shows exactly when the Browns opened a door, unfastened a seatbelt, and how hard the driver pressed the accelerator – down to the millisecond, right up to the moment of impact. Over time, we found more cases, more detailed accident reports. The data definitely is there – until it isn’t.
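To give a concrete sense of what millisecond-resolution vehicle data makes possible, here is a minimal sketch of the kind of timeline reconstruction investigators performed in the Brown case. The record format, channel names and values are hypothetical, invented purely for illustration; they do not reflect Tesla’s actual telemetry format.

```python
from dataclasses import dataclass

# Hypothetical event records. The channel names and values are
# invented for illustration and do not reflect Tesla's real format.
@dataclass
class VehicleEvent:
    t_ms: int      # milliseconds relative to impact (negative = before)
    channel: str   # e.g. "accel_pedal", "seatbelt", "door"
    value: str     # recorded state or reading

events = [
    VehicleEvent(-4200, "accel_pedal", "92% applied"),
    VehicleEvent(-1500, "seatbelt", "driver unbuckled"),
    VehicleEvent(-800, "door", "front passenger door opened"),
    VehicleEvent(0, "impact", "frontal collision detected"),
]

# Reconstructing the timeline is then just a sort and a printout,
# each entry placed relative to the moment of impact.
for ev in sorted(events, key=lambda e: e.t_ms):
    print(f"T{ev.t_ms:+6d} ms  [{ev.channel}]  {ev.value}")
```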

Security researchers were able to hack into the Tesla system and largely figured out what was going on.

We talked to independent experts at the Technical University Berlin. Three PhD candidates – Christian Werling, Niclas Kühnapfel and Hans Niklas Jacob – made headlines for hacking Tesla’s autopilot hardware. A brief voltage drop on a circuit board turned out to be just enough to trick the system into opening up.

The security researchers uncovered what they called “Elon Mode” – a hidden setting in which the car drives fully autonomously, without requiring the driver to keep his hands on the wheel. They also managed to recover deleted data, including video footage recorded by a Tesla driver. And they traced exactly what data Tesla sends to its servers – and what it doesn’t.

The hackers explained that Tesla stores data in three places. First, on a memory card inside the onboard computer – essentially a running log of the vehicle’s digital brain. Second, on the event data recorder – a black box that captures a few seconds before and after a crash. And third, on Tesla’s servers, assuming the vehicle uploads them.

The researchers told us they had found an internal database embedded in the system – one built around so-called trigger events. If, for example, the airbag deploys or the car hits an obstacle, the system is designed to save a defined set of data to the black box – and transmit it to Tesla’s servers. Unless the vehicles were in a complete network dead zone, in both the Meier and Schuster cases, the cars should have recorded and transmitted that data.
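To picture the trigger-event mechanism the researchers describe, here is a minimal sketch of a black-box recorder built around a ring buffer. Every name and parameter in it (the class, the buffer length, the trigger reasons, the file name) is an assumption made for illustration, not Tesla’s actual implementation.

```python
import collections
import json
import time

BUFFER_SECONDS = 5        # assumed pre-crash window kept in memory
SAMPLES_PER_SECOND = 10   # assumed telemetry sampling rate

class EventDataRecorder:
    """Illustrative trigger-based recorder: telemetry streams into a
    fixed-size ring buffer, and a trigger event (airbag deployment,
    obstacle impact) freezes a snapshot that is written locally and
    queued for upload to a server."""

    def __init__(self):
        self.buffer = collections.deque(
            maxlen=BUFFER_SECONDS * SAMPLES_PER_SECOND)
        self.upload_queue = []

    def record_sample(self, sample: dict):
        # Continuous logging: old samples silently fall off the buffer.
        sample["t"] = time.time()
        self.buffer.append(sample)

    def on_trigger(self, reason: str):
        # On a trigger event, a defined set of data is saved to the
        # "black box" and queued for transmission to the server.
        snapshot = {"reason": reason, "samples": list(self.buffer)}
        with open("black_box_snapshot.json", "w") as f:
            json.dump(snapshot, f)           # local black-box copy
        self.upload_queue.append(snapshot)   # sent when a network is available

recorder = EventDataRecorder()
recorder.record_sample({"speed_kmh": 98, "accel_pedal_pct": 40})
recorder.on_trigger("airbag_deployed")
```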

The journalists found that Tesla was very sparing with the data it gave to investigators.

In many crashes when Teslas inexplicably veered off the road or hit stationary objects, investigators didn’t actually request data from the company. When we asked authorities why, there was often silence. Our impression was that many prosecutors and police officers weren’t even aware that asking was an option. In other cases, they acted only when pushed by victims’ families.

The Schuster case played out similarly. Prosecutors in Stralsund, Germany, were baffled. The road where the crash happened is straight, the asphalt was dry and the weather at the time of the accident was clear. Anke Schuster kept urging the authorities to examine Tesla’s telemetry data.

When prosecutors did formally request the data recorded by Schuster’s car on the day of the crash, it took Tesla more than two weeks to respond – and when it did, the answer was both brief and bold. The company didn’t say there was no data. It said that there was “no relevant data”. The authorities’ reaction left us stunned. We expected prosecutors to push back – to tell Tesla that deciding what’s relevant is their job, not the company’s. But they didn’t. Instead, they closed the case.

The hackers from TU Berlin pointed us to a study by the Netherlands Forensic Institute, an independent division of the ministry of justice and security. In October 2021, the NFI published findings showing it had successfully accessed the onboard memories of all major Tesla models. The researchers compared their results with accident cases in which police had requested data from Tesla. Their conclusion was that while Tesla formally complied with those requests, it omitted large volumes of data that might have proved useful.

There’s more. Two years prior, the NHTSA had flagged something strange – something suspicious. In a separate report, it documented 16 cases in which Tesla vehicles crashed into stationary emergency vehicles. In each, autopilot disengaged “less than one second before impact” – far too little time for the driver to react. Critics warn that this behaviour could allow Tesla to argue in court that autopilot was not active at the moment of impact, potentially dodging responsibility.

This is the background to the following video.

The YouTuber Mark Rober, a former engineer at Nasa, replicated this behaviour in an experiment on 15 March 2025. He simulated a range of hazardous situations, in which the Model Y performed significantly worse than a competing vehicle. The Tesla repeatedly ran over a crash-test dummy without braking. The video went viral, amassing more than 14m views within a few days.

The first part describes how he used a LiDAR (light detection and ranging) scanner to figure out the shape of the Space Mountain ride at Disneyland. After about the 8:00 minute mark, he compares two semi-autonomous vehicles, one using LiDAR and the other a Tesla that uses ordinary cameras, to see how they fare when confronted with obstacles. The results are terrifying.

I will never buy a Tesla.

Comments

  1. Matt G says

    I have heard (famous last words) that Tesla’s autopilot shuts down moments before an imminent crash so Tesla can claim it was the driver’s fault. Someone here will correct me if I’m wrong.

  2. Mano Singham says

    Matt G,

    You are not wrong. The post has a passage that says exactly that.

    Two years prior, the NHTSA had flagged something strange – something suspicious. In a separate report, it documented 16 cases in which Tesla vehicles crashed into stationary emergency vehicles. In each, autopilot disengaged “less than one second before impact” – far too little time for the driver to react. Critics warn that this behaviour could allow Tesla to argue in court that autopilot was not active at the moment of impact, potentially dodging responsibility.

  3. Militant Agnostic says

    I will never buy a Tesla.

    I always thought that it would be much better to buy an EV from a company that had a track record of designing and building conventional cars than from a company run by software “engineers”.

  4. johnson catman says

    The idea that people should routinely carry hammers big enough to break windows to escape is mind-boggling. Even with one, some people may not have the strength to break the glass, given how little room there is inside a car to take a big swing.

    You don’t need a big hammer. An example is the Amazon Basics Emergency Seat Belt Cutter and Window Hammer Tool, product dimensions 7.52″L x 2.84″W x 1.37″Th. It has a sharp point to shatter the glass, then you kick it out. It also has a seat belt cutter.

  5. flex says

    #4, johnson catman beat me to it, but remember that most of the window hammer tools are good for tempered glass and not laminated glass. Your windshield is laminated glass, your door glass is tempered glass, so use the tool on your door glass.

    That’s usually the case; I have no idea what is used on Teslas, or what is used in sunroofs.

  6. Mano Singham says

    The very fact that we are having this discussion, about what kind of glass a car uses and what kind of hammer would be best for breaking the windows and escaping, illustrates how surreal the situation is.

    Ordinary people should not have to even think about this.

  7. Jazzlet says

    Mythbusters fans, on the other hand, do know this due to the episode on how to get out of a car that has ended up in deep water. I still don’t keep the appropriate tool in my car as I don’t think I’d be able to actually do what you are supposed to do, partly because odds are I’d be trying to get my dog out as well as me, and there isn’t that much time for such activity.

  8. lanir says

    I own a Tesla Model 3. For various reasons mostly centering around poor decisions by Musk, I won’t be replacing it with another.

    If we’re talking safety I think it’s useful to talk about all of the info available. I’ll add what I know here and try to keep it focused on the facts.

    I was in an accident in my Tesla this February. The vehicle had recorded the whole trip onto the local USB drive. It also recorded a separate clip of one minute from the external cameras on all 4 sides, with the impact occurring halfway through. I have not hacked the car, so I have no information on what other data is available or what was sent to the company.

    OPINION: This seems useful to me, although it could use a more friendly user interface. I found it by shuffling through files on the USB and guessing based on date and time in the names.

    This next bit is not unique to Tesla. All electric vehicles weigh more and have a lower center of gravity because the batteries tend to be underneath. The combination of these two factors can make current guardrails significantly less effective at stopping them. I found two videos from early 2024 about this.

    This video shows a Rivian truck plowing straight through a standard metal barrier and jumping over a concrete barrier beyond it. The Rivian is going 60 and it is driven straight at the guardrails and barriers.

    https://www.youtube.com/watch?v=T_PsypZTxlw

    This second video shows a Tesla Model 3 impacting a guardrail at an angle. Because the center of gravity is lower than that of the vehicles the guardrail was designed for, the guardrail slides right over the Tesla. After this happens, the only stopping power remaining in the guardrail appears to come from the posts the guardrail was mounted on.

    https://www.youtube.com/watch?v=ZLwMroMmpC4

    OPINION: I’m not sure how useful the first video is. A very heavy thing impacting a safety feature head-on, where it’s weakest, at high speed? I’m not sure who would expect that to end well. I don’t think I did. The second is more concerning. It looked to me like the guardrail’s effectiveness was reduced to just having a few posts in the ground. The guardrail is actually designed so the vehicle slides along it rather than under it. Sliding along it allows it to distribute the force along its length to all of the posts rather than just the ones the vehicle directly impacts. This changes how much stopping power the guardrail has, but more importantly, I think, it greatly affects the trajectory of the vehicle and where it ends up after the impact.
