Goodbye to trust in Elon Musk – a self-driving Tesla crashes, casting doubt on the safety of Autopilot at a time when smart vehicles are booming

Tesla once again leads… the controversies. Yes, it seems the automotive giant likes being at the center of a different drama every week. This time it involves an autonomous Model Y being tested on the streets of Austin, Texas. Sounds like just another news story so far, right?

Well, it turns out the company has indeed been testing autonomous driving… and a video has surfaced in which the car runs over a child. Wait, it’s not a real child, thank goodness!

It’s a staged scene by The Dawn Project, which used a Model Y and a child-sized dummy to expose the shortcomings of the company’s autonomous system, which, despite being sold as a solution, is actually a problem, and a big one.

The images are chilling (you can see them in the linked tweet at the end of this article), and according to the project’s founder, Dan O’Dowd, Tesla’s current software puts the company’s ambitious robotaxi plans ahead of safety, meaning it doesn’t react to the possibility of hitting someone.

Is it safe to get into a car without a driver?

The video that set off the alarms

As we said, in this demonstration a Model Y running the FSD (Supervised) 13.2.9 autonomous driving software failed to stop for the flashing red lights of a school bus parked on the shoulder, which signaled that passengers were getting off.

Immediately afterward, the car ran over the child dummy crossing the street without even flinching…

The dilemma of autonomous driving

The worst part is that the car “identified” the pedestrian but didn’t brake, nor did it stop after the impact; it just kept going…

And yes, a human driver might also have struggled to stop in time to avoid the dummy, but the law requires stopping for school bus lights precisely because there might be pedestrians crossing. Here, Tesla’s software failed spectacularly…

The Dawn Project and its crusade against Tesla

The Dawn Project, founded by businessman Dan O’Dowd, has been highlighting the risks of Tesla’s autonomous driving for some time. O’Dowd, who himself owns several Teslas, is critical of the company and says it hasn’t learned anything since his first public warning in a Super Bowl ad, because “they don’t care” and just want their robotaxi running on Austin’s streets at any cost. In other words, Tesla and Musk have priorities they won’t change even in the face of these safety failures.

Where does the controversy come from?

Of course, it’s not the first time Tesla has faced criticism over incidents with its software… Back in 2023, the NHTSA investigated a case in which a Model Y hit a student exiting a school bus while the driver was using the Autopilot system. It was found that the driver had manipulated the steering wheel to trick the hand-detection system.

Meanwhile, Musk has said that Tesla is testing its system a month earlier than planned and that one of the company’s goals is to get it on the streets (besides launching a spaceship to Mars by 2026, of course).

What sets Tesla apart from other vehicles?

While Waymo (a Google subsidiary) already operates robotaxis in Austin and San Francisco using LiDAR technology, Tesla refuses to use these more expensive sensors, relying instead on its cameras and software… For O’Dowd, this technical decision is a mistake that compromises everyone’s safety on the road (obviously).

O’Dowd insists that quality software could save hundreds of thousands of lives a year. But according to him, Tesla is not the company that will achieve it. “It won’t be Elon. It won’t be Tesla,” he said bluntly.

It’s clear that autonomous driving isn’t easy, but while Tesla boasts about advanced technology, we see that its driving system is still light-years away from ensuring safety. Fortunately, the video is of a fictional crash, but it could become a daily reality if action isn’t taken in time.
