
We’re glad that Tesla has finally acknowledged that the flaws we found in our tests are true and real. To that extent, we have achieved our goal. - Dan O'Dowd
SANTA BARBARA, Calif. (PRWEB) February 16, 2023
Just days after The Dawn Project’s Super Bowl commercial aired video of the real-life risks of Tesla’s Full Self-Driving (FSD) software, the automaker announced the recall of 362,758 vehicles. While this is proof that Dan O’Dowd’s Dawn Project has been right all along, concerns remain. O’Dowd says we need more details about this recall. Is FSD being replaced with new software that is also unproven and potentially just as dangerous?
Dan O’Dowd and The Dawn Project have been alerting the public about the dangers of FSD for more than a year. “We’re glad that Tesla has finally acknowledged that the flaws we found in our tests are true and real,” O’Dowd says. “To that extent, we have achieved our goal.”
But in the notice on the National Highway Traffic Safety Administration (NHTSA) website, the Tesla recall does not appear to address other serious safety concerns documented by O’Dowd, such as:
- Tesla’s FSD ignores stopped school bus signals
- Tesla’s FSD fails to recognize children in crosswalks
The Dawn Project has videos verifying that its claims, now backed up by the recall, are both real and truthful.
According to the recall notice published on the NHTSA website, the FSD Beta system allows the vehicle to behave in an unpredictable manner, increasing the risk of a crash.
VIDEO: We have video proof of the software failures that the Tesla recall points out. Specifically, the current FSD Beta software:
- May allow the vehicle to act unsafely around intersections, such as:
  - Traveling straight through an intersection while in a turn-only lane.
  - Entering a stop sign-controlled intersection without coming to a complete stop.
  - Proceeding into an intersection during a steady yellow traffic signal without due caution.
- May respond insufficiently to changes in posted speed limits.
- May not adequately account for the driver's adjustment of the vehicle's speed to exceed posted speed limits.
Tesla’s FSD has undermined consumer confidence in self-driving cars at a time when industry and government should be developing systems that build public support for autonomous vehicles, which are clearly the wave of the future in a truly modern, mobile and efficient 21st century transportation network.
The Dawn Project cares about safe technology and believes in technological advancement. Self-driving cars are an achievable dream — but they can only be allowed on our roads if they perform flawlessly and with completely reliable safety.
The Dawn Project is eager to work with elected officials and regulators on putting rules in place to safeguard public safety. The failures embodied in Tesla’s flawed FSD system cannot be allowed to happen again, whether with Tesla or any other company.
Media Availability: Dan O’Dowd, Dawn Project Founder and leading software safety expert
Media Contact:
Marc Silverstein
202 716-9123
marc@onthemarcmedia.com