The Dawn Project Releases Video Evidence Allegedly Showing Teslas Misbehaving


As a child-sized dummy sits in a lane of traffic on a two-lane rural highway, a Tesla in fully self-driving mode hurtles toward it. At the wheel: a giant teddy bear. The vehicle's driver monitoring system does not issue any warning. The front end hits the dummy and sends it flying into the air. And the car continues driving as if nothing had happened.

It's the latest salvo from the activist organization the Dawn Project, which posts videos aimed at showing how badly Tesla's automated driving technology can behave. Dan O'Dowd, the wealthy, tech-savvy activist who founded and self-funds the Dawn Project, said he wants to ensure that “the safety-critical systems that everyone's lives depend on are fail-safe and cannot be hacked.”

Although O'Dowd's stated goal is brand-agnostic, his main target since the Dawn Project's launch in 2021 has been Tesla and its controversial Autopilot and Full Self-Driving systems. So far, he said, he has spent $10 million on a campaign to persuade Tesla to fix what he sees as its safety problems, and to push for government regulation if it doesn't.

(Video courtesy of the Dawn Project.)

O'Dowd, 67, is an expert in secure systems. A Caltech-trained engineer, he has designed military-grade operating systems for intercontinental missiles, fighter jets, bombers, and airliners, as well as microprocessors for NASA spacecraft. He made his fortune through Green Hills Software, a company he founded in 1982 that develops fail-safe operating systems and programming tools for commercial clients.

He is personally offended, he said, because what he calls Tesla's half-assed automated systems are being allowed on public roads before serious faults have been resolved through testing off public streets. He notes that since 2021, at least 23 people have died in crashes involving Autopilot or Full Self-Driving. That count comes from the National Highway Traffic Safety Administration, which also reports 840 Autopilot- and FSD-related crashes over the same period.

Autopilot is similar to the driver-assistance programs sold by other automakers, offering automated cruise control, lane keeping, and lane changes. Full Self-Driving, a far more expensive option at $15,000, is marketed as a system that can handle city traffic, including traffic lights and turns at intersections. Because Full Self-Driving requires constant human attention, and therefore does not actually drive itself, the California Department of Motor Vehicles says it is investigating whether Tesla is fraudulently marketing the product.

O'Dowd said that after sending videos to NHTSA, he spoke with several NHTSA officials via Zoom, who said they had an ongoing investigation but could not discuss it until it was complete.

When asked for comment, NHTSA reiterated the message: “NHTSA's Autopilot investigation remains open and the agency generally does not comment on open investigations.”

O'Dowd admits that some skepticism about his evidence is to be expected. To counter it, he has invited NHTSA, the DMV, Elon Musk and Tesla itself to replicate his tests. He has asked them all to visit the Dawn Project in Santa Barbara to evaluate his team's methods, and to bring their own engineers to check for any funny business. So far, there have been no takers.

(Video courtesy of the Dawn Project.)

The 'teddy bear test'

Critics on social media have accused O'Dowd of applying his expertise to manipulate the cars' software and hardware and fake the defects. In addition to NHTSA, Tesla and the DMV, he has invited Tesla fans to the Dawn Project to evaluate its methodology.

In June, O'Dowd teamed up with noted Tesla fan Ross Gerber, a Santa Monica investment manager, to take a ride together in Gerber's FSD-equipped Tesla. At one point, FSD ignored a stop sign and Gerber had to brake hard to avoid a collision.

Another fan, John Bernal, a former Tesla employee who still drives a Model 3, performed the teddy bear test. (Bernal says he was fired after posting videos critical of FSD.)

Bernal drove his car on a private road with no traffic around. He set up the child mannequin and then had two giant stuffed animals take turns “driving”: a teddy bear and a pink unicorn. He put a 30-pound kettlebell on the driver's seat to trick the car into thinking a person was sitting there, and attached a weight to the steering wheel, a device sold online that is designed to mimic the pressure of a human hand on the wheel, which the owner's manual says is required.

With a Dawn Project driver in the passenger seat, Bernal put the car in motion. As in previous videos, the Tesla plowed into the child mannequin. In one case, it slowed for a few seconds after the collision and then kept going. Another video shows a Dawn Project driver fooling Tesla's monitoring system by wearing sunglasses; he works on a laptop for a while, then slumps over and pretends to fall asleep while the car drives on.

Musk and Tesla did not respond to requests for comment on the videos. In the past, Musk has tweeted that O'Dowd's criticisms are “crazy.”

Like cars from other manufacturers with automated driving systems, such as Ford with BlueCruise, GM with Super Cruise, and BMW with Driving Assistant Plus, later Tesla models are equipped with what is known as a driver monitoring system. These systems use sensors on the windshield to watch drivers and make sure they are paying attention to the road. If not, the driver hears a series of warnings, and eventually, if the system concludes the driver still isn't paying attention, the car will stop. Bernal said that in another test the system went 10 minutes before issuing a warning.

Tesla's monitoring system is simple compared with others on the market. Tesla uses just one camera, while most others add infrared sensors that allow more sophisticated analysis of subtle behaviors such as gaze and head movement.

Colin Barnden, who follows the driver monitoring market for Semicast Research in London, said he ranks Tesla's system “as the worst driver monitoring system on the market. Dead last. … Without infrared illumination, it is as close to useless as possible.”

The United States sets no standards for driver monitoring systems, although NHTSA is studying the issue. Barnden points out that Europe already has such standards and that “the United States is five years behind Europe.”

“NHTSA's goal is to save lives,” Barnden said. “If they are not going to follow the European example and establish minimum standards, [loss of life] is what's going to happen.”

NHTSA is currently investigating Tesla over a number of alleged defects, including sudden acceleration, sudden braking, steering wheel malfunctions, and crashes into police cars, fire trucks and ambulances while on Autopilot.
