Tesla’s autonomous driving system is under scrutiny after non-regulatory tests published this week appeared to show repeated failures to detect child-sized dummies.
In research conducted by safe-technology advocacy group The Dawn Project, a 2019 Tesla Model 3 reportedly running the latest version of the brand’s Full Self-Driving (FSD) beta – released in June – repeatedly hit a stationary, child-sized mannequin placed in the middle of a racetrack.
The claims form part of an ad campaign set to air across the United States, calling for the public to pressure Congress into banning Tesla’s autonomous driving technology.
Funded by California tech CEO and notable Tesla critic, Dan O’Dowd, the advocacy group has been actively testing Tesla’s autonomous driving software since 2021.
O’Dowd made a bid for the US Senate earlier this year on a single-issue platform of banning Tesla’s self-driving technology.
The test was completed at the Willow Springs International Raceway in California, with the vehicle given 110 metres of straight track between two cones and a small mannequin placed at the end.
At a set speed of 40 miles per hour (64 km/h), the professional driver was told to keep their hands off the steering wheel and only brake if contact was made with the object.
The 32-second video appears to show three instances of the Tesla failing to stop for the mannequin, with the car attempting to swerve around the object in two of the runs and decelerating in each attempt – though not enough to prevent an impact.
Further vision released by The Dawn Project verifies FSD was activated during the tests; however, WhichCar has reached out to the advocacy group for comment on the number of tests completed, as critics claim FSD was not active at the time and that the test results are "inconsistent" with what's shown in the video.
In all three attempts published, the Model 3 repeatedly warns the driver to keep their hands on the steering wheel.
The test driver, Art Haynie, said the vehicle appeared to “start to stagger as if lost and confused, slow down a little, and then speed back up as it hit and ran over the mannequins.”
Haynie has worked for several carmakers, including Porsche, and is also a professional pilot and driving instructor.
Tesla has not publicly responded to the claims.
Despite the name, Tesla states its Full Self-Driving system – which remains in beta – is not designed to replace licensed drivers, instructing them to “keep their hands on the wheel and be prepared to take over at all times.”
According to the American carmaker, more than 100,000 vehicles are currently using the Full Self-Driving beta program.
The Dawn Project claims the test was “designed to simulate a realistic life-and-death situation in which everyday motorists frequently find themselves: a small child walking across the road in a crosswalk.”
It notes the racetrack environment was chosen to ensure there were no other variables, such as moving or parked vehicles, weather, signage, buildings or shadows.
Of note, the independent test was not conducted with the oversight of a regulator, meaning it was not subject to the same standardised testing protocols as official assessments.
During testing by safety authority Euro NCAP in 2019, the autonomous emergency braking in the Tesla Model 3 successfully braked for vulnerable road users at 30 km/h and 60 km/h; however, the vehicle was not operating in Autopilot or Full Self-Driving mode.
A further video released by Twitter user Taylor Ogan this week shows a Tesla Model Y failing to stop for a child-sized object, even without FSD or Autopilot activated.
The video reportedly showcases the brand-new Tesla Model Y hitting the mannequin, while a 2022 Lexus RX450h equipped with Luminar's Iris LiDAR system successfully prevents a collision.
On social media, a number of Tesla investors and advocates have claimed the Tesla Vision system is sophisticated enough to detect ‘cardboard’ obstacles; however, the mannequins used in the test are made of foam and fabric.
The electric carmaker has previously faced trouble in Germany, with a Berlin court banning the use of the ‘Autopilot’ term to market semi-autonomous driving functions in 2020.
In addition, the Autopilot system was reportedly blamed for a recent incident in Melbourne involving an alleged hit-and-run, said to be the first time Victorian major collision investigators had been confronted with a case involving assisted driving technology.