Safety tests show Tesla vehicles ‘repeatedly’ fail to detect children
Recent safety tests show Tesla’s ‘Full Self-Driving’ (FSD) Beta technology fails to recognize children, adding to concerns about its safety as the company makes the software available to more users.
First, a safety test conducted by the ‘Dawn Project’ (via The Guardian) found that a Tesla Model 3 with FSD “repeatedly struck [a] child mannequin in a manner that would be fatal to an actual child.” The Dawn Project seeks to improve the safety and reliability of software by stopping the use of commercial-grade software in safety-critical systems.
Further, investor Taylor Ogan shared a short video on Twitter comparing a Tesla with a vehicle equipped with LiDAR tech from Luminar. In the video, the Tesla hits the child mannequin while the LiDAR-equipped car manages to stop. In follow-up tweets, Ogan criticized Tesla for not adopting LiDAR technology for its autonomous vehicle software.
It’s 2022, and Teslas still aren’t stopping for children. pic.twitter.com/GGBh6sAYZS
— Taylor Ogan (@TaylorOgan) August 9, 2022
LiDAR, for those unfamiliar with the term, stands for ‘light detection and ranging’ (sometimes ‘laser imaging, detection, and ranging’). The tech determines the distance to an object by bouncing a laser pulse off it and measuring how long the reflection takes to return; since light travels at a known speed, the round-trip time translates directly into range.
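To make that concrete, here is a minimal sketch of the time-of-flight arithmetic behind LiDAR ranging. The function name and the 200-nanosecond example are illustrative assumptions, not code from Tesla, Luminar, or the Dawn Project:

```python
# Minimal illustration of LiDAR time-of-flight ranging (hypothetical
# example, not vendor code). A pulse travels to the object and back,
# so the one-way distance is half the round trip.

SPEED_OF_LIGHT_M_PER_S = 299_792_458

def lidar_range_m(round_trip_time_s: float) -> float:
    """Distance in metres, given the pulse's round-trip time in seconds."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2

# A reflection arriving 200 nanoseconds after the pulse fires
# corresponds to an object roughly 30 metres away.
print(f"{lidar_range_m(200e-9):.1f} m")  # -> 30.0 m
```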
The Dawn Project test will form part of an advertising campaign intended to encourage U.S. Congress to ban Tesla’s FSD.
Our new safety test of @ElonMusk’s Full Self-Driving Teslas discovered that they will indiscriminately mow down children.
Today @RealDawnProject launches a nationwide TV ad campaign demanding @NHTSAgov ban Full Self-Driving until @ElonMusk proves it won’t mow down children. pic.twitter.com/i5Jtb38GjH
— Dan O’Dowd (@RealDanODowd) August 9, 2022
Tesla and CEO Elon Musk have so far disputed concerns over the safety of FSD. At the same time, the U.S. National Highway Traffic Safety Administration (NHTSA) has launched investigations and requested information from the company about FSD. It’s worth noting that Tesla has also recently made FSD available in Canada.
A common line of defence is that FSD is a driver-assistance feature that still requires an attentive driver, not a fully autonomous system. And while Tesla does state this on its website, the name (Full Self-Driving) suggests otherwise. Moreover, Tesla has made the software available to thousands of Tesla owners to use on public roads, many of whom have misused FSD. Tesla has also delayed or pulled FSD updates over bugs and other issues several times, and even fired an employee who shared a video of flaws in the FSD system.
There are clear safety concerns at play here, and critics have highlighted them in an effort to get governments to regulate the use of autonomous driving systems on public roads until the technology is safer and more reliable. Some Tesla fans have responded by attacking these critics online, with one Twitter user going so far as to ask for a child volunteer to run in front of their FSD-equipped Tesla to prove it would stop.
me: the government should better regulate unproven autonomous cars to protect people
Tesla fan: let’s see if a self-driving Tesla will hit a kid pic.twitter.com/7LHZXY0uG8
— Jacob Silverman (@SilvermanJacob) August 10, 2022
“I promise I won’t run them over,” the person wrote. Yeah, sure, bud.
Source: Dawn Project, Taylor Ogan (Twitter) Via: The Guardian