
Tesla Full Self-Driving fails to notice child-sized objects in testing

The latest version of Tesla’s Full Self-Driving (FSD) beta has a bit of a kink: it doesn’t appear to notice child-sized objects in its path, according to a campaign group.

In tests performed by The Dawn Project using a Tesla Model 3 equipped with FSD version 10.12.2 (the latest, released June 1), the vehicle was given 120 yards (110 meters) of straight track between two rows of cones with a child-sized mannequin at the end.

The group says the “test driver’s hands were never on the wheel.” Crucially, Tesla says even FSD is not a fully autonomous system: it’s a super-cruise-control program with various features, such as auto lane changing and automated steering. You’re supposed to keep your hands on the wheel and be able to take over at any time.

Traveling at approximately 25mph (about 40kph), the Tesla hit the dummy each time.

Of the results, The Dawn Project said 100 yards of distance is more than enough for a human driver to notice a child, stating: “Tesla’s Full Self-Driving software fails this simple and safety critical test repeatedly, with potentially deadly results.”

“Elon Musk says Tesla’s Full Self-Driving software is ‘amazing.’ It’s not… This is the worst commercial software I’ve ever seen,” said the project’s founder, Dan O’Dowd, in a video he tweeted out along with the results.

O’Dowd, who also founded Green Hills Software in 1982 and advocates for software safety, has been an opponent of Tesla for some time, even launching a bid for US Senate in California that centered on policing Tesla as a way to talk about broader cybersecurity issues. O’Dowd’s Senate bid ended in June when he lost the Democratic party primary. 

The Dawn Project’s stated goal is “making computers safe for humanity.” Tesla FSD is the Project’s first campaign. 

Small sample size

It’s worth noting that The Dawn Project’s tests of FSD 10.12.2, which took place on June 21 in Rosamond, CA, consisted of only three runs. That’s a small sample size, but considering other Tesla tests and statistics, it’s not an unexpected result.

Malfunctions in Autopilot – Tesla’s suite of driver-assistance software that includes regular Autopilot as well as FSD – have been cited as an alleged factor in several fatal accidents involving both drivers and pedestrians over the years. Last year Tesla rolled back FSD software releases after bugs were discovered that caused trouble with left turns, something Tesla is still working on.

In early June, the US National Highway Traffic Safety Administration upgraded a probe of Tesla Autopilot after it found reasons to look into whether “Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks.” The investigation is ongoing.

A week after announcing its probe, the NHTSA said regular Tesla Autopilot (operating at level 2, not FSD) was involved in 270 of the 394 driver-assist accidents – around 70 percent – it cataloged as part of an investigation into the safety of driver assist technology.

Most recently, the California Department of Motor Vehicles filed complaints against Tesla alleging the biz misrepresented claims the vehicles can drive autonomously. If Tesla doesn’t respond to the DMV’s claims by the end of this week, the case will be settled by default and could lead to the automaker losing its license to sell cars in California.

The Dawn Project said the NHTSA has acted quickly to issue recalls of Tesla features in the past, pointing to the NHTSA-spurred recalls of FSD code that allowed Teslas to roll past stop signs, and the disabling of Tesla’s Boombox feature.

The Dawn Project says its research “is far more serious and urgent.” ®
