Tesla’s autonomous driving technology does not detect children on the road, according to tests.

    Professional test driver using Tesla’s Full Self-Driving mode repeatedly hit a child-sized mannequin in its path

    A Tesla Model 3 fitted with a full self-driving system. Photograph: Sjoerd van der Wal/Getty Images

    Edward Helmore
    Tue 9 Aug 2022 13.08 EDT. Last modified on Tue 9 Aug 2022 17.51 EDT

    A safe-technology advocacy group claimed on Tuesday that Tesla’s full self-driving software represents a potentially lethal threat to child pedestrians, the latest in a series of claims and investigations to hit the world’s leading electric carmaker over the technology.

    According to a safety test conducted by the Dawn Project, the latest version of Tesla Full Self-Driving (FSD) Beta software repeatedly hit a stationary, child-sized mannequin in its path. The claims that the technology apparently has trouble recognizing children form part of an ad campaign urging the public to pressure Congress to ban Tesla’s auto-driving technology.

    In several tests, a professional test driver found that the software – released in June – failed to detect the child-sized figure at an average speed of 25mph and the car then hit the mannequin. The Dawn Project’s founder, Dan O’Dowd, called the results “deeply disturbing”.

    O’Dowd added: “Elon Musk says Tesla’s Full Self-Driving software is ‘amazing’. It’s not. It’s a lethal threat to all Americans.

    “Over 100,000 Tesla drivers are already using the car’s Full Self-Driving mode on public roads, putting children at great risk in communities across the country.”

    O’Dowd argued that the test results show the need to prohibit self-driving cars until Tesla proves the vehicles “will not mow down children in crosswalks”.

    Tesla has repeatedly hit back at claims that its self-driving technology is too underdeveloped to guarantee the safety of either the car’s occupants or other road users.

    O’Dowd has drawn accusations that he is little more than a competitor to Tesla, since his company bills itself as an expert in making software used in automated driving systems. O’Dowd insists his company, Green Hills Software, doesn’t compete with Tesla, saying it doesn’t make self-driving cars, though he has acknowledged that some car companies use its software in certain components.

    After a fiery crash in Texas in 2021 that killed two, Musk tweeted that the autopilot feature, a less sophisticated version of FSD, was not switched on at the moment of collision.

    At the company’s shareholder meeting earlier this month, Musk said Full Self-Driving had greatly improved, and that he expected to make the software available by the end of the year to all owners who request it. But questions about its safety continue to mount.

    In June, the National Highway Traffic Safety Administration (NHTSA) said it was expanding an investigation into 830,000 Tesla cars across all four current model lines. The expansion came after analysis of a number of accidents revealed patterns in the cars’ performance and driver behavior.

    The NHTSA said the widened investigation would aim to examine the degree to which Tesla’s autopilot technology and associated systems “may exacerbate human factors or behavioral safety risks by undermining the effectiveness of the driver’s supervision”.

    A second NHTSA investigation is also under way to determine whether the removal of the forward-looking radar sensor on some newer Teslas is causing the vehicles to apply their brakes for no reason, a phenomenon known as “phantom braking” that can lead to wrecks.

    Since 2016, the agency has investigated 30 crashes involving Teslas equipped with automated driving systems, 19 of them fatal. NHTSA’s Office of Defects Investigation is also looking at the company’s autopilot technology in at least 11 crashes where Teslas hit emergency vehicles.

    Many such wrecks aren’t investigated by the NHTSA. And in nearly 400 crashes involving cars with driver-assist systems reported by automakers between July 2021 and this past May, more Teslas were involved than all other manufacturers combined.
