TL;DR: Self-Driving Teslas Rear-End Motorcyclists, Killing at Least 5

Brevity is the soul of wit, and I am just not that witty. This is a long article; here is the gist of it:

  • The NHTSA’s self-driving crash data reveals that Tesla’s self-driving technology is, by far, the most dangerous for motorcyclists, with five fatal crashes that we know of.
  • This issue is unique to Tesla. Other self-driving manufacturers have logged zero motorcycle fatalities with the NHTSA in the same time frame.
  • The crashes are overwhelmingly Teslas rear-ending motorcyclists.

Read our full analysis as we go case-by-case and connect the heavily redacted government data to news reports and police documents.

Oh, and read our thoughts about what this means for the robotaxi launch that is slated for Austin in less than 60 days.

  • jonne@infosec.pub · 66 points · 2 days ago
    1. Self-driving turns itself off seconds before a crash, giving the driver an impossibly short timespan to rectify the situation.
    • KayLeadfoot@fedia.io (OP) · 64 points · 2 days ago

      … Also accurate.

      God, it really is a nut punch. The system detects the crash is imminent.

      Rather than automatically try to evade… the self-driving tech turns off. I assume it is to reduce liability or make the stats look better. God.

      • jonne@infosec.pub · 37 points · 2 days ago (edited)

        Yep, that one was purely about hitting a certain KPI of ‘miles driven on autopilot without incident’. If it turns off before the accident, technically the driver was in control and to blame, so it won’t show up in the stats and probably also won’t be investigated by the NTSB.

          • KayLeadfoot@fedia.io (OP) · 23 points · 2 days ago

            NHTSA collects data if self-driving tech was active within 30 seconds of the impact.

            The companies themselves do all sorts of wildcat shit with their numbers. Tesla’s claimed safety factor right now is 8x human: driving with FSD is supposedly eight times safer than the average human driver, or so they say on their stock earnings calls. Of course, that isn’t supported by any data I’ve seen; they haven’t published data that makes the claim externally verifiable (unlike Waymo, which has excellent academic articles and insurance papers written about its 12x-safer-than-human system).

            • NotMyOldRedditName@lemmy.world · 2 points · 1 day ago (edited)

              > So to drive with FSD is 8x safer than your average human driver.

              WITH a supervising human.

              Once it reaches a certain quality, it should be safer with a human properly supervising it, because if the car tries to do something really stupid, the human takes over. The vast, vast majority of crashes are from inattentive drivers, which is obviously a problem, and they need to keep improving the attentiveness monitoring, but supervised FSD should be safer than a human alone because it can also detect things the human would ultimately miss.

              Now, if you take the human entirely out of the equation, I very much doubt that FSD is safer than a human in its current state.

          • jonne@infosec.pub · +9/−1 · 2 days ago

            If they ever fixed it, I’m sure Musk has fired whoever was keeping score by now. He’s going to launch the robotaxi stuff soon and it’s going to kill a bunch of people.

    • NeoNachtwaechter@lemmy.world · 18 points · 2 days ago

      Even when it is just milliseconds before the crash, the computer turns itself off.

      Later, Tesla brags that the autopilot was not in use during this (terribly, overwhelmingly) unfortunate accident.