‘Boycott Tesla’ ads to air during Super Bowl — “Tesla dances away from liability in Autopilot crashes by pointing to a note buried deep in the owner’s manual that says Autopilot is only safe on fr…”

  • Ghostalmedia@lemmy.world · 5 months ago

    When I was looking to buy a new car back in early 2019, I walked into a showroom for a final test drive before I threw some money down for a Model 3.

    It started to rain pretty hard on the drive back. When executing an auto lane change, the sensors freaked out because of the water interference and violently yanked the car back into the original lane halfway through the lane change. It hydroplaned a bit and scared the shit out of my wife and me. The Tesla employee assured us, “It’s OK, this is normal.” Hearing that this was normal was not comforting.

    Upon returning to the showroom, a different Model 3 in the parking lot started backing toward a small child. My wife saw what was happening and threw herself in front of the car, which caused it to halt.

    I’m sure the software has progressed in the past five years, but suffice it to say, we changed our minds on the car at that time. Those two incidents within 15 minutes really made us question how that shit was legal.

    • Draupnir@lemmy.world · 5 months ago (edited)

      If the car was backing out, a human driver was in control, not Autopilot. Autopilot can only be enabled while driving on a well-marked roadway. The first part is plausible, however. The software at the time likely couldn’t handle rain appropriately, and you were absolutely right to question it if they told you that was normal.

      • Ghostalmedia@lemmy.world · 5 months ago

        The car was being summoned from a parking space. Summon / Smart Summon will absolutely back out of a space fully autonomously.

    • Fisch@lemmy.ml · 5 months ago (edited)

      That’s the thing: it’s only legal in the US (as far as I know, at least). In Germany, you’re only allowed to use self-driving if your hands are on the steering wheel at all times and you can take over if something goes wrong.

      • c0m47053@feddit.uk · 5 months ago

        I’m pretty sure that’s also the case in the US. These incidents are probably caused either by some sort of defeat device (I’ve seen weights that wrap around the steering wheel, though I have no idea if they work), or by people who have just gotten good at resting a hand on the wheel while not paying attention.

        • Fisch@lemmy.ml · 5 months ago

          I thought Tesla just added that by choice, not because it’s required by law.

      • eltrain123@lemmy.world · 5 months ago

        That’s the case in the US, too. If the car senses that the driver isn’t keeping their hands on the wheel, it produces a loud audible alert after a few seconds. After the third such warning, it automatically shuts off Autopilot and continues the audible alert until the driver takes back control.

        There are also several warnings about keeping your hands on the wheel and staying alert when engaging autopilot.

        The people saying otherwise are either ignorant or disingenuous.

    • JasSmith@sh.itjust.works · 5 months ago

      These errors are obviously alarming, but all the evidence we have suggests these systems are still safer than human drivers. They will make mistakes — and sometimes those mistakes will cost lives — but they will make fewer mistakes than humans. Given this, as visceral as these stories feel, I think our ire is misplaced. Automated driving will never be perfect. If that’s the bar we’re aiming for, we should just give up and go home. The goal is better than humans, and in many conditions, it’s already there.