• @[email protected]
    293
    15 days ago

    Ha, if only. Autopilot turns off right before a crash so that Tesla can claim it was off and blame it on the driver. Look it up.

    • @[email protected]
      27
      15 days ago

      Holy shit, I did indeed look it up, and it’s true. Dunno if it’ll hold up, but it’s still shady as shit.

      • @[email protected]
        15
        15 days ago

        Most states assign liability to whoever is in the driver’s seat anyway. If you are operating the vehicle, even if you’re not controlling it at that moment, you are expected to maintain safe operation.

        That’s why the Uber self-driving car that killed someone was ruled the test driver’s fault, leaving Uber mostly off the hook.

        Not sure how it works for the robo taxis, though.

        • @[email protected]
          6
          15 days ago

          Yeah that’s gonna be tricky with those. I live in Vegas where they’re already operating. No steering wheel at all.

        • @[email protected]
          1
          15 days ago

          Well… What about blaming the passengers?

          Now, I would like to imagine the legal case of an accident involving a self-driving robo-taxi transporting another robot to a facility (owned by the company).

          Maybe they can blame the humans hurt in the accident?

        • @[email protected]
          3
          15 days ago

          I don’t know the specifics of how the law is implemented, but the self-driving levels are defined such that from SAE Level 3 onward you may have a case against the manufacturer. I haven’t kept up to date with Tesla’s SAE level, but I imagine they’re still officially at Level 2 because it lets them keep their hands clean.

    • TrackinDaKraken
      97
      15 days ago

      I didn’t know this, but I’m not shocked, or even a little bit surprised.

      • @[email protected]
        49
        15 days ago

        Mark Rober made a video testing the autopilot systems of several cars, using his own Tesla. The car turned off autopilot right as he crashed through a styrofoam wall.

        • @[email protected]
          41
          15 days ago

          This is how they claim autopilot is safer than human drivers. In reality, Tesla has one of the highest fatality rates, but magically all of those deaths happen when autopilot was “off”.

      • @[email protected]
        1
        15 days ago

        But Tesla never claims you can stop supervising the car, so if you didn’t notice and intervene, it’s your fault. Fuck Elon and all that, but it is somewhat reasonable.

    • @[email protected]
      5
      14 days ago

      The driver is always to blame, even if it was on. They turn it off for the marketing claims.

      PS: fuck elon