Tesla knew Autopilot caused death, but didn’t fix it: Software’s alleged inability to handle cross traffic central to court battle after two road deaths

  • r00ty
    1 year ago

    They’re not buying a plane though. They’re buying a car with an autopilot that is labeled as “full self driving”. That term does imply it will handle a complete route from A to B.

    People are wrongly buying into the marketing hype and that is causing crashes.

    I’m very concerned about some of the things I’ve seen regarding FSD on Teslas, such as sudden hard braking on highways, failing to avoid accidents (but it’s OK, it disengaged seconds before impact, so the human was in control), and of course the viral video of FSD trying to kill a cyclist.

    They should not be allowed to market the feature this way and I don’t think it should be openly available to normal users as it is now. It’s just too dangerous to put in the hands (or not) of normal drivers.

    • Ocelot
      1 year ago

      Autopilot has never been “Full Self Driving”. FSD is an additional $15,000 package on top of the car. Autopilot is the free system providing lane keeping with adaptive cruise, the same as “Pro Pilot Assist” or “Honda Sensing” or any of the other packages from other car companies. The only difference is that when someone gets in an accident using one of those other technologies, we never get headlines about it.

    • @[email protected]
      1 year ago

      I’ve never sat in a Tesla, so I’m not really sure, but based on the things I’ve read online, autopilot and FSD are two different systems on Tesla cars you can engage separately. There shouldn’t be any confusion about this.

      • Miqo
        1 year ago

        I’ve never sat in a Tesla, so I’m not really sure

        There shouldn’t be any confusion about this.

        U wot m8?

      • r00ty
        1 year ago

        Well, if it’s just the lane-assistance Autopilot that’s causing this kind of crash, I’d agree it’s likely user error. The reason I say “if” is that I don’t trust journalists to know or report on the difference.

        I am still concerned that the FSD beta is “out there”, though. I do not trust normal users to understand what beta means, and of course no one is going to read the agreement before clicking agree. They just want to see their car drive itself.

        • @[email protected]
          1 year ago

          If it were about the FSD implementation, things would be very different. I’m pretty sure that the FSD is designed to handle cross traffic, though.

          I do not trust normal users to understand what beta means

          Yeah, Google kinda destroyed that word in the public consciousness when they kept the beta flag on their search for more than a decade while growing into one of the biggest companies on Earth with it.

          When I first heard about it, I was very surprised that the US even allows vehicles with beta self-driving software on public roads. That’s like testing a new fire truck by randomly setting buildings on fire in a city and then trying to put them out with the truck.

        • Ocelot
          1 year ago

          Yeah, I don’t trust a machine that has been trained for millions of hours and simulated every possible traffic scenario tens of millions of times and has millisecond reaction time while seeing the world in a full 360 degrees. A system that never drives drunk, distracted or fatigued. You know who’s really good at driving though? Humans. Perfect track record, those humans.