Tesla knew Autopilot caused death, but didn’t fix it
Software’s alleged inability to handle cross traffic central to court battle after two road deaths

  • r00ty
    1
    1 year ago

    Well, if it’s just the lane-assistance Autopilot that is causing this kind of crash, I’d agree it’s likely user error. The reason I say “if” is that I don’t trust journalists to know or report on the difference.

    I am still concerned that the FSD beta is “out there”, though. I do not trust normal users to understand what “beta” means, and of course no one is going to read the agreement before clicking agree. They just want to see their car drive itself.

    • @[email protected]
      2
      1 year ago

      If it were about the FSD implementation, things would be very different. I’m pretty sure that the FSD is designed to handle cross traffic, though.

      I do not trust normal users to understand what beta means

      Yeah, Google kinda destroyed that word in the public consciousness when it kept the beta flag on its search for more than a decade while growing into one of the biggest companies on Earth with it.

      When I first heard about it, I was very surprised that the US even allows vehicles with beta self-driving software on public roads. That’s like testing a new fire truck by randomly setting buildings in a city on fire and then trying to put them out with it.

    • Ocelot
      1
      edit-2
      1 year ago

      Yeah, I don’t trust a machine that has been trained for millions of hours and simulated every possible traffic scenario tens of millions of times and has millisecond reaction time while seeing the world in a full 360 degrees. A system that never drives drunk, distracted or fatigued. You know who’s really good at driving though? Humans. Perfect track record, those humans.