Tesla knew Autopilot caused death, but didn’t fix it: Software’s alleged inability to handle cross traffic central to court battle after two road deaths

  • @[email protected]
    38
    11 months ago

    Didn’t, or couldn’t? Tesla uses a vastly inferior technology to run their “automated” driving protocols. It’s a hardware problem first and foremost.

    It’s like trying to drive a car with a 720p camera mounted on the license plate holder versus a 4K camera on the top of the car. That’s not a perfect analogy, but it’s close enough for those not aware of how cheap these cars and their tech really are.

    • @[email protected]
      19
      11 months ago

      It remains to be seen what hardware is required for autonomous driving, as no company has a fully functioning system, so there is no baseline to compare against. Cruise (the “4K camera” in your analogy) just had to cut its fleet of geofenced vehicles after back-to-back crashes involving emergency vehicles, along with incidents of blocking traffic and running over things like fire hoses.

  • @[email protected]
    13
    11 months ago

    I remember reading about the ethical question about the hypothetical self driving car that loses control and can choose to either turn left and kill a child, turn right and kill a crowd of old people, or do nothing and hit a wall, killing the driver. It’s a question that doesn’t have a right answer, but it must be answered by anybody implementing a self driving car.

    I non-sarcastically feel like Tesla would implement this system by checking which option kills the fewest paying Xitter subscribers.
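
    For illustration only, here is a minimal sketch of the kind of harm-minimizing choice such a system would have to encode somewhere. Everything in it (option names, harm scores) is hypothetical; no manufacturer publishes this logic:

    ```python
    # Hypothetical sketch: choosing among unavoidable-collision options
    # by minimizing an estimated harm score. All names and numbers are
    # invented for illustration.
    from dataclasses import dataclass

    @dataclass
    class Option:
        name: str
        harm_score: float  # assumed aggregate harm estimate

    def choose(options: list[Option]) -> Option:
        # Pick whichever option is scored as least harmful.
        return min(options, key=lambda o: o.harm_score)

    options = [
        Option("swerve_left", 1.0),   # one pedestrian
        Option("swerve_right", 4.0),  # a crowd
        Option("continue", 1.0),      # the occupant
    ]
    print(choose(options).name)  # -> "swerve_left"
    ```

    Note the tie between “swerve_left” and “continue”: the code resolves it arbitrarily (first in the list wins), which is exactly the unanswered ethical question described above.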

    • Ocelot
      9
      11 months ago

      Meanwhile hundreds of people are killed in auto accidents every single day in the US. Even if a self driving car is 1000x safer than a human driver there will still be accidents as long as other humans are also sharing the same road.

      • @[email protected]
        10
        11 months ago

        When a human is found to be at fault, you can punish them.

        With automated driving, who do you punish? The company? Great. They pay a small fine and keep making millions while your loved one is gone and you get no justice.

        • @[email protected]
          8
          11 months ago

          People generally aren’t punished for an accident unless they did it intentionally or negligently. The better and more prevalent these systems get, the fewer the families with lost loved ones. Are you really arguing that this is a bad thing because it isn’t absolutely perfect and you can’t take vengeance on it?

          • @[email protected]
            1
            11 months ago

            Generally, people are punished for causing an accident, whether purposeful or not. Their insurance will either raise their rates or drop them, leaving them unable to drive. That is a form of punishment you don’t get with automated driving.

            • @[email protected]
              3
              11 months ago

              Of course you get the same with automated driving. Accidents will either cause the whole company’s insurance rate to rise, or the company will have to pay out of pocket. In both cases accidents carry a direct financial “punishment”, and if a car company is seen to be unsafe (see Cruise right now), it is not allowed to drive (or is allowed to drive less). I don’t see a big difference for normal people. After a while this is, in my opinion, even better, because “safer” companies will push out “less safe” companies… assuming, of course, that the government properly regulates this stuff so that a minimum of safety is required.

            • @[email protected]
              1
              11 months ago

              Increased rates aren’t a punishment; they’re a risk calculation. And insurance (outside of maybe property coverage, in case a tree falls on the car, for example) may not even be needed someday if everything is handled automatically without driver input. Why are you so stuck on the punishment aspect when these systems are already preventing needless deaths?

        • @[email protected]
          1
          edit-2
          10 months ago

          Punish and justice are synonymous… edit: WOW, bad typo. That should have read: punish and justice are NOT synonymous.

    • @[email protected]
      5
      11 months ago

      I think the whole premise is flawed, because the car would have to have suffered numerous failures before ever reaching a point where it needed to make this decision. This applies to humans, as we have free will. A computer does not.

    • Liz
      8
      11 months ago

      At the very least, they would prioritize the driver, because a driver who survives is likely to buy another Tesla in the future.

  • @[email protected]
    12
    11 months ago

    I am the last person to defend Elon and his company, but honestly it’s user error. It’s like blaming Microsoft when users deliberately ignore logic and download viruses. Autopilot should be called driver assist, and people still need to pay attention. The deaths were caused by user negligence.

    • @[email protected]
      2
      11 months ago

      Then you should call it driver assist, not autopilot.

      Also, Tesla’s advertising is built on claims of “having solved self-driving”.

    • q47tx
      2
      11 months ago

      Exactly. For maybe the next two decades, it should only be called driver assist.

    • Ocelot
      4
      11 months ago

      That is precisely why Autopilot is called a driver assist system. Just like every other manufacturer’s LKAS (lane keeping assist system).

        • Ocelot
          3
          11 months ago

          How is that confusing? If you look at what an airplane autopilot actually does, it maintains altitude and heading and makes turns at pre-determined points. Autopilot in an airplane does absolutely zero to avoid other airplanes or obstacles, and no airplane is equipped with an AP system that allows the pilot to leave the cockpit.

          Tesla autopilot maintains speed and distance from the car in front of you and keeps you in your lane. Nothing else. It is a perfect name for the system.

          • Flying Squid
            2
            11 months ago

            And you think the average person knows the capabilities of an airplane autopilot?

            • @[email protected]
              2
              11 months ago

              I’d hope they would before willfully getting behind the controls of one to operate it. Regardless of what we call it, these people still would have crashed. They both drove into the side of a semi while sitting in the driver’s seat with their hands on the wheel.

              • @[email protected]
                2
                edit-2
                11 months ago

                That’s because Tesla induced them to think it was Level 4 or 5, while FSD is Level 2 (like most Toyotas), just with a few extra options.

                And as long as there is a need for a human to assume responsibility, it will remain at Level 2.

                • @[email protected]
                  1
                  11 months ago

                  A) Autopilot and the FSD beta are two totally separate systems and FSD wasn’t even available as an option when one of these crashes occurred.

                  B) where’s the evidence that these drivers believed they were operating a level 4 or 5 system?

  • @[email protected]
    4
    11 months ago

    Yet Phoney Stark keeps whinging about the risks of AI, but at the same time slags off humans who actually know their stuff, especially regarding safety.

    • Ocelot
      7
      11 months ago

      Tesla Autopilot has nothing to do with AI. It is a lane keep assist system with cruise control.

      • @[email protected]
        4
        11 months ago

        It has a lot to do with AI. Their systems use a lot of deep learning to recognize agents and obstacles on the road (perception), to infer how those agents will move in the future (prediction), and to generate trajectories for the car (motion planning). It definitely isn’t Artificial General Intelligence, but it is most certainly AI.
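
        As a rough illustration of that perception/prediction/planning pipeline (a sketch only; none of these interfaces are Tesla’s, and all names are hypothetical):

        ```python
        # Hypothetical sketch of a perception -> prediction -> planning loop.
        import numpy as np

        def perceive(frames: list[np.ndarray]) -> list[dict]:
            """Detect agents/obstacles; a real system runs deep nets here."""
            return [{"pos": np.array([12.0, 3.0]), "vel": np.array([-5.0, 0.0])}]

        def predict(agents: list[dict], horizon_s: float = 2.0) -> list[np.ndarray]:
            """Naive constant-velocity forecast of each agent's position."""
            return [a["pos"] + a["vel"] * horizon_s for a in agents]

        def plan(ego_pos: np.ndarray, futures: list[np.ndarray]) -> str:
            """Trivial policy: brake if any forecast enters our safety bubble."""
            if any(np.linalg.norm(p - ego_pos) < 5.0 for p in futures):
                return "brake"
            return "keep_lane"

        frames = [np.zeros((720, 1280, 3), dtype=np.uint8)]  # placeholder camera frame
        print(plan(np.array([0.0, 0.0]), predict(perceive(frames))))  # -> "brake"
        ```

        In a production stack, each of those stubs is a learned model; the deep learning lives inside perceive and predict, which is the point being made above.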

        • Ocelot
          1
          11 months ago

          You are referring to FSD. Not Autopilot.

      • @[email protected]
        8
        11 months ago

        Which uses computer vision, which is a form of AI. It doesn’t have to be complex, or even work well, to be considered AI. All you need is a computer that makes decisions based on dynamic inputs and a set of rules on how to handle them.
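
        To that point, here is a toy example of “dynamic inputs plus a set of rules” (thresholds entirely hypothetical, not any manufacturer’s logic):

        ```python
        # Toy rule-based lane-keep controller. By the definition above,
        # even something this trivial counts as AI.

        def lane_keep_action(lateral_offset_m: float, lead_gap_m: float) -> str:
            if lead_gap_m < 10.0:        # too close to the car ahead
                return "brake"
            if lateral_offset_m > 0.3:   # drifting right of lane center
                return "steer_left"
            if lateral_offset_m < -0.3:  # drifting left of lane center
                return "steer_right"
            return "hold"

        print(lane_keep_action(0.5, 40.0))  # -> "steer_left"
        ```

        Whether the rules are hand-written like this or learned by a neural network only changes how sophisticated the behavior can get, not whether it qualifies as AI.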

      • @[email protected]
        10
        11 months ago

        Which is why the name Autopilot is (very dangerous) false advertising.

        Also, let’s not forget the “your Tesla will be an autonomous robotaxi” bullshit.

  • golamas1999
    8
    11 months ago

    Full Self Driving is such a scam name. The feature is Level 2 advanced cruise control.

  • @skymtf
    22
    11 months ago

    I feel like some people are such Tesla fanboys that they will argue when I say Tesla FSD is not real and never has been.

    • Ocelot
      7
      11 months ago

      I have nearly 20k miles on Tesla’s FSD platform; it works amazingly well for something that’s “not real”. There are countless YouTube channels out there where people mount a GoPro in their car and go for a drive. Some of them, like AIDRIVR and Whole Mars Catalog, pretty much never have to take over control of the car, and without any drama. Especially in the past ~6 months or so of development, it has been amazing.

  • @[email protected]
    1
    edit-2
    11 months ago

    There are like three comments in here talking about the technology; everyone else is arguing about names, as if people are magically absolved of personal responsibility when they believe advertising over common sense.

    • @[email protected]
      2
      11 months ago

      Because the tech has inarguably failed. It’s all about the lawyers and how long they can extend Tesla’s irresponsibility.

      • @[email protected]
        2
        11 months ago

        See, I would much rather have this discussion vs another one about advertising and names.

        We’re seeing progress. Ford is expanding features on BlueCruise (in-lane avoidance maneuvers, I believe). I think Mercedes is expanding the area where theirs works. Tesla added off-highway navigation in the last year.

        No one’s reached full autonomy for more than a few minutes or a few miles, but I wouldn’t say the tech has inarguably failed. In fact, I’d say they’re arguably making visible progress.

  • harold
    24
    edit-2
    11 months ago

    But I’m saving the planet and making sure Elon gets a cut of my money.

  • PatFusty
    13
    11 months ago

    The driver was also not even paying attention to the road, so the blame should be on him, not the car. People need to learn that Tesla’s version of Autopilot has a specific use case, and regular streets are not it.

    • @[email protected]
      8
      11 months ago

      I agree with you. This is about user error. Unfortunately, people seemed to believe their cars could literally drive themselves.

    • @[email protected]
      1
      11 months ago

      Tesla produces advertising videos that say ‘our cars drive themselves, you don’t need a driver’.

    • @[email protected]
      21
      edit-2
      11 months ago

      They really should have called it something other than Autopilot/FSD. It’s a driver assist, and everyone knows this, but the name makes it so easy to dunk on them when it fails, even though I’m 100% sure the system tells you to keep your eyes on the road and hands on the steering wheel when you engage it. If you crash a “self-driving” car, it’s the fault of the driver, not the vehicle.

  • @[email protected]
    23
    11 months ago

    I do agree the name and Tesla’s general advertising of driver assists are a bit misleading.

    But this is really on the driver for not paying attention.

    • @[email protected]
      53
      11 months ago

      “A bit misleading” is, I think, a bit of a misleading way to describe their marketing. It’s literally called Autopilot, and their marketing material has very aggressively pitched it as a ‘full self driving’ feature since the beginning, even without mentioning Musk’s own constant and ridiculous hyperbole when advertising it. It’s software that should never have been tested outside of vehicles run by company employees under controlled conditions, but Tesla chose to push it to the public as a paid feature and significantly downplay the fact that it is a poorly tested, unreliable beta, specifically to profit from the data generated by its widespread use, not to mention the price they charge for it as if it were a normal, ready to use consumer feature. Everything about their deployment of the system has been reckless, careless, and actively disdainful of their customers’ safety.

      • @[email protected]
        12
        11 months ago

        Everybody who has even a vague idea of what an autopilot in a plane actually does is not misled. Do people really think that commercial airline pilots just hit the “autopilot” button in their cockpit after disengaging the boarding ramp and then lean back until the boarding ramp at the destination is attached?

        • Einar
          24
          edit-2
          11 months ago

          So I need to understand the autopilot of a plane before I buy a car?

          I would be misled then, as I have no idea how such autopilots work. I also suspect that those two systems don’t really work the same. One flies, the other drives. One has traffic lights, the other doesn’t. One is operated by well-paid professionals, the other, well, by me. Call me simple, but there seem to be some major differences.

          • @[email protected]
            7
            11 months ago

            Yeah, there are some major differences between the vehicles, but both disengage when there’s anything out of the ordinary going on. Maybe people base their understanding of autopilots on the movie “Airplane!”, where the inflatable puppet gropes the stewardess afterwards.

              • Ocelot
                1
                11 months ago

                I’m sorry, what? If you set an airplane to maintain altitude and heading with autopilot, it will 100% fly you into the side of a mountain if there’s one in front of you.

              • @[email protected]
                2
                edit-2
                11 months ago

                True, good point. As far as I know, it does turn itself off if it detects something it can’t handle, though. The problem with cross traffic is that it obviously can’t detect it, otherwise turning itself off would already be a way of handling it.

                Proximity detection is far easier up in the air, especially if you’re not bound by the weird requirement to only use visible spectrum cameras.

                (To make things clear, I’m just defending the engineers there who had to work within these constraints. All of this is a pure management failure.)

          • @[email protected]
            2
            11 months ago

            This is a pretty absurd argument. You could apply this to literally any facet of driving.

            “I have to learn what each color of a traffic light means before driving?”

            “I have to learn what white and yellow paint means and dashes versus lines? This is too confusing”

            God help you when you get to 4-way stops and roundabouts.

            • Einar
              2
              edit-2
              11 months ago

              Not absurd, but reality. We do that in driving school.

              I don’t know where you are from and which teaching laws apply, of course, but I definitely learned all those lessons you mentioned.

              • @[email protected]
                1
                11 months ago

                That’s precisely my argument and why “learning my new car’s features is too confusing” is an absurd argument.

          • @[email protected]
            1
            11 months ago

            I would have thought people would read “autopilot” and think “automatic”. At least that’s what I do. I guess “pilot” is closely associated with planes, but that certainly isn’t what I think of.

        • r00ty
          19
          11 months ago

          They’re not buying a plane though. They’re buying a car with an autopilot that is labeled as “full self driving”. That term does imply it will handle a complete route from A to B.

          People are wrongly buying into the marketing hype and that is causing crashes.

          I’m very concerned about some of the things I’ve seen regarding FSD on Teslas, such as sudden hard braking on highways, failing to avoid an accident (but it’s OK, it disengaged seconds before impact, so the human was in control), and of course the viral video of FSD trying to kill a cyclist.

          They should not be allowed to market the feature this way and I don’t think it should be openly available to normal users as it is now. It’s just too dangerous to put in the hands (or not) of normal drivers.

          • Ocelot
            5
            edit-2
            11 months ago

            Autopilot has never been “Full Self Driving”. FSD is an additional $15,000 package on top of the car. Autopilot is the free system providing lane keeping with adaptive cruise, the same as “Pro Pilot Assist” or “Honda Sensing” or any of the other packages from other car companies. The only difference is that whenever someone gets in an accident using any of those other technologies, we never get headlines about it.

          • @[email protected]
            5
            11 months ago

            I’ve never sat in a Tesla, so I’m not really sure, but based on the things I’ve read online, autopilot and FSD are two different systems on Tesla cars you can engage separately. There shouldn’t be any confusion about this.

            • r00ty
              1
              11 months ago

              Well, if it’s just the lane-assistance Autopilot that is causing this kind of crash, I’d agree it’s likely user error. I say “if” because I don’t trust journalists to know or report on the difference.

              I am still concerned the FSD beta is “out there”, though. I do not trust normal users to understand what beta means, and of course no one is going to read the agreement before clicking agree. They just want to see their car drive itself.

              • Ocelot
                1
                edit-2
                11 months ago

                Yeah, I don’t trust a machine that has been trained for millions of hours and simulated every possible traffic scenario tens of millions of times and has millisecond reaction time while seeing the world in a full 360 degrees. A system that never drives drunk, distracted or fatigued. You know who’s really good at driving though? Humans. Perfect track record, those humans.

              • @[email protected]
                2
                11 months ago

                If it were about the FSD implementation, things would be very different. I’m pretty sure that the FSD is designed to handle cross traffic, though.

                I do not trust normal users to understand what beta means

                Yeah, Google kinda destroyed that word in the public consciousness when they kept their search flagged as beta for more than a decade while growing into one of the biggest companies on Earth with it.

                When I first heard about it, I was very surprised that the US even allows vehicles with beta self-driving software on public roads. That’s like testing a new fire truck by randomly setting buildings on fire in a city and then trying to put them out with the truck.

            • Miqo
              5
              11 months ago

              I’ve never sat in a Tesla, so I’m not really sure

              There shouldn’t be any confusion about this.

              U wot m8?

        • @[email protected]
          11
          11 months ago

          Why do you think companies need warnings like “Caution: Contents are hot” on paper coffee cups? People are stupid.

          • @[email protected]
            3
            11 months ago

            Those labels are there because people made a quick buck suing the companies when they messed up, not to protect stupid customers.

            If the courts applied a reasonable level of common sense, those labels wouldn’t exist.

      • @[email protected]
        10
        11 months ago

        You don’t even seem to get the terms right, which makes me question how well informed you really are on the subject.

        Autopilot is the most basic driver assist version, free with every Tesla. Then there’s Enhanced Autopilot, which costs extra and is more advanced, and lastly there’s Full Self Driving BETA. Even the name indicates you probably shouldn’t trust it with your life.

    • @[email protected]
      13
      11 months ago

      Back in 2016, Tesla released a video that says “our cars drive themselves, you don’t the driver”.

      • Ocelot
        1
        11 months ago

        Is that like “Man door hand hook car door”? What?

  • @[email protected]
    4
    11 months ago

    It’s time to give up the Tesla FSD dream. I loved the idea of it when it came out and believed it would get better over time. It simply hasn’t. Worse, Musk has either fired or lost all the engineering talent Tesla had. FSD is only going to get worse from here, and it’s time to put a stop to it.

    • NιƙƙιDιɱҽʂ
      5
      edit-2
      11 months ago

      The article isn’t talking about FSD; these accidents are from 2019 and 2016, before public availability of FSD. Of course, “Full Self Driving” ain’t great either…

      The whole article is kind of FUD. It’s saying engineers didn’t “fix” the issue, when the issue is people using Autopilot, essentially advanced lane keep, on roads it shouldn’t be used on. It doesn’t give a shit about intersections, stop signs, or stop lights. It just keeps you in your lane and prevents you from rear-ending someone. That’s it. It’s a super useful tool in its element, but it shouldn’t be used outside of freeways or very simple roads at reasonable speeds. That said, it also shouldn’t be fucking called “autopilot”. That’s purely marketing, and it’s extremely dangerous, as we can see.

  • dub
    37
    11 months ago

    A times B times C equals X… I am Jack’s something something something

      • @[email protected]
        4
        11 months ago

        All of the major ones. On the other hand, the Pinto’s gas tank exploded less often than those of competing models in the era, and it wasn’t the only design with the lowered gas tank.

        Look up the You’re Wrong About podcast episode on the Ford Pinto, which is a great deep dive on car development and investigative product reporting.

    • @[email protected]
      31
      11 months ago

      A times B times C equals X… I am Jack’s something something something

      Narrator: A new car built by my company leaves somewhere traveling at 60 mph. The rear differential locks up. The car crashes and burns with everyone trapped inside. Now, should we initiate a recall? Take the number of vehicles in the field, A, multiply by the probable rate of failure, B, multiply by the average out-of-court settlement, C. A times B times C equals X. If X is less than the cost of a recall, we don’t do one.

      Woman on Plane: Are there a lot of these kinds of accidents?

      Narrator: You wouldn’t believe.

      Woman on Plane: Which car company do you work for?

      Narrator: A major one.
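
      For what it’s worth, the recall math in that quote is trivial to write down; a minimal sketch with invented numbers:

      ```python
      # The recall formula from the quote: A * B * C versus recall cost.
      # All numbers below are invented for illustration.

      def should_recall(vehicles: int, failure_rate: float,
                        avg_settlement: float, recall_cost: float) -> bool:
          x = vehicles * failure_rate * avg_settlement  # A * B * C
          return x >= recall_cost  # recall only if settlements cost more

      # Hypothetical: 1,000,000 cars, a 1-in-100,000 failure rate, and a
      # $3M average settlement versus a $50M recall.
      print(should_recall(1_000_000, 1e-5, 3_000_000, 50_000_000))  # -> False
      ```

      The chilling part is exactly that False: ten expected failures aren’t enough to justify a $50M recall under this logic.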

      • @[email protected]
        12
        11 months ago

        When you’re selling a million cars, it’s guaranteed that some of them will have a missed defect, no matter how good your QC is.

        That’s why you have agencies like the NHTSA. You need someone who can decide at what point the issue is a major defect that constitutes a recall.

  • @[email protected]
    8
    edit-2
    11 months ago

    Calling it Autopilot was always a marketing decision. It’s a driver assistance feature, nothing more. When used “as intended”, it works great. I drove for 14 hours during a road trip using AP and arrived still alert rather than dead tired. That’s awesome, and it would never have happened in a conventional car.

    I have the “FSD” beta right now. It has potential, but I still always keep a hand on the wheel and am in control of my car.

    At the end of the day, if the car makes a poor choice because of the automation, I’m still responsible as the driver, and I don’t want an accident, injury, or death on my conscience.