A 2025 Tesla Model 3 in Full Self-Driving mode drives off a rural road, clips a tree, loses a tire, flips over, and comes to rest on its roof. Luckily, the driver is alive and well, and able to post about it on social media.

I just don’t see how this technology could possibly be ready to power an autonomous taxi service by the end of next week.

  • @[email protected]
    11 points · 2 months ago

    Why would someone be a passenger in a self-driving vehicle? Do they know they are test subjects, part of a “car trial” (or whatever it should be called)? Self-driving is not reliable and not necessary. Too much money is invested in something that is low priority to have. There are perfectly fast and safe solutions already, like high-speed trains.

    • @[email protected]
      2 points · 2 months ago

      I have no idea, I guess they have a lot more confidence in self driving (ESPECIALLY Tesla) than I do.

  • Goten
    26 points · 2 months ago

    Self-driving is the future, but I’m glad I’m not a beta tester.

    • KayLeadfootOP
      16 points · 2 months ago

      You’re probably right about the future, but damn, I wish they would slow their roll and use LiDAR.

      • FaceDeer
        14 points · 2 months ago

        Elon Musk decided years ago that Tesla absolutely would not use LiDAR, back when LiDAR was expensive enough that such a decision at least made economic sense to try. Nowadays LiDAR is a lot cheaper, but for whatever reason Musk has drawn a line in the sand and refuses to back down on it.

        Unlike many people online these days, I don’t believe that Musk is some kind of sheer-luck, bought-his-way-into-success grifter; he has been genuinely involved in many of the decisions that made his companies grow. But this is one of the downsides of that (the Cybertruck is another). He’s forced through ideas that turned out to be amazing, but he’s also forced through ideas that sucked. He seems to be having increasing trouble distinguishing between them.

        • @[email protected]
          3 points · edited · 2 months ago

          Musk has drawn a line in the sand and refuses to back down on it.

          From what I heard, the upcoming Tesla robotaxi test cars based on the Model Y are supposed to have LiDAR. But it’s ONLY the robotaxi version that has it.

          He seems to be increasingly having trouble distinguishing them.

          Absolutely. It seems to me he has been delusional for years, and it’s getting worse.

        • @[email protected]
          22 points · 2 months ago

          He really hasn’t. He purchased companies that were already sitting on profitable ideas. He is not an engineer. He is not a scientist. He has no training in any design discipline. He takes credit for the ideas of people he pays. He takes credit for the previous achievements of companies he’s purchased.

          What is it going to fucking take for people to finally see the grifter for what he is? He’s never had a single good fucking R&D idea in his life 🙃 He has wasted billions of dollars researching and developing absolutely useless ideas that have benefited literally no one and have not made him any money. It is absolutely incredible how powerful his mythos is, that people still believe him to be, or to have been, some kind of engineer or something. He’s a fucking racist nepo baby. He’s never done a single useful thing in his life.

          He wasn’t the sole individual involved in creating PayPal (and was entirely uninvolved in turning it into the successful business it became). He didn’t found Tesla, nor is he responsible for any of the technological developments it made (except for forcing through his shitty charger design that notoriously breaks down and charges at half the speed of competitors). And by all accounts he has been loathed by everyone at SpaceX for the past decade for continuously committing workers’ rights violations and fostering a racist, sexist, and ableist work environment. The man has done nothing but waste people’s time stoking his ego and sexually abusing a slew of employees for the past two and a half decades.

        • Echo Dot
          11 points · 2 months ago

          He’s forced through ideas that turned out to be amazing, but he’s also forced through ideas that sucked.

          He’s utterly incapable of admitting that one of his ideas is garbage.

          There is a reason he fawns all over Trump and that’s because both of them are of a type. Both of them have egos large enough to have their own gravitational fields but lack any real talent. Look his family up, they’re all like that.

    • @[email protected]
      2 points · 2 months ago

      Self driving via cameras IS NOT THE FUTURE!! Cameras are basically slightly better human eyes and human eyes suck ass.

    • @[email protected]
      11 points · 2 months ago

      It’s fine, nothing at all wrong with using just camera vision for autonomous driving. Nothing wrong at all. So a few cars run off roads or don’t stop for pedestrians or drive off a cliff. So freaking what, that’s the price for progress my friend!

      I’d like to think this is unnecessary but just in case here’s a /s for y’all.

    • KayLeadfootOP
      6 points · 2 months ago

      GPS data predicted the road would go straight as far as the horizon. Camera said the tree or shadow was an unexpected 90 degree bend in the road. So the only rational move was to turn 90 degrees, obviously! No notes, no whammies, flawless

  • @[email protected]
    47 points · 2 months ago

    The car made a fatal decision faster than any human could possibly correct it. Tesla’s idea that drivers can “supervise” these systems is, at this point, nothing more than a legal loophole.

    What I don’t get is how years of this false advertising haven’t bankrupted Tesla already.

      • @[email protected]
        8 points · edited · 2 months ago

        For many years, the “supervised” label was not included; AFAIK Tesla was forced to add it.
        And in this case “supervised” isn’t even enough, because the car made an abrupt, unexpected maneuver instead of asking the driver to take over in time to react.

        • @[email protected]
          2 points · 2 months ago

          The driver isn’t supposed to wait for the car to tell them to take over lol. The driver is supposed to take over when necessary.

          • @[email protected]
            3 points · 2 months ago

            The attention required to prevent these types of sudden crashes negates the purpose of FSD entirely.

          • @[email protected]
            1 point · edited · 2 months ago

            No. Look at Waymo as an example: they are actually autonomous, and they stop to ask for assistance in situations they are “unsure” how to handle.

            But even if your claim were true, in what way was this a situation where the driver could deem it necessary to take over? There was clear road ahead and nothing in view to indicate any kind of problem when the car made a sudden, abrupt left that caused it to roll upside down.

            • @[email protected]
              2 points · edited · 2 months ago

              They can’t stop and ask for assistance at 100km/h on a highway.

              I hope Tesla/Musk address this accident and pull the telemetry from the car, because there’s no evidence that FSD was even on.

              • @[email protected]
                1 point · 2 months ago

                According to the driver it was on FSD, and it was using the latest software update available.

                https://www.reddit.com/user/SynNightmare/

                They can’t stop and ask for assistance at 100km/h on a highway.

                Maybe the point is, then, that Tesla FSD shouldn’t be legal to use on a highway.
                But it probably shouldn’t be used anywhere, because it’s faulty as shit.
                And why can’t it slow down to let the driver take over in a timely manner, when it can brake for no reason?
                It was tested in Germany on the Autobahn, where it did that 8 times within 6 hours!!!

                • @[email protected]
                  1 point · edited · 2 months ago

                  According to the driver, with zero evidence backing up the claim. With how much of a hard-on everyone has for blaming Elon Musk for everything, and for trying to drag Tesla’s stock down, this accident is a surefire way to thousands of internet karma points and e-fame on sites like Reddit and Lemmy. Why doesn’t he just show us the interior camera?

                  Looking at his profile, he’s milking this for all it’s worth: he’s posted the same thread to like 8 different subs lol. He’s karma whoring. He probably wasn’t even the one involved in the crash.

                  I looked at his Twitter, which he promoted on there too, and of course he tags Mark Rober and is retweeting everything about this crash. He’s loving the attention and doing everything he can to get more.

                  Also, he’d had the car for less than 2 weeks and said he used FSD “all the time”……in a brand-new car he’d basically never driven……and then it has this catastrophic failure? Yeah nah lol. Also, as others in some of the threads have pointed out, the version of FSD he claims it was on wasn’t out at the time of his accident.

                  Dude’s lying through his teeth.

    • Echo Dot
      28 points · 2 months ago

      Because the US is an insane country where you can straight up break the law, and as long as you’re rich enough you don’t even get a slap on the wrist. If some small startup had done the same thing, they’d have been shut down.

      What I don’t get is why Teslas aren’t banned all over the world for being so fundamentally unsafe.

      • @[email protected]
        8 points · 2 months ago

        What I don’t get is why teslas aren’t banned all over the world for being so fundamentally unsafe.

        I’ve been arguing this point for the past year; there are obvious safety problems with Teslas even without considering FSD.
        Like the turn-signal control on the steering wheel, manual door handles that are hard to find in emergencies, and common operations buried in on-screen menus instead of having directly accessible buttons. On Autopilot they also tend to brake for no reason, even on the Autobahn with clear road ahead, which can create dangerous situations too.

      • @[email protected]
        14 points · 2 months ago

        To put your number into perspective, if it only failed 1 time in every hundred miles, it would kill you multiple times a week with the average commute distance.
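        A quick back-of-envelope sketch of that arithmetic (the 1-in-100-miles rate is the hypothetical from the comment; the ~16-mile one-way commute is an assumed round figure, not a sourced statistic):

        ```python
        # Hypothetical failure rate from the comment above: 1 failure per 100 miles.
        # The commute distance is an assumed round figure for illustration only.
        FAILURES_PER_MILE = 1 / 100
        DAILY_MILES = 16 * 2          # assumed 16-mile one-way commute, round trip
        COMMUTE_DAYS_PER_WEEK = 5

        weekly_miles = DAILY_MILES * COMMUTE_DAYS_PER_WEEK
        failures_per_week = weekly_miles * FAILURES_PER_MILE
        print(round(failures_per_week, 2))  # 1.6 -- more than one failure every week
        ```

        At those (hypothetical) numbers the math checks out: a typical commuter would see a failure more than once a week.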

        • NιƙƙιDιɱҽʂ
          4 points · 2 months ago

          …It absolutely fails miserably fairly often and would likely crash that frequently without human intervention, though. Not to the extent here, where there isn’t even time for human intervention, but I frequently had to take over when I used to use it (post v13)

        • KayLeadfootOP
          11 points · 2 months ago

          Someone who doesn’t understand math downvoted you. This is the right framework to understand autonomy, the failure rate needs to be astonishingly low for the product to have any non-negative value. So far, Tesla has not demonstrated non-negative value in a credible way.

          • @[email protected]
            1 point · edited · 2 months ago

            You’re trying to judge the self-driving feature in a vacuum, and you can’t do that. You need to compare it to the alternatives, and for automotive travel, the alternative to FSD is to continue having everyone drive manually. Turns out, most clowns doing that are statistically worse at it than even FSD (as bad as it is). So FSD doesn’t need to be perfect; it just needs to be a bit better than what the average driver can do manually. And the last time I saw anything about it, FSD was statistically that “bit better” than you.

            FSD isn’t perfect. No such system will ever be perfect. But the goal isn’t perfect; it just needs to be better than you.

            • Echo Dot
              8 points · 2 months ago

              FSD isn’t perfect. No such system will ever be perfect. But, the goal isn’t perfect, it just needs to be better than you.

              Yeah, people keep bringing that up as a counterargument, but I’m pretty certain humans don’t swerve off a perfectly straight road into a tree all that often.

              So unless you have numbers to suggest that humans are less safe than FSD, you’re being equally obtuse.

              • @[email protected]
                2 points · 2 months ago

                Humans do swerve off perfectly straight roads into trees, I know because I’ve done it!

                • Echo Dot
                  3 points · 2 months ago

                  Can you confirm that to the best of your knowledge you are not a robot?

              • @[email protected]
                2 points · 2 months ago

                A simple Google search (which YOU could have done yourself) shows it’s about 1 accident per 1.5 million miles driven with FSD vs. 1 per 700,000 miles driven for manually driven cars. I’m no Tesla stan (I think they’re overpriced and deliberately for rich people only), but that’s an improvement, a noticeable improvement.

                And as an old retired medic who has done his share of car accidents over nearly 20 years: yes, humans swerve off perfectly straight roads and hit trees and anything else in the way too. And do so at a higher rate.
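                Taking the figures quoted above at face value (they are the commenter’s numbers, not independently verified), the relative difference works out like this:

                ```python
                # Figures as quoted in the comment above; not independently verified.
                FSD_MILES_PER_ACCIDENT = 1_500_000
                MANUAL_MILES_PER_ACCIDENT = 700_000

                # Ratio of miles driven per accident: higher means fewer accidents per mile.
                improvement = FSD_MILES_PER_ACCIDENT / MANUAL_MILES_PER_ACCIDENT
                print(round(improvement, 2))  # 2.14 -- roughly twice the miles per accident
                ```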

        • Echo Dot
          2 points · 2 months ago

          Even with the distances I drive (and I barely drive my car anywhere since COVID), I’d probably only last about a month before the damn thing killed me.

          Even ignoring fatalities and injuries, I would still have to deal with the fact that my car randomly wrecked itself, which has to be a financial headache.

      • Echo Dot
        6 points · 2 months ago

        That’s probably not the actual failure rate, but a 1% failure rate would be several thousand times higher than what NASA would consider an abort-risk condition.

        Let’s say the risk is only 0.01%; that’s still several thousand crashes per year. Even if we could guarantee that all of them would be non-fatal and would not involve any bystanders such as pedestrians, the cost of replacing all of those vehicles every time they crashed, plus fixing the damage to the things they crashed into (lamp posts, shop windows, etc.), would be so high as to exceed any benefit of the technology.

        It wouldn’t be as bad if this were prototype technology that was constantly improving, but Tesla has made it very clear they’re never going to add LiDAR scanners, so it’s literally never going to get any better; it’s always going to be this bad.
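        The scale argument above can be sketched with round numbers (only the 0.01% per-trip rate comes from the comment; the fleet size and trips per day are hypothetical figures chosen for illustration):

        ```python
        # Per-trip failure rate from the comment above (0.01%); the fleet size and
        # trips-per-day figures are hypothetical round numbers for illustration.
        FAILURE_RATE_PER_TRIP = 0.0001
        FLEET_SIZE = 100_000       # hypothetical number of vehicles using the system
        TRIPS_PER_DAY = 2          # hypothetical commute pattern
        DAYS_PER_YEAR = 365

        crashes_per_year = FAILURE_RATE_PER_TRIP * FLEET_SIZE * TRIPS_PER_DAY * DAYS_PER_YEAR
        print(round(crashes_per_year))  # 7300 -- thousands of crashes per year
        ```

        Even a seemingly tiny per-trip rate multiplies out to thousands of incidents a year at fleet scale, which is the point the comment is making.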

        • @[email protected]
          1 point · 2 months ago

          Saying it’s never going to get better is ridiculous and demonstrably wrong. It has improved in leaps and bounds over generations. It doesn’t need LiDAR.

          The biggest thing you’re missing is that with FSD the driver is still supposed to be paying attention at all times, ready to take over like a driving instructor does when a learner is doing something dangerous. Just because it’s in FSD Supervised mode doesn’t mean you should just sit back and watch it drive you off the road into a lake.

          • Echo Dot
            5 points · 2 months ago

            You’re saying this under a video where it drove into a tree and flipped over. There wasn’t time for a human to react; that’s like saying we don’t need emergency stops on chainsaws because the operator just needs to not drop it.

        • KayLeadfootOP
          4 points · 2 months ago

          …is literally never going to get any better it’s always going to be this bad.

          Hey now! That’s unfair. It is constantly changing. Software updates introduce new regressions all the time. So it will be this bad, or significantly worse, and you won’t know which until it tries to kill you in new and unexpected ways :j

    • @[email protected]
      1 point · 2 months ago

      Took me a second to get it, but that’s brilliant.
      I wonder if there might even be some truth to it?

      • @[email protected]
        2 points · edited · 2 months ago

        Wonder no more. Someone did this on YouTube using cardboard boxes; the Tesla drove straight through them. Skip to around the 15-minute mark to watch it drive through the “wall” without even touching the brakes.

        Edit: I thought the person you were replying to said it thought a wall was a tunnel, not the other way round. Still funny to watch it breeze through a wall with a tunnel painted on it, though.

        • @[email protected]
          1 point · 2 months ago

          Yes, I know the video. What I was wondering is whether it could be true that they tried to make the AI detect walls with roads painted on them, and it now falsely detected a wall and made an evasive maneuver to avoid it.

    • @[email protected]
      2 points · 2 months ago

      I’m gonna answer your question with a question, since I don’t have your answer. When a human wrecks, it’s their fault. Whose fault is it when something like this happens? Should it still be the person in the driver’s seat?

      • @[email protected]
        1 point · 2 months ago

        No idea how that will turn out. Fully autonomous, but as the owner you have to maintain the vehicle and ensure it’s roadworthy.

    • @[email protected]
      7 points · 2 months ago

      A good point, but I’m not sure that’s where the bar is. How does it compare to other self-driving systems that have lidar, for instance?

      • @[email protected]
        1 point · 2 months ago

        Depends on the issue at hand. To get these approved and widespread, better than humans may be the bar.

    • @[email protected]
      7 points · 2 months ago

      Look, I respect where you’re coming from. May I presume your line of reasoning is in the vein of “Elon Musk sucks, and thus anyone who buys his stuff is a Nazi and should die”? That is far, far too loose a chain of logic to justify sentencing a man to death. Perhaps if you said they should be held accountable, with the death penalty on the table? But c’mon: are you really the callous monster your comment paints you as?

      • @[email protected]
        4 points · 2 months ago

        These aren’t passive victims; they are operating dangerous machines at high speed on roads shared with the rest of us.

        • Echo Dot
          3 points · 2 months ago

          Right but they believe that the car is safe, because of the advertising and because the product is legally sold.

          If anyone is to blame here it’s not the owner of the car, it’s the regulators who allow such a dangerous vehicle to exist and to be sold.

          • KayLeadfootOP
            3 points · 2 months ago

            Yeah, this subthread is morally ass.

            I don’t think it’s morally wrong to be a sucker. If you fall for the lie, you think you’re actually doing a good thing by using FSD, making the road safer today and potentially radically safer in the future.

            Problem is, it’s a lie. Regulators exist to sort that shit out for you; car accidents are rare enough that the risk is hard to evaluate as a lone-gun human out here. The regulators biffed this one about as hard as an obvious danger can be biffed.

      • Psychadelligoat
        1 point · 2 months ago

        I give 0 ducks about Nazis who drive the Nazi car. The more of them that oven themselves in them the better

  • @[email protected]
    14 points · 2 months ago

    I have visions of Elon sitting in his lair, stroking his cat, and using his laptop to cause this crash. /s

    • @[email protected]
        21 points · 2 months ago

      Except for the last 0.05 seconds before the crash where the human was put in control. Therefore, the human caused the crash.