In the piece — titled “Can You Fool a Self Driving Car?” — Rober found that a Tesla car on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead of it, with the electric vehicle plowing right through it instead of stopping.

The footage was damning enough, with slow-motion clips showing the car not only crashing through the styrofoam wall but also plowing through a mannequin of a child. The Tesla was also fooled by simulated rain and fog.

  • @[email protected]

    Painted wall? That’s high tech shit.

    I got a Tesla from my work before Elon went full Reich 3, and it would:

    • brake for bridge shadows on the highway
    • start the wipers for shadows, but not for rain
    • brake for cars parked on the roadside if there’s a bend in the road
    • disengage autopilot and brake when driving towards the sun
    • change the set speed at highway crossings, because fuck the guy behind me, right?
    • engage emergency braking if a bike waits to cross at the side of the road

    To which I’ll add:

    • moldy frunk (short for fucking trunk, I guess?): no ventilation whatsoever, water comes in, water stays in
    • “pay attention” noises for fuck-all reasons, masking my podcasts and forcing me to rewind
    • the fucking cabin camera nanny - which I admittedly disabled with some chewing gum
    • the worst MP3 player known to man (the original Winamp was light years ahead): won’t index, won’t search, will reload the USB drive and lose its place with almost every car start
    • bonkers UI with no integration with Android or Apple - I’m playing podcasts via low-bitrate Bluetooth codecs; at least it doesn’t matter much for voice
    • unusable airco in auto mode; it insists on blowing cold air in your face

    Say what you want about European cars, at least they got usability and integration right. As did most of the auto industry. Fuck Tesla, never again. Bunch of Steve Jobs wannabes.

  • @[email protected]

    I bet the real reason he doesn’t want LiDAR in the car is that he thinks it looks ugly aesthetically.

      • @[email protected]

        Sorry, but I don’t get it. You can get a robot vacuum with lidar for $150. I understand automotive lidars need more reliability, range, etc., but I don’t understand how it’s not even an option on a $30k car.

        • @[email protected]

          IIRC robot vacuums usually use a single Time of Flight (ToF) sensor that rotates, giving the robot a 2D scan of its surroundings. This is sufficient for a vacuum, which only needs to operate on a flat surface, but self-driving vehicles need a better understanding of their surroundings than just a thin slice.

          That’s why cars might use over 30 distinct ToF sensors, each at a different vertical angle, all placed in the rotating module, giving the system a full 3D scan of its surroundings. I would assume those modules are much more expensive, though still insignificant compared to the cost of a car sold on the promise of self-driving.
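
          To make the thin-slice vs. full-scan difference concrete, here’s a minimal Python sketch (my own illustration, not any vendor’s code) turning raw polar returns into points:

          ```python
          import math

          def vacuum_scan_to_points(scan):
              """Single-plane rotating ToF: (horizontal angle, range) pairs
              become 2D points -- the flat slice a robot vacuum sees."""
              return [(r * math.cos(a), r * math.sin(a)) for a, r in scan]

          def automotive_scan_to_points(scan):
              """Multi-beam unit: each return also carries a vertical angle,
              so every (h, v, range) triple becomes a full 3D point."""
              return [(r * math.cos(v) * math.cos(h),
                       r * math.cos(v) * math.sin(h),
                       r * math.sin(v))
                      for h, v, r in scan]
          ```

          The trigonometry is the same either way; the cost is in packing 30+ beams into one spinning module and processing the resulting point cloud in real time.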

        • @[email protected]

          Your car’s not driving indoors at 1 mph, where the maximum damage is tapping, but not marring, the wall or the vehicle.

          You need high-speed, bright lasers and immense computation to handle outdoor, fast, dangerous work.

    • @[email protected]

      It costs too much. It’s also why you have to worry about panels falling off the swastitruck if you park next to one. They also apparently lack any sort of rollover frame.

      He doesn’t want to pay for anything, including NHTSA crash tests.

      It’s literally what Drumpf would have created if he owned a car company: cut all costs, disregard all regulations, and make the public the alpha testers.

      • @[email protected]

        The panels are glued on. The glue fails when the temperature changes.

        I can’t believe that this car is legal to drive in public.

        • @[email protected]

          Right? It’s also got a cast aluminum frame that breaks if you load the trailer hitch with around 10,000 lbs of downward force. Which means that the back of your Cybertruck could just straight up break off if you’ve frontloaded your trailer and hit a pothole wrong.

      • @[email protected]

        It did cost too much at the time, but currently he doesn’t want to do it because he would have to admit he’s wrong.

      • @[email protected]

        The guy bankrupted a casino, not by playing against it and being super lucky, but by owning it. Virtually everything he has ever touched in business has turned to shit. How in the living fuck do you ever screw up steaks at Costco? My cousin with one good eye and a working elbow could do it.

        And now it’s the country’s second try. This time unhinged, with all the training wheels off. The guy is stepping on the pedal while stripping the car for parts and giving away the fuel. The guy doesn’t even drive; he just fired the chauffeur and is dismantling the car from the inside with a shotgun, full steam ahead toward a nice brick wall and an infinity cliff, ready to take us all with him. And Canada and Mexico and Gina. Three and three quarters of a year more of daily atrocities and lawbreaking. At least Hitler Boy brought back the astronauts.

  • FuglyDuck

    As Electrek points out, Autopilot has a well-documented tendency to disengage right before a crash. Regulators have previously found that the advanced driver assistance software shuts off a fraction of a second before making impact.

    This has been known.

    They do it so they can evade liability for the crash.

      • @[email protected]

        Because even braking can’t avoid the crash at that point. An unavoidable crash means bad juju if the “self-driving” car image is meant to stick around.

      • @[email protected]

        AEB was originally designed not to prevent a crash, but to slow the car when an unavoidable crash was detected.

        It’s since gotten better and can now also prevent crashes, but slowing the car before an unavoidable crash was the original important piece. It’s a lot easier to predict an unavoidable crash than to detect a potential crash and stop in time.

        Insurance companies offer a discount for having any type of AEB as even just slowing will reduce damages and their cost out of pocket.

        Not all AEB systems are created equal though.

        Maybe disengaging AP when an unavoidable crash is detected is what triggers the AEB system? Like, maybe for AEB (which should always be running) to take over, AP has to be off?
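
        As a rough illustration of why merely slowing still matters (my numbers, not from any AEB spec), basic kinematics in Python:

        ```python
        import math

        def impact_speed(v0, decel, distance):
            """Speed (m/s) at impact after braking at `decel` m/s^2 over
            `distance` meters, starting from v0 m/s; 0.0 if we stop short."""
            v_sq = v0 ** 2 - 2 * decel * distance
            return math.sqrt(v_sq) if v_sq > 0 else 0.0

        v0 = 100 / 3.6                    # 100 km/h in m/s
        v = impact_speed(v0, 8.0, 25.0)   # hard braking with 25 m of warning
        print(f"{v * 3.6:.0f} km/h at impact")                     # ~69 km/h
        print(f"{1 - (v / v0) ** 2:.0%} of crash energy removed")  # ~52%
        ```

        Even with far too little room to stop, hard braking strips out roughly half the kinetic energy, which is exactly the difference AEB was originally built to make.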

      • @[email protected]

        Brakes require a sufficient stopping distance given the current speed, driving-surface conditions, tire condition, and the amount of momentum at play. This is why trains can’t stop quickly despite having brakes (and very good ones at that, with air brakes on every wheel): there’s simply too much momentum at play.

        If autopilot is being criticized for disengaging immediately before the crash, it’s pretty safe to assume it’s too late to stop the vehicle and avoid the collision.
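
        To put numbers on that, a simple sketch using the standard v²/(2μg) braking formula (the friction coefficient and reaction time are assumptions):

        ```python
        def stopping_distance(speed_kmh, mu=0.7, reaction_s=1.0, g=9.81):
            """Reaction distance plus braking distance, in meters, on a
            surface with friction coefficient mu (~0.7 for dry asphalt)."""
            v = speed_kmh / 3.6
            return v * reaction_s + v ** 2 / (2 * mu * g)

        for kmh in (50, 100, 130):
            print(kmh, "km/h ->", round(stopping_distance(kmh)), "m")
        # 50 km/h -> 28 m, 100 km/h -> 84 m, 130 km/h -> 131 m
        ```

        Stopping distance grows with the square of speed, so a system that only notices the obstacle in the last fraction of a second has no physics left to work with.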

        • @[email protected]

          This autopilot shit needs a regulated audit log in a black box, like planes and ships have.
          In no way should this kind of manipulation be legal.

      • FuglyDuck

        So, as others have said, it takes time to brake. But also, generally speaking, autonomous cars are programmed to dump control back to the human if there’s a situation they can’t see an ‘appropriate’ response to.

        What’s happening here is the ‘oh shit, there’s no action that can stop the crash’ case, because braking takes time (hell, even coming to that decision takes time, and activating the whoseitwhatsits that activate the brakes takes time). The normal thinking is: if there’s something the car can’t figure out on its own, it’s best to let the human take over. It’s supposed to make that decision well before the crash, though.

        However, as for why Tesla is doing that when there’s not enough time for a human to actually take control?

        It’s because liability is a bitch. Given how many Teslas are on the road, even a single ruling of “yup, it was Tesla’s fault” is going to start creating precedent, and that gets very expensive, very fast, especially for something that can’t really be fixed.

        For some technical perspective, I pulled up the frame rates on the camera system (I’m not seeing a frame rate for the cabin camera specifically, but it seems to be either 36 fps in older models or 24 fps in newer ones).

        14 frames @ 24 fps is about 0.6 seconds; @ 36 fps, it’s about 0.4 seconds. For comparison, the average human reaction time to just see a change and click a mouse is about 0.3 seconds. If you add in needing to assess the situation… that’s going to take significantly more time.
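
        The arithmetic, for anyone who wants to plug in their own assumptions:

        ```python
        def frames_to_seconds(frames, fps):
            """Wall-clock time represented by a number of video frames."""
            return frames / fps

        for frames, fps in ((14, 24), (14, 36), (17, 25)):
            print(f"{frames} frames @ {fps} fps = {frames_to_seconds(frames, fps):.2f} s")
        # 14 frames @ 24 fps = 0.58 s
        # 14 frames @ 36 fps = 0.39 s
        # 17 frames @ 25 fps = 0.68 s
        # vs. ~0.3 s for a simple human reaction (see a change, click a mouse)
        ```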

    • @[email protected]

      Not sure how that helps in evading liability.

      Every Tesla driver would need superhuman reaction speeds to respond in 17 frames, i.e. 680 ms (I didn’t check the recording frame rate, but 25 fps is the slowest reasonable), less than a second.

      • @[email protected]

        They’re talking about avoiding legal liability, not about actually doing the right thing. And of course you can see how it would help them avoid legal liability. The lawyers will walk into court and honestly say that at the time of the accident the human driver was in control of the vehicle.

        And then that creates a discussion about how much time the human driver has to have in order to actually solve the problem, or gray areas about who exactly controls what and when, and it complicates the situation enough that maybe Tesla can pay less money for the deaths they are obviously responsible for.

        • @[email protected]

          They’re talking about avoiding legal liability, not about actually doing the right thing. And of course you can see how it would help them avoid legal liability. The lawyers will walk into court and honestly say that at the time of the accident the human driver was in control of the vehicle.

          The plaintiff’s lawyers would say the autopilot was engaged, made the decision to run into the wall, and turned off 0.1 seconds before impact. Liability is not going to disappear when there were 4.9 seconds of making dangerous decisions and peacing out in the last 0.1.

          • @[email protected]

            They can also then claim with a straight face, in public, in ads, etc., that Autopilot has a low crash rate, when that rate is artificially lowered, without it technically being a lie.

          • FuglyDuck

            The plaintiff’s lawyers would say the autopilot was engaged, made the decision to run into the wall, and turned off 0.1 seconds before impact. Liability is not going to disappear when there were 4.9 seconds of making dangerous decisions and peacing out in the last 0.1.

            These strategies aren’t about actually winning the argument; they’re about making it excessively expensive to have the argument in the first place. Every motion requires a response by the counterparty, which requires billable time from the counterparty’s lawyers, and delays the trial. It’s just another variation on “defend, depose, deny”.

      • FuglyDuck

        It’s not likely to work, but their swapping to human control after the car has determined a crash is going to happen isn’t accidental.

        Anything they can do to mire the proceedings they will do. It’s like how corporations file stupid junk motions to force plaintiffs to give up.

    • @[email protected]

      If the disengage-to-avoid-legal-consequences feature does exist, you would think there would be some false-positive incidents where it turns off for no apparent reason. I found some with a search, which are attributed to bad software. Owners are discussing new patches fixing some problems and introducing new ones. None of the incidents caused an accident, so maybe the owners never hit the malicious code.

      • FuglyDuck

        If it randomly turns off for no apparent reason, people are going to be like ‘oh, that’s weird’ and leave it at that. Tesla certainly isn’t going to admit that their code is malicious like that. At least not until the FBI is digging through their memos to show it was. And maybe not even then.

        • @[email protected]

          When I tried it, the only unexpected disengagement was on the highway, but it just slowed and stayed in lane giving me lots of time to take over.

          Thinking about it afterwards, possible reasons include:

          • I had cars on both sides, blocking me in. Perhaps it decided that was risky or that they occluded its vision, or perhaps one moved toward me and there was no room to avoid it.
          • It was a little over a mile from my exit. Perhaps it decided it had no way to switch lanes while blocked in.
      • @[email protected]

        The given reason is simply that it will return control to the driver if it can’t figure out what to do, and all the evidence is consistent with that. All self-driving cars have some variation of this. However, yes, it’s suspicious when it disengages right when you need it most. I also don’t know of data to support whether this is a pattern or just a feature of certain well-publicized cases.

        Even in those false positives, it’s entirely consistent with the AI being confused, especially since many of these scenarios get addressed by software updates. I’m not trying to deny it, just saying the evidence is not as clear as people here are claiming.

    • @[email protected]

      That makes so little sense… It detects that it’s about to crash, then gives up and lets you sort it out?
      That’s the opposite of my Audi, which does detect when I’m about to hit something and either gives me a warning or just actively hits the brakes if I don’t have time to handle it.
      If this is true, this is so fucking evil it’s kinda amazing it could have reached anywhere near prod.

      • FuglyDuck

        Even your Audi is going to dump control to the human if it can’t figure out what the appropriate response is. Granted, your Audi is probably smart enough to be like “yeah, don’t hit the fucking wall,” but eh… it was put together by people who actually know what they’re doing and care about safety.

        Tesla isn’t doing this for safety or because it’s the best response. The cars are doing this because they don’t want to pay out for wrongful death lawsuits.

        If this is true, this is so fucking evil it’s kinda amazing it could have reached anywhere near prod.

        It’s Musk. He’s fucking vile, and this isn’t even close to the worst thing he’s doing. Or has done.

      • @[email protected]

        The point is that they can say “Autopilot wasn’t active during the crash.” They can leave out that autopilot was active right up until the moment before, or that autopilot directly contributed to it. They’re just purely leaning into the technical truth that it wasn’t on during the crash. Whether it’s a courtroom defense or their own next published set of data, “Autopilot was not active during any recorded Tesla crashes.”

    • @[email protected]

      Any crash within 10 s of a disengagement counts as it being on, so you can’t just do this.

      Edit: added the time unit.

      Edit2: it’s actually 30s not 10s. See below.

      • FuglyDuck

        Where are you seeing that?

        There’s nothing I’m seeing as a matter of law or regulation.

        In any case, liability (especially civil liability) is an absolute bitch. It’s incredibly messy and likely will not ever be so cut and dried.

        • @[email protected]

          Well, it’s not that a crash counts as caused by the Level 2 system, but that they’ll investigate it.

          So you can’t hide the crash by disengaging it just before.

          Looks like it’s actually 30 seconds, not 10. Or maybe it was 10 seconds once upon a time and they changed it to 30?

          The General Order requires that reporting entities file incident reports for crashes involving ADS-equipped vehicles that occur on publicly accessible roads in the United States and its territories. Crashes involving an ADS-equipped vehicle are reportable if the ADS was in use at any time within 30 seconds of the crash and the crash resulted in property damage or injury

          https://www.nhtsa.gov/sites/nhtsa.gov/files/2022-06/ADAS-L2-SGO-Report-June-2022.pdf
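
          As code, the quoted rule boils down to something like this (a hypothetical sketch; the names are made up and this is not NHTSA tooling):

          ```python
          from datetime import datetime, timedelta

          REPORTING_WINDOW = timedelta(seconds=30)

          def is_reportable(crash_time: datetime,
                            last_ads_active: datetime,
                            damage_or_injury: bool) -> bool:
              """Reportable if the ADS was in use at any point within the
              30 seconds before the crash and there was damage or injury."""
              return (damage_or_injury
                      and timedelta(0) <= crash_time - last_ads_active <= REPORTING_WINDOW)
          ```

          So a system that disengages a fraction of a second before impact still falls squarely inside the window.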

          • FuglyDuck

            Thanks for that.

            The thing is, though, the NHTSA generally doesn’t make a determination on criminal or civil liability. They’ll make the report about what happened, keep it to the facts, and let the courts sort out who’s at fault. They might not even actually investigate a crash unless it comes to that. The rule just says “when your car crashes, you need to tell us about it,” and they kinda assume companies comply.

            Which Tesla doesn’t want to comply with, and that’s one of the reasons Musk/DOGE is going after the agency.

            • @[email protected]

              I knew they wouldn’t necessarily investigate it, that’s always at their discretion, but I had no idea there was no actual bite to the rule if companies didn’t comply. That’s stupid.

              • @[email protected]

                Generally, things like that are meant more to identify a pattern. It may not be useful to an individual, but it’s very useful for determining a recall or supporting a class action.

          • @[email protected]

            I get the impression it disengages so that Tesla can legally say “self-driving wasn’t active when it crashed” to the media.

            • @[email protected]

              Except they can’t really, because of the rule above, which exists explicitly to prevent trickery like that.

  • @[email protected]

    It’s a highly questionable approach that has raised concerns over Tesla trying to evade guilt by automatically turning off any possibly incriminating driver assistance features before a crash.

    So, who’s the YouTuber that’s gonna test this out? Since Elmo has pushed his way into the government in order to quash any investigation into it.

  • Eugene V. Debs' Ghost

    “Dipshit Nazis mad at facts bursting their bubble of unreality” is another way of reading this headline.

    • @[email protected]

      I believe the outrage is that the video showed autopilot being off when they crashed into the wall. That’s what the red circle in the thumbnail is highlighting. The whole thing was apparently a setup for views, like Top Gear faking the Roadster breaking down.

  • @[email protected]

    The bar set for self-driving cars: Can it recognize and respond correctly to a deliberate optical illusion?

    The bar set for humans: https://youtu.be/ks11nuGGupI

    For the record, I do want the bar for self-driving safety to be high. I also want human drivers to be better… Because even not-entirely-safe self-driving cars may still be safer than humans at a certain point.

    Also, fuck Tesla.

    • @[email protected]

      I mean, it also plowed through a kid mannequin because it was foggy, then rainy. The wall was just one of the tests the Tesla failed.

      • @[email protected]

        Right, those were the failures that really matter, and Rober included the Looney Tunes wall to get people sharing and talking about it. A scene painted on a wall is a contrived edge case, but pedestrians and obstacles in precipitation are common.

        • @[email protected]

          It’s no longer an edge case if faulty self driving becomes the norm.

          Want to kill someone in a Tesla? Find a convenient spot and paint a wall there.

          It doesn’t even have to be an artificial wall; for example, take a bend on a mountain road and paint the rock face.

            • @[email protected]

              Have you ever seen examples of how the features an AI picks out to identify objects aren’t really the same as what we pick out? You can generate images that look unrecognizable to people but have clearly identifiable features to an AI. It would be interesting to see someone play around with that concept to find interesting ways to fool Tesla’s AI. Like, could you make a banner that looks like a barricade to people, but that the cars see as open road?

              This isn’t a great example for this concept, but it is a great video. https://youtu.be/FMRi6pNAoag?t=5m58s
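
              For anyone curious how such images are made, the textbook method is gradient-based perturbation. Here is a minimal targeted FGSM sketch in PyTorch (illustrative only; `model` stands in for any differentiable image classifier, not Tesla’s stack):

              ```python
              import torch
              import torch.nn.functional as F

              def fgsm_targeted(model, image, target_label, eps=0.03):
                  """Nudge each pixel by at most eps in the direction that makes
                  `model` more confident in `target_label` (e.g. 'open road'),
                  keeping the change nearly invisible to a human."""
                  image = image.clone().detach().requires_grad_(True)
                  loss = F.cross_entropy(model(image), target_label)
                  loss.backward()
                  # Step *against* the gradient to lower the loss for the target class.
                  adv = image - eps * image.grad.sign()
                  return adv.clamp(0, 1).detach()
              ```

              Whether a perturbation like this survives being printed on a banner, filmed by a moving camera, and fed through a full driving stack is exactly the open question being joked about here.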

              • @[email protected]

                I was thinking of something where the AI thinks the road turns left but humans see that it turns right.

          • @[email protected]

            A better trick would be to paint the road as going straight when there’s a cliff. Much easier to hide the evidence that way.

  • Banana

    And the president is driving one of these?

    Maybe we should be purchasing lots of paint and cement blockades…

    • @[email protected]

      The president can’t drive by law unless on the grounds of the White House and maybe Camp David. At least while in office. They might be allowed to drive after leaving office…

      • FuglyDuck

        This isn’t true at all. I can’t tell if you’re being serious or incredibly sarcastic, though.

        The reason presidents (and generally ex presidents, too) don’t drive themselves is because the kind of driving to escape an assassination attempt is a higher level of driving and training than what the vast majority of people ever have. There’s no law saying presidents are forbidden from driving.

        In any case, I would be perfectly happy if they let him drive a CT and it caught fire. I’d do a little jig, and I wouldn’t care who saw.

          • FuglyDuck

            You’re gonna have to drop a source for that.

            Because, no, they’re not. The Secret Service provides a driver specially trained for the risks a president might face, and very strongly insists, but presidents aren’t “prohibited” from driving simply because they’re presidents.

            To be clear, the Secret Service cannot prohibit the president from doing anything he really wants to do. Even if it’s totally stupid for him to do it. (This includes, for example, Trump’s routine weekend round of golf at Turd-o-Lardo.)

            • @[email protected]

              You’re technically correct: there is no law prohibiting a current or former president from driving, but there is a policy preventing it, and it is enforced by the Secret Service (who follow them around for the rest of their lives). Many former presidents have gone on record saying that the loss of their driving privileges really sucks (Bush 43, Clinton, and Obama have all discussed it on camera in various interviews). It’s been policy since Kennedy was assassinated; there were lots of other policy changes then too, but the no-driving bit was one of them.

              Random sources: https://www.smh.com.au/world/us-presidents-can-have-everything--except-the-car-keys-20140506-zr5we.html

              https://www.cnbc.com/2017/06/28/presidents-arent-allowed-to-drive.html

              And one just about some times they drove anyway:

              https://www.motorbiscuit.com/3-u-s-presidents-got-around-no-driving-rule/

              • FuglyDuck

                Policy can be changed. Quite easily.

                Especially by the president, when it’s about the president.

                Obama, Clinton, others: they don’t really lose their driving privileges. Effectively they do, sure, but that’s because they’re not utter morons.

                Even Trump has yet to prove himself that stupid. He probably is that stupid, but he likes the pomp and circumstance, don’t get me wrong.

                Other policies included screening people for firearms at rallies; Trump overruled that one, that day, too.

            • @[email protected]

              to be clear, the secret service cannot prohibit the president from doing anything they really want to do

              Was Trump lying when he said the SS wouldn’t take him back to the Capitol on Jan 6?

              I could definitely see him lying about that so he doesn’t look like he abandoned his supporters during the coup, but I could also see the driver being like “I can’t endanger you, Mr. President” and ignoring his requests.

              • FuglyDuck

                Was Trump lying when he said the SS wouldn’t take him back to the Capitol on Jan 6?

                Definitely not. There is no way in hell the Secret Service would have taken the president to that shit show. That doesn’t mean they would have physically stopped him if he’d insisted on going on his own, however.

        • @[email protected]

          Mostly sarcastic, but there is a Secret Service rule that the president is not allowed to drive on public roads. The rest is all debatable because it hasn’t been litigated. They answer to the president, but a president cannot refuse Secret Service protection.

          The current understanding is that they can strongly suggest things, but ultimately they have to figure it out if the president doesn’t follow.

      • @[email protected]

        The real question is: in a truly self-driving car (not a Tesla), are you actually driving?

            • Echo Dot

              I imagine when he’s driving around his golf course he makes voom voom noises

            • @[email protected]

              He would be a much funnier person if he weren’t in a position of power (and thus didn’t have the ability to affect people), especially one as terrifying as being the leader of one of the most powerful nations in the world.

    • @[email protected]

      When he was in the Tesla asking if he should go for a ride, I was screaming, “Yes! Yes, Mr. President! Please! Elon, show him Full Self-Driving on the interstate! Show him Full Self-Driving mode!”

  • @[email protected]

    Tesla cars are stupid tech. As the cars that use lidar demonstrate, this is a solved problem. There don’t have to be self-driving cars that run over kids. Tesla just refuses to integrate the solution for no discernible reason, which I’m assuming is really just “Elon said so.”

    • dual_sport_dork 🐧🗡️

      It’s even worse than that. Not only is it a solved problem, but Tesla had it solved (or closer to solved, anyway) and then intentionally regressed on the technology as a cost cutting measure. All the while making a limp-wristed attempt to spin the removal of key sensor hardware – first the radar and later the ultrasonic proximity sensors – as a “safety” initiative.

      There isn’t a shovel anywhere in the world big enough for that pile of bullshit.

  • @[email protected]

    Notice how they’re mad at the video and not the car, manufacturer, or the CEO. It’s a huge safety issue yet they’d rather defend a brand that obviously doesn’t even care about their safety. Like, nobody is gonna give you a medal for being loyal to a brand.

    • @[email protected]

      me waving a little handheld flag on a tiny pole that just says “Brand loyalty”

      …what? No medal???

    • @[email protected]

      The styrofoam wall had a pre-cut hole to weaken it, and some people are using that as a gotcha “proving” the video was faked. It would be funny if it weren’t so pathetic.

      • @[email protected]

        Sounds like Rober gets to repeat this with a cinderblock wall and use the car as a tax write-off, then.

        • @[email protected]

          Hopefully with a Mythbusters-style remote control setup in case it explodes. And the trunk filled with ANFO to make sure it does.

        • @[email protected]

          Sounds like Tesla fans should repeat this with cinderblock walls to show us how fake it was.

      • @[email protected]

        Yeah, but it’s styrofoam. You could literally run through it. And I’m sure they did that more as a safety measure, so the wall was guaranteed to collapse and nobody would be injured.

        But at the same time, the car still drove through a fucking wall. The wall’s integrity doesn’t mean shit, because the car drove through a literal fucking wall.

      • @[email protected]

        Yeah, because he knew the thing probably wasn’t gonna stop. Why destroy the car when you don’t have to? Concrete wouldn’t have changed the outcome.

      • scops

        For more background: Rober gave an interview and admitted that they ran the test twice. On the first run, the wall was just fabric, which did not tear away in a visually striking manner. They went back three weeks later and built a styrofoam wall, knowing the Tesla would fail, and pre-cut the wall to create a more interesting impact.

        • @[email protected]

          A particularly disappointing part of that interview was Rober saying he still plans to buy a new Tesla. Safety issues aside, why would anyone want to do that?

          • @[email protected]

            Knowing the insanity of die-hard Tesla fans, it’s likely to try and protect himself.

            “I love my Tesla, but…” has been a meme for years now, because if you ever went on the forums to get help or to complain about what a giant heap of shit the car was, and didn’t bookend it with unabashed praise, you’d have people ripping you to shreds, calling you a FUDster and a Big Oil shill who was shorting the stock and trying to destroy the greatest company the world has ever known.

            People have learned over the years that even with the most valid of criticism for the company, the only way to even attempt to have it received is by showing just how much you actually love Tesla and Daddy Elon, and your complaints/criticism are only because you care so much about the company and want them to do better. Yes, it’s fucking stupid and annoying, but sadly this is the reality we’ve created for ourselves.

          • @[email protected]

            Because the car actually does stop for things that aren’t fake walls made to look like the road, at least for pedestrians, as tested by testing agencies.

            This is the Euro NCAP testing.

            https://youtu.be/4Hsb-0v95R4

            Note: not all of these cars have lidar, but some do.

    • @[email protected]

      To be fair, and ugh, I hate to have to stand up for these assholes, but…

      Their claim is that the video was a lie and that the results were manufactured. They believe that Teslas are actually safe and that Rober was doing some kind of Elon Musk takedown, trying to profit off the shares tanking and to promote a rival company.

      They actually do have a little bit of evidence for those claims:

      1. The wall changes between camera angles. In some angles the wall is simply something painted on canvas; in others it’s a solid styrofoam wall.
      2. The inside-the-car view in the YouTube video doesn’t make it clear that Autopilot mode is engaged.
      3. Mark Rober chose to use Autopilot mode rather than so-called Full Self-Driving.

      But, he was interviewed about this, and he provided additional footage to clear up what happened.

      1. They did the experiment twice: once with a canvas wall, then a few weeks later with a styrofoam wall. The car smashed right into the wall the first time, but it wasn’t very dramatic because the canvas just blew out of the way. They wanted a more dramatic video for YouTube, so they did it again with a styrofoam wall so you could see the wall getting smashed. This included pre-weakening the wall so that when the car hit it, it smashed a dramatic Looney-Tunes-looking hole in the wall. When they made the final video, they included various cuts from both the first and second attempts. The car hit the wall both times, but it wasn’t just the one single hit shown in the video.

      2. There’s apparently a “rainbow” path shown when the car is in Autopilot mode. [Rainbows!?! DEI!?!?!?!] In the cut they posted to YouTube, you couldn’t see this rainbow path. But Rober posted a longer cut of the car hitting the wall where it was visible. So it wasn’t that Autopilot was off; you just couldn’t tell in the original YouTube video.

      3. He used Autopilot mode because, from his understanding as a Tesla owner (this was his personal vehicle being tested), Full Self-Driving requires you to enter a destination address. He just wanted to drive down a closed highway at high speed, so he used Autopilot instead. In his understanding as a Tesla owner and engineer, there would be no difference in how the car dealt with obstacles in Autopilot mode vs. Full Self-Driving, but he admitted that he hadn’t tested it, so it’s possible that so-called Full Self-Driving would have handled things differently.

      Anyhow, these rabid MAGA Elon fanboys did pick up on some minor inconsistencies in his original video. Rober apparently didn’t realize what a firestorm he was wading into. His intention was to make a video about how cool LiDAR is, with a cool scene of a car smashing through a wall as the hook. He’d apparently been planning and filming the video for half a year, and he claims it just happened to be released right at the height of the Tesla firebombings.

    • @[email protected]

      These people haven’t developed any individual self-identity.

      An attack on the brand is an attack on them. Reminds me of the people who made Star Wars their whole identity and crumbled when a certain trilogy didn’t hold up.

        • @[email protected]

          Important to note, this is a human weakness and not a <political group that isn’t mine> weakness.

      • @[email protected]

        So, literally every above-average sports fan?

        The pathological need to be part of a group, so strong it overwhelms all reason, is a trait I have yet to understand. And I say that as someone who can recognize in myself those moments when I feel the pull to be part of an in-group.

      • Billiam

        An attack on the brand is an attack on them.

        Thus it ever is with Conservatives. They make $whatever their whole identity, and so take any critique of $whatever as a personal attack against themselves.

        I blame evangelical religions’ need for martyrdom for this.

        • @[email protected]

          You pretty much hit the nail on the head. These people have no identity or ability to think for themselves because they never needed either one. The church will do all your thinking for you, and anything it doesn’t cover will be handled by Fox News. Be like everyone else and fit in, otherwise… you have to start thinking for yourself. THE HORROR.

        • @[email protected]

          “Mark my word, if and when these preachers get control of the [Republican] party, and they’re sure trying to do so, it’s going to be a terrible damn problem. Frankly, these people frighten me. Politics and governing demand compromise. But these Christians believe they are acting in the name of God, so they can’t and won’t compromise. I know, I’ve tried to deal with them.” ― Barry Goldwater

      • @[email protected]

        Kinda depends on the fact, right? Plenty of factual things piss me off, but I’d argue I’m correct to be pissed off about them.

  • comfy

    I hope some of you actually skimmed the article and got to the “disengaging” part.

    As Electrek points out, Autopilot has a well-documented tendency to disengage right before a crash. Regulators have previously found that the advanced driver assistance software shuts off a fraction of a second before making impact.

    It’s a highly questionable approach that has raised concerns over Tesla trying to evade guilt by automatically turning off any possibly incriminating driver assistance features before a crash.

    • 74 183.84

      It always is that way; fuck the consumer, it’s all about making a buck.

    • NιƙƙιDιɱҽʂ

      I’ve heard that too, and I don’t doubt it, but watching Mark Rober’s video, it seems like he’s death-gripping the wheel pretty hard before the impact, which seems more likely to be what’s disengaging it. Each time, you can see the wheel tug slightly to the left, but his death grip pulls it back to the right.

    • @[email protected]

      Don’t get me wrong, autopilot turning itself off right before a crash is sus, and I wouldn’t put it past Tesla to do something like that (I mean, come on, why don’t they use lidar), but maybe it’s so the car doesn’t try to power the wheels or something after impact, which could potentially worsen the event.

      On the other hand, they’re POS cars, and the autopilot probably just shuts off because of the poor assembly, standards, and design that come from cutting corners.

      • @[email protected]

        Normal cars do whatever is in their power to cease movement while staying upright. In a wreck, the safest state for a car is to stop moving.

      • 74 183.84

        I see your point, and it makes sense, but I would be very surprised if Tesla did this. I think the best option would be to turn off the features once an impact is detected. Shutting off beforehand feels like a cheap ploy to avoid guilt.

        • FuglyDuck

          … It shutting off before hand feels like a cheap ploy to avoid guilt

          That’s exactly what it is.

      • @[email protected]

        Rober seems to think so, since he says in the video that it’s likely disengaging because the parking sensors detect that it’s “parked,” because of the object in front, and it shuts off the cruise control.

      • @[email protected]

        If it can actually sense that a crash is imminent, why wouldn’t it be programmed to slam the brakes instead of just turning off?

        Do they have a problem with false positives?

        • @[email protected]

          I’ve been wondering this for years now. Do we need intelligence in crashes, or do we just need vehicles to stop? I think you’re right: it must have been slamming on the brakes at unexpected times, which I’m sure is unnerving when driving.

        • @[email protected]

          If it were European-made, it would slam the brakes or swerve in order to at least try to save lives, since governments there attempt to regulate companies so they don’t do evil shit. Since it’s American-made, it is designed to maximize profit for shareholders.

          • @[email protected]

            I don’t believe automatic swerving is a good idea; depending on what’s off to the side, it has the potential to make a bad situation much worse.

            I’m thinking: kid runs into the street, car swerves and mows down a crowd on the sidewalk.

            • @[email protected]

              It’s the car’s job to swerve into a less dangerous place.

              Can’t do that? Oops, no self-driving for you.

      • Krzd

        Wouldn’t it make more sense for autopilot to brake and try to stop the car instead of just turning off and letting the car roll? If it’s certain enough that there will be an accident, just applying the brakes until there’s user override would make much more sense…

        • @[email protected]

          False positives. Most likely it detected that something was off (a parking sensor detected something, for example) but didn’t have high confidence that it wasn’t an erroneous sensor reading. You don’t want the car slamming on the brakes at highway speed for no reason and causing a multi-car pileup.
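
          A toy decision rule (entirely made up, not Tesla’s logic) shows the tradeoff:

          ```python
          def aeb_command(obstacle_confidence, time_to_collision_s):
              """Only command full braking when detection confidence is high
              AND the collision is imminent; lowering the thresholds means
              more phantom-braking events at highway speed."""
              if obstacle_confidence >= 0.9 and time_to_collision_s <= 1.5:
                  return "FULL_BRAKE"
              if obstacle_confidence >= 0.5 and time_to_collision_s <= 3.0:
                  return "WARN_DRIVER"
              return "NO_ACTION"
          ```

          Tune the thresholds down and the car brakes for shadows; tune them up and it brakes too late. That tension is the whole game.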

    • @[email protected]

      It’s a highly questionable approach that has raised concerns over Tesla trying to evade guilt by automatically turning off any possibly incriminating driver assistance features before a crash.

      That is like writing that Musk made an awkward, confused gesture at a time and in a place a few people might call questionable.

    • @[email protected]

      Yeah, but that’s milliseconds. Ergo, the crash was already going to happen.

      In any case, the problem with Tesla’s autopilot is that it doesn’t have radar. It can’t see objects reliably, and there have been many instances where a Tesla crashed into a large, visible object.

      • @[email protected]

        That’s what’s confusing me. Rober’s hypothesis is that without lidar the Tesla couldn’t detect the wall. But to claim that autopilot shut itself off before impact means that the Tesla detected the wall and decided impact was imminent, which undercuts his point.

        If you watch the in-car footage, autopilot is on for all of three seconds, and by the time it’s on, the impact was already going to happen. That said, Teslas should have lidar and should probably do something other than disengage before hitting the wall, but I suspect their cameras were good enough to detect the wall through the lack of parallax or something like that.
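
        For reference, the parallax idea in miniature (a hedged sketch of the classic relation, not Tesla’s actual pipeline):

        ```python
        def depth_from_disparity(focal_px, baseline_m, disparity_px):
            """Stereo/motion parallax: depth = focal * baseline / disparity.
            Nearer points shift more between two viewpoints."""
            return focal_px * baseline_m / disparity_px

        # On real road, disparity shrinks smoothly toward the horizon.
        # A painted wall puts every 'road' pixel at one constant depth --
        # a flatness cue that a camera-only system could in principle catch.
        ```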

        • @[email protected]

          But to claim that autopilot shut itself off before impact means that the Tesla detected the wall and decided impact was imminent, which undercuts his point.

          Completely disagree. You’re assuming the sensors that handle autopilot are the same sensors that disengage it when detecting close proximity. The fact that it happened the instant before the car connected kind of shows that, at very close range, something is detecting an impact and cutting it off. If it had known ahead of time, it would have stopped well ahead of time.

          The original goal also wasn’t to uncover this; it was just to compare it to lidar, per the article. I’m guessing we’re going to see a ton more things pop up testing this claim, and we’re likely to see Tesla push an OTA update that changes the behavior so that people can’t easily reproduce it.

        • @[email protected]

          Or it may still have short-distance sensors for parking, and if those detect something solid, it disables autopilot?

  • MochiGoesMeow

    If you get strong emotions over material shit when someone makes a video… you have zero of my respect. Period.

    • @[email protected]

      Saw a guy smash a Stradivarius on video once. Definitely had strong emotions about that one.

      Really torn up about not having your respect, though…

      • @[email protected]

        I think you could argue that that’s not just material stuff, though. It’s historically and culturally significant.

  • @[email protected]

    What would definitely have helped the discussion is if Mark Rober, the scientist, had left a fucking crumb of scientific approach in his video. He didn’t really explain how he was testing; he just slammed the car into things for views. That, plus a collaboration with a company that makes lidar, left the video open to every possible criticism, and it’s a shame.

    Discovery-channel-level dumbed-down “science”.

    • @[email protected]

      Actually, his methodology was explained very clearly. Did you watch the whole video? He might have gushed a bit less about LiDAR, but on the other hand laymen don’t know about it, so it stands to reason he had to explain the basics in detail.

    • @[email protected]

      Okay, but what would you like him to elaborate on, other than showing you that the Tesla is fooled by a Road Runner-style mural, fog, and dense rain?

      How much more info than “car didn’t stop” (where the other car did stop) do you need to be convinced this is a problem?

      • @[email protected]

        I have no doubt the car will crash.

        But I do feel there is something strange about the car disengaging the autopilot (cruise control) just before the crash. How can the car know it’s crashing while simultaneously not knowing it’s crashing?

        I drive a Model 3 myself, and there is so much bad shit about the autopilot and the rain sensors. But I have never experienced, or heard of anyone else experiencing, a false positive where the car disengages the autopilot, under any conditions, the way shown in the video, with no sound or visual cue. Considering how bad the sensors on the car are, it’s strange that they’re state of the art every time an accident happens. There is dissonance between the claims.

        Mark shouldn’t have made so many cuts in the upload. He locks the car at 39 mph in the video but crashes at 42 mph. He should have kept it clean and honest.

        I want to see more of these experiments in the future. But Mark’s video is pretty much a commercial for the lidar manufacturer, and commercials shouldn’t be trusted.

      • @[email protected]

        So Tesla owners have a monopoly on caring about the process of an experiment?

        The logical conclusion of that would be that anyone who isn’t a Tesla owner is incapable of critical thought.

        How is this a win?

          • @[email protected]
            link
            fedilink
            English
            0
            edit-2
            24 days ago

            I have no doubt the car will crash.

            But I do feel there is something strange about the car disengaging the auto pilot (cruise control) just before the crash. How can the car know it’s crashing while simultaneously not knowing it’s crashing?

            I drive a model 3 myself, and there is so much bad shit about the auto pilot and rain sensors. But I have never experienced, or heard anyone else experiencing a false positive were the car disengage the auto pilot under any conditions the way shown in the video with o sound or visual cue. Considering how bad the sensors on the car is, its strange they’re state of the art every time an accident happens. There is dissonance between the claims.

            Mark shouldn’t have made so many cuts in the upload. He locks the car on 39mph on the video, but crashes at 42mph. He should have kept it clean and honest.

            I want to see more of these experiments in the future. But Marks video is pretty much a commercial for the Lidar manufacturer. And commercials shouldn’t be trusted.

      • @[email protected]

        I fucking hate Tesla and Elon Musk. I also fucking hate people calling unverifiable shit science.

        • @[email protected]

          Well, it was published; up to you to do a peer review, I guess!

          Also, this doesn’t need to be science; it blatantly shows that things do in fact not function as intended.

          • @[email protected]

            Where is a robust description of the experiment? Or am I supposed to look frame by frame at the screen in the car to deduce the testing conditions?

            All he had to do was tell us clearly what was enabled on each car and what his inputs were. That would address all the Tesla fanboi comments about him cheating. Maybe he didn’t, for “engagement”.

        • @[email protected]

          You’re upset that made-up people in your head called this video a research project or something? Because the closest thing I could find to what you’re complaining about is his YouTube channel’s description, where it says “friend of science”.

          He never claimed to be a scientist and doesn’t claim to be doing scientific research. In his own words, he’s just doing some tests on his own car. That’s it.

  • @[email protected]

    It was super annoying how scared he acted when he knew it was styrofoam and wasn’t even going to leave a scratch on the car. I would have liked it much better if the car had crashed into an actual wall and burst into flames.

    • @[email protected]

      Instinctively, human brains generally don’t like large objects coming at them unbidden at high speed. That isn’t going to help things, even if you’re consciously aware that the wall is relatively harmless.