Mark Rober just set up one of the most interesting self-driving tests of 2025, and he did it by imitating Looney Tunes. The former NASA engineer and current YouTube mad scientist recreated the classic gag where Wile E. Coyote paints a tunnel onto a wall to fool the Road Runner.

Only this time, the test subject wasn’t a cartoon bird… it was a self-driving Tesla Model Y.

The result? A full-speed, 40 MPH impact straight into the wall. Watch the video and tell us what you think!

    • @[email protected]
      link
      fedilink
      English
      163 months ago

      In short because Elon (wrongly) believes you only need cameras, he made the claim people also drive with just 2 eyes.

      The thing is, we recognize a truck with stickers of a stopsign, while AI vision gets confused.

      Waymo (Googles self driving side hussle) was build on lidar and other sensors and has been using robot taxis for many years now in geofenced specific areas.

      • ferret (1 point, 3 months ago)

        The funny thing is, apparently our depth perception, a product of our two eyes, is a feature beyond the reach of Tesla. And it would have allowed it to pass this test.

      • @[email protected]
        link
        fedilink
        English
        93 months ago

        The thing is, we recognize a truck with stickers of a stopsign, while AI vision gets confused.

        Lmao would it be illegal to put a stop sign on the back of your car?

        • @[email protected]
          link
          fedilink
          English
          23 months ago

          Some school buses have a sticker / sign on the back that says “I stop for railroad crossings” and can have a stop sign on said sticker.

        • @[email protected]
          link
          fedilink
          English
          83 months ago

          I was thinking the same thing. What would happen if you popped one out of the back of your car while driving in front of a self driving car on the freeway?

  • @[email protected]
    link
    fedilink
    503 months ago

    Things that happen when you rely exclusively on optical sensors, i.e. cameras. But that’s just cheaper, more money for Nazi Elon.

    • Inkstain (they/them) (7 points, 3 months ago)

      Are we reeeeally sure optical sensors with fast image recognition software are cheaper than LiDAR?

      • @[email protected]
        link
        fedilink
        English
        43 months ago

        The hardware is, which is the important part at scale: even if the code is 10x more expensive when you sell millions of the car it becomes pennies/car
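        The amortization argument above is easy to sketch in a few lines. All of the figures below are made up purely for illustration; they are not real Tesla or lidar costs:

```python
# Hypothetical illustration of why per-unit hardware cost dominates at scale.
# Every number here is invented for the example.

def per_car_cost(fixed_dev_cost: float, hw_cost_per_car: float, cars_sold: int) -> float:
    """Cost attributable to each car: amortized R&D plus per-unit hardware."""
    return fixed_dev_cost / cars_sold + hw_cost_per_car

# Vision-only: expensive software effort, cheap per-unit hardware (cameras).
vision = per_car_cost(fixed_dev_cost=2_000_000_000, hw_cost_per_car=100, cars_sold=5_000_000)

# Sensor-heavy stack: far less software R&D, pricier hardware per unit.
lidar = per_car_cost(fixed_dev_cost=200_000_000, hw_cost_per_car=1_000, cars_sold=5_000_000)

print(round(vision), round(lidar))  # 500 1040
```

        Even a 10x software-cost penalty washes out at volume, while per-unit hardware cost is paid again on every car; that is the whole argument in one function.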

  • @[email protected]
    link
    fedilink
    English
    03 months ago

    They obviously pre-cut the wall, probably for safety reasons, and they were like, let’s make it a silly cartoon impact hole while we’re at it.

    Good job.

    • @[email protected]
      link
      fedilink
      English
      1
      edit-2
      3 months ago

      You think you’re reliably going to notice this after a hundred miles of driving? (X) doubt.

    • KayLeadfootOP (31 points, 3 months ago)

      It’s dirt cheap, too. If this was a cost-cutting measure, it was a thoroughly idiotic one. Which feels like the mark… of a certain someone I can think of

    • @[email protected]
      cake
      link
      fedilink
      English
      03 months ago

      They are so expensive too! /s

      Who would have known electronics gets cheaper all the time?? /j

          • @[email protected]
            link
            fedilink
            English
            13 months ago

            Bahaha, what kind of a bizarre statement is that?

            Was he trying to imply the government only uses spreadsheets and nosql DBs?

            Or did he think it was necessary to point out that your average government employee isn’t writing their own SQL to grab data they need?

            • @[email protected]
              link
              fedilink
              English
              23 months ago

              Even then, that’s not really correct. People grab data through sql queries all the time. Mostly because all the front ends are trash.

            • snooggums (3 points, 3 months ago)

              Someone said something he didn’t like so he blurted out the first ignorant thing that he thought of, as usual.

        • Echo Dot (7 points, edited, 3 months ago)

          He's said humans don't use LiDAR so his cars shouldn't have to. Of course, humans have a brain and his cars don't, but you can't tell him anything.

            • @[email protected]
              link
              fedilink
              English
              23 months ago

              I think it’s also reasonable to say a human dying because of their own actions is different than a human dying because a big corp cut costs on safety features in an entirely autonomous car where the human has no ability to stop what’s happening. (You can control them in current teslas, but they’re working on cars without human controls as well)

      • @[email protected]
        link
        fedilink
        English
        03 months ago

        It was removed because it was giving false positives. They should have upgraded it with lidar but decided to just remove it.

      • @[email protected]
        link
        fedilink
        English
        1
        edit-2
        3 months ago

        Tesla had camera+radar+sonar, and that wasn’t their own tech - they used mobileye EyeQ back then. When they switched to in house tech they gradually ditched the radar and sonar which made no sense to me. But at the time I saw their lead say in an interview that this is superior and I believed. not anymore.

        they said doing so cut costs but obviously lidar/radar/sonar only gets cheaper over time, let alone the extra r&d costs because a vision only system is much more difficult to develop.

  • @[email protected]
    link
    fedilink
    English
    163 months ago

    Insurance fraud is going to bankrupt Tesla robotaxis faster than an incompetent CEO ever could.

    There will be too many ways to defeat the cameras and not having LiDAR unlike the rest of the industry may prove to be found to be a failure of duty of care.

  • Magnus (21 points, 3 months ago)

    I remember Elon foolishly saying his cars don’t need radar or lidar. Even software-disabling radar in cars that already had the hardware.

    • @[email protected]
      link
      fedilink
      English
      203 months ago

      Not even just his cars, he thinks the MILITARY, doesn’t need radar and can just use cameras to spot and track stealth fighters.

      He’s a fucking lunatic.

      • @[email protected]
        link
        fedilink
        English
        33 months ago

        As an augmentation, the ability to spot and track objects visually would be amazing.
        But then planes just have to fly above 10k ft, and pretty much guaranteed cloud cover.

  • @[email protected]
    link
    fedilink
    English
    283 months ago

    I seem to recall that fElon prevented the self driving team from utilizing LIDAR for any part of the system, instead demanding that everything run off of optical input. Does anyone else remember the same?

      • @[email protected]
        link
        fedilink
        English
        03 months ago

        Did he want to cut costs or did he want a network of cameras at his control all over the world?

      • @[email protected]
        link
        fedilink
        English
        93 months ago

        Funny thing is, the price of lidar is dropping like a stone; they are projected to be sub-$200 per unit soon. The technical consensus seems to be settling in on 2 or 3 lidars per car plus optical sensors, and Chinese EV brands are starting to provide self driving in baseline models, with lidars as part of the standard package.

    • @[email protected]
      link
      fedilink
      English
      33 months ago

      Yes, I recall at the time experts saying it was a terrible mistake and Elon saying Machine learning will bridge the gap.

      The real reason was to increase margins.

    • Kokesh (2 points, 3 months ago)

      Came here to actually write this. Everyone remembers that. He made Tesler the hated shit it is today.

      • @[email protected]
        link
        fedilink
        English
        13 months ago

        As a space nut I seriously hope that he never gets a chance to do anything similar with SpaceX. Thankfully he’s mostly been kept away from important things thus far.

        Don’t get me wrong, I know SpaceX’s closet is overflowing with skeletons. But since Congress has been so kind as to continuously cut NASA’s budget for the last few decades, I have to rely on SpaceX and other private companies to keep our space endeavors going.

    • NιƙƙιDιɱҽʂ (6 points, edited, 3 months ago)

      What’s cool is that Teslas used to have radar sensors, at least, but Elon removed them from production to save money. Even if you have a car from back then, the software no longer uses them and they’ll just physically unplug them the next time you have the car serviced, as it’s just a drain on the battery at this point 🙃

      • The Quuuuuill (5 points, 3 months ago)

        Meanwhile our Subaru has lidar for adaptive cruise control and emergency braking.

        • @[email protected]
          link
          fedilink
          English
          7
          edit-2
          3 months ago

          I didn’t realize EyeSight had different versions, on the Solterra it looks like it is indeed LIDAR.

          My Crosstrek has the older dual camera setup for depth perception, it would not be fooled by a picture of a road on a wall… I’m surprised the Teslas are.

    • paraphrand (1 point, 3 months ago)

      I remember there being claims from him or his team about lidar being a dead end that would not scale as well as computer vision.

      • @[email protected]
        cake
        link
        fedilink
        English
        43 months ago

        I believe he claimed that since humans use their vision to drive that computer vision was more than enough.

        I don’t know about you, but I also rely on sounds & feel when I drive. I also know that the human eye has evolved to detect motion, filter out extraneous information, and send just the important bits to the brain so that it doesn’t get overloaded with everything the eye sees. Computer vision is the exact opposite from that, having to process every bit of every image the camera sees.

        • @[email protected]
          link
          fedilink
          English
          03 months ago

          I don’t know about you, but I also rely on sounds & feel when I drive.

          Of course. When I feel myself driving into a wall, I stop immediately.

        • bluGill (3 points, 3 months ago)

          I also know of many times my vision fails. Driving into a sunrise, for example.

        • Terrasque (4 points, 3 months ago)

          since humans use their vision to drive that computer vision was more than enough

          Surprised he didn’t swap out the wheels with legs while he was at it

    • @[email protected]
      link
      fedilink
      English
      5
      edit-2
      3 months ago

      Yes. He took too much inspiration from Stanford University’s “Stanley” winning the DARPA Grand Challenge in 2005. This was an early completion to build viable autonomous vehicles. Most of them looked like tanks covered in radar dishes but Stanford wound up taking home the gold with just an SUV with cameras on it.

      It was an impressive achievement in computer vision, and the LiDAR-encrusted vehicles wound up looking like over-complex dinosaurs. There’s a great documentary about it narrated by John Lithgow (who, throughout it, pronounces the word robot as “ro-butt”). Elon watched it, made up his mind, and like a moron, hasn’t changed it in 20 years. I’m almost Musk’s age so I know how the years speed up as we go on. He probably thinks about the Stanford win as something that happened relatively recently. Especially with his mind on - ahem - other things, he’s not keeping up with recent developments out in the real world.

      Rober just made Musk look like the absolute tool he is. And I’m a little worried that we may see people out there staging real world versions of this somehow with actual dangerous obstacles, not a cartoonish foam wall.

      • KayLeadfootOP (2 points, 3 months ago)

        I did low-key get the squiggles before writing the article. I thought, from an ethical hacking disclosure-type perspective, that this info might cause folks to… well, ya know, paint tunnels on walls.

        Then I looked, the cat was already out of the bag, the video had something like 5 million views on it in the 4 hours it took me to draft the article. So I shared it, but I definitely did have that thought cross my mind. I am also a little worried on that score.

    • Ulrich (1 point, 3 months ago)

      Tesla never had LIDAR. That’s the little spinny thing you see on Waymo cars. They had RADAR, and yes it was removed in 2021 due to supply shortages and just…never reinstalled.

  • @[email protected]
    link
    fedilink
    English
    23 months ago

    There’s a very simple solution to autonomous driving vehicles plowing into walls, cars, or people:

    Congress will pass a law that makes NOBODY liable – as long as a human wasn’t involved in the decision making process during the incident.

    This will be backed by car makers, software providers, and insurance companies, who will lobby hard for it. After all, no SINGLE person or company made the decision to swerve into oncoming traffic. Surely they can’t be held liable. 🤷🏻‍♂️

    Once that happens, Level 4 driving will come standard and likely be the default mode on most cars. Best of luck everyone else!

    • @[email protected]
      link
      fedilink
      English
      03 months ago

      You can’t sue me for my driverless car that drops caltrops, forcing everyone else to take the train.

      • @[email protected]
        link
        fedilink
        English
        03 months ago

        I’ve said for a while that you could shut down an entire city with just a few buddies and like $200 in drywall screws. Have each friend drive on a different highway (or in a different direction on each highway) and sprinkle drywall screws as they go. Not just like a single dump, but a good consistent scatter so the entire highway is a minefield and takes hours to properly sweep up.

      • @[email protected]
        link
        fedilink
        English
        23 months ago

        A camera will show it as being more convincing than it is. It would be way more obvious in real life when seen with two eyes. These kinds of murals are only convincing from one specific point.

        • @[email protected]
          link
          fedilink
          English
          03 months ago

          That’s true, but it’s still way more understandable that a car without lidar would be fooled by it. And there is no way you would ever come into such a situation, whereas the image in the thumbnail, could actually happen. That’s why it’s so misleading, can people not see that?
          I absolutely hate Elon Musk and support boycott of Tesla and Starlink, but this is a bit too misleading even with that in mind.

          • KayLeadfootOP (4 points, 3 months ago)

            So, your comment got me thinking… surely, in a big country like the US of A, this mural must actually exist already, right?

            Of course it does. It is an art piece in Columbia, S.C: https://img.atlasobscura.com/90srIbBi-XX-H9u6i_RykKIinRXlpclCHtk-QPSHixk/rt:fit/w:1200/q:80/sm:1/scp:1/ar:1/aHR0cHM6Ly9hdGxh/cy1kZXYuczMuYW1h/em9uYXdzLmNvbS91/cGxvYWRzL3BsYWNl/X2ltYWdlcy85ZTUw/M2ZkZDAxZjVhN2Rm/NmVfOTIyNjQ4NjQ0/OF80YWVhNzFkZjY0/X3ouanBn.webp

            A full article about it: https://www.atlasobscura.com/places/tunnelvision

            How would Tesla FSD react to Tunnelvision, I wonder? How would Tesla FSD react to an overturned semi truck with a realistic depiction of a highway on it? JK, Tesla FSD crashes directly into overturned semis even without the image depiction issue.

            I don’t think the test is misleading. It’s puffed up for entertainment purposes, but in being puffed up, it draws attention to an important drawback of optical-only self-driving cars, which is otherwise a difficult and arcane topic to draw everyday people’s attention to.

            • @[email protected]
              link
              fedilink
              English
              2
              edit-2
              3 months ago

              Good find, I must say I’m surprised that’s legal, but it’s probably more obvious in reality, and it has the sun which is probably also pretty obvious to a human.
              But it might fool the Tesla?

              Regarding the semi video: WTF?
              But I’ve said for years that Tesla cars aren’t safe for roads. And that’s not just the FSD, they are inherently unsafe in many really really stupid ways.
              Blinker buttons on the steering wheel. Hidden emergency door handles, emergency breaking for no reason. Distracting screen interface. In Denmark 30% of Tesla 3 fail their first 4 year safety check.
              There have been stats publicized that claim they aren’t worse than other cars, when in fact “other cars” were an average of 10 year older. So the newer cars obviously ought to be safer because they should be in better conditions.

      • @[email protected]
        link
        fedilink
        English
        93 months ago

        still, this should be something the car ought to take into account. What if there’s a glass in the way?

          • Victoria (2 points, 3 months ago)

            Yes, but Styrofoam probably damages the car less than shards of glass.

            • snooggums (1 point, 3 months ago)

              Glass is far more likely to cause injuries to the driver or the people around the set, just from being heavier material than styrofoam.

        • @[email protected]
          link
          fedilink
          English
          13 months ago

          Yes, I think a human driver who isn’t half asleep would notice that something is weird, and would at least slow down.

      • @[email protected]
        link
        fedilink
        English
        03 months ago

        As much as i want to hate on tesla, seeing this, it hardly seems like a fair test.

        From the perspective of the car, it’s almost perfectly lined up with the background. it’s a very realistic painting, and any AI that is trained on image data would obviously struggle with this. AI doesn’t have that human component that allows us to infer information based on context. We can see the boarders and know that they dont fit. They shouldn’t be there, so even if the painting is perfectly lines up and looks photo realistic, we can know something is up because its got edges and a frame holding it up.

        This test, in the context of the title of this article, relies on a fairly dumb pretense that:

        1. Computers think like humans
        2. This is a realistic situation that a human driver would find themselves in (or that realistic paintings of very specific roads exist in nature)
        3. There is no chance this could be trained out of them. (If it mattered enough to do so)

        This doesnt just affect teslas. This affects any car that uses AI assistance for driving.

        Having said all that… fuck elon musk and fuck his stupid cars.

        • @[email protected]
          link
          fedilink
          English
          03 months ago

          I agree that this just isn’t a realistic problem, and that there are way more problems with Tesla’s that are much more realistic.

          • @[email protected]
            link
            fedilink
            English
            03 months ago

            Tell that to the guy who lost his head when his Tesla thought a reflective semi truck was the sky

              • @[email protected]
                link
                fedilink
                English
                13 months ago

                It’s the same issue, the car not being able to detect a solid object in front of it because of an optical illusion

        • KayLeadfootOP (1 point, 3 months ago)

          I am fairly dumb. Like, I am both dumb and I am fair-handed.

          But, I am not pretentious!

          So, let’s talk about your points and the title. You said I had fairly dumb pretenses, let’s talk through those.

          1. The title of the article… there is no obvious reason to think that I think computers think like humans, certainly not from that headline. Why do you think that?
          2. There are absolutely realistic situations exactly like this; it's not a pretense. Don't think Looney Tunes. Think of an 18-wheeler with a realistic photo of a highway depicted on the side, or a billboard with the same. There's an academic article, linked in my article, where three PhD-holding engineering types discuss the issue at length. This is accepted by peer-reviewed science and has been for years.
          3. Yes, I agree. That's not a pretense, that's just… a factually correct observation. You can't train an AI to avoid optical illusions if its only sensor input is optical. That's why the Tesla choice to skip LiDAR and remove radar is a terminal case of the stupids. They've invested in a dead-end sensor suite, as evidenced by their earning the title of Most Lethal Car Brand on the Road.

          This does just impact Teslas, because they do not use LiDAR. To my knowledge, they are the only popular ADAS on the American market that would be fooled by a test like this.

          Near as I can tell, you’re basically wrong point by point here.

          • @[email protected]
            link
            fedilink
            English
            0
            edit-2
            3 months ago

            Excuse me.

            1. Did you write the article? I genuinely wasn’t aiming my comment at you. It was merely commentary on the context that is inferred by the title. I just watched a clip of the car hitting the board. I didn’t read the article, so i specified that i was referring to the article title. Not the author, not the article itself. Because it’s the title that i was commenting on.

            2. That wasn’t an 18 wheeler, it was a ground level board with a photorealistic picture that matched the background it was set up against. It wasnt a mural on a wall, or some other illusion with completely different properties. So no, i think this extremely specific set up for this test is unrealistic and is not comparable to actual scientific research, which i dont dispute. I dont dispute the fact that the lack of LiDAR is why teslas have this issue and that an autonomous driving system with only one type of sensor is a bad one. Again. I said i hate elon and tesla. Always have.

            All i was saying is that this test, which is designed in a very specific way and produces a very specific result, is pointless. Its like me getting a bucket with a hole in and hypothesising that if i pour in waterz it will leak out of the hole, and then proving that and saying look! A bucket with a hole in leaks water…

            • KayLeadfootOP (1 point, 3 months ago)

              Y’all excused, don’t sweat it! I sure did write the article you did not read. No worries, reading bores me sometimes, too.

              Your take is one of the sillier opinions that I’ve come across in a minute. I won’t waste any more time explaining it to you than that. The test does not strike informed individuals as pointless.

              • @[email protected]
                link
                fedilink
                English
                1
                edit-2
                3 months ago

                I dodnt not read it because “reading bores me.” i didn’t read it because i was busy. I have people round digging up my driveway, i have a 7 week old baby and a 5 year old son destroying the house :p i have prep for work and i just did a bit of browsing and saw the post. Felt compelled to comment for a brief break.

                Im not sure what you mean by “silly opinion.” Everyone who has been arguing with me has been stating that everyone knows that teslas dont use LiDAR, and thats why this test failed. If everyone knows this, then why did it need proving. It was a pointless test. Did you know: fire is hot and water is wet? Did you know we need to breathe air to live?

                No?

                Better make an elaborate test, film it, edit the video, make it last long enough to monetise, post it to youtube, and let people write articles about it to post to other websites. All to prove what everyone already knows about a dangerous self driving car that’s been around for 11 years…

                I am sorry, i just dont get it. I felt like I was pointing out the obvious in saying that a test that’s tailored to give a specific result, which we already know the result of, is a farcical test. It’s pointless.

        • @[email protected]
          link
          fedilink
          English
          33 months ago

          This doesnt just affect teslas. This affects any car that uses AI assistance for driving.

          Except for, you know… cars that don’t solely rely on optical input and have LiDAR for example

    • Snot Flickerman (0 points, 3 months ago)

      I’m so glad I wasn’t the only person who immediately thought “This is some Wile E. Coyote shit.”

      • TJA! (1 point, 3 months ago)

        I mean, it is also referenced in the article and even in the summary from OP.

  • @[email protected]
    link
    fedilink
    English
    03 months ago

    Don’t want to rock the boat but apart from being a you tube money earner this doesn’t prove or disprove anything. A lot of humans would be fooled by this also.

    I am suspicious of the way the polystyrene wall broke in cartoon like shagged edges, almost like they were precut.

    • Subverb (1 point, 3 months ago)

      The point of the test is to demonstrate that vision-only, which Tesla has adopted, is inadequate. A car with lidar or radar would have been able to "see" that it was approaching an obstacle without being fooled by the imagery.

      So yes, it seems a bit silly, but the underlying point is legitimate. If the software is fooled by this, can you ever fully trust it? Especially when sensor systems exist that don't have this problem at all. Would you want to be a pedestrian in a crosswalk with this car bearing down on you in FSD?

        • @[email protected]
          link
          fedilink
          English
          13 months ago

          They were expecting this result to be possible. What were they supposed to do? Slam the car into the side if a building?

      • @[email protected]
        link
        fedilink
        English
        03 months ago

        Yes but the main point that has been shown is that putting a screen up with the exact copy of the road and surroundings behind the screen is a daft and dangerous idea. It would be a better test if they had put up a polystyrene tree in the middle of the road and then checked if the car stopped.

        I have never driven through a polystyrene wall with a picture of a road on it in 40 years because people just don’t put those things up, they don’t grow on roads etc etc.

        Great YT clip for entertainment though.

          • @[email protected]
            link
            fedilink
            English
            03 months ago

            I have never seen a mural on a road depicting a road that is identical to the road that I am driving on. Hope that helps.

            • @[email protected]
              link
              fedilink
              English
              13 months ago

              Maybe someone should do a follow up experiment to see how different the “mural” would have to be for the car to recognise it. A human would obviously not fall for something like an artistic picture of a fantasy land, but would a Tesla?

              • @[email protected]
                link
                fedilink
                English
                13 months ago

                Ha well done.

                Not sure people will be using self drive around a car park but if that is the plan then I guess you Americans will have to white wash that kind of thing.

    • @[email protected]
      link
      fedilink
      English
      03 months ago

      It may not rise to the level of proof, but it is a memorable and easily understood demonstration of something already proven by car safety researchers, as mentioned in the article.

      Why shouldn’t they precut the wall into cartoony shapes? It adds entertainment and doesn’t compromise the demonstration.

      • @[email protected]
        link
        fedilink
        English
        13 months ago

        Yep agreed. Having used Teslas adaptive cruise control I wouldn’t ever use self driving, not that I have it, unless I had a death wish. Quite honestly my previous Chinese MG was a lot less likely to kill me.

  • @[email protected]
    link
    fedilink
    English
    93 months ago

    This is why it’s fucking stupid Tesla removed Lidar sensors and relies on cameras only.

    But also, who would want a Tesla? Fuck 'em.

    • @[email protected]
      link
      fedilink
      English
      43 months ago

      They never had lidarr. They used to have radar and uss but they decided “vision” was good enough. This conveniently occurred when they had supply chain issues during covid.

    • @[email protected]
      link
      fedilink
      English
      23 months ago

      They also removed radar, which is what allowed them to make all of those “it saw something three vehicles ahead and braked to avoid a pileup that hadn’t even started yet” videos. Removing radar was the single most impactful change Tesla made in regards to FSD, and it’s all because Musk basically decided “people drive fine with just their eyes, so cars should too.”

    • @[email protected]
      link
      fedilink
      English
      93 months ago

      I was horrified when I learned that the autopilot relies entirely on cameras. Nope, nope, nope.

      • @[email protected]
        link
        fedilink
        English
        5
        edit-2
        3 months ago

        Leon said other sensors were unnecessary because human driving is all done through the sense of sight…proving that he has no idea how humans work either (despite purportedly being a human).

  • Mayor Poopington (8 points, 3 months ago)

    I read something a while back about a guy wearing a T-shirt with a stop sign on it; a couple of robotaxis stopped in front of him. It got me thinking you could cause some chaos walking around in a "Speed Limit 65" shirt.

    • @[email protected]
      link
      fedilink
      English
      13 months ago

      They’re not reading speed limit signs; they’ll follow the speed limit noted on the reference maps, like what you see in the app on your phone.

      • MrScottyTay (1 point, 3 months ago)

        A lot of cars also check via camera, to cover missing or outdated map information and temporary speed limit signs.
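        The map-plus-camera cross-check described above can be sketched as a tiny fusion rule. The function name and the "camera wins when it sees a sign" policy are hypothetical simplifications, not any vendor's actual logic:

```python
from typing import Optional

# Hypothetical fusion of a map-derived speed limit with a camera-detected sign.
# A conservative rule: trust a sign the camera actually sees (it catches
# temporary limits and outdated map data); otherwise fall back to the map.

def effective_speed_limit(map_limit_mph: Optional[int],
                          camera_limit_mph: Optional[int]) -> Optional[int]:
    """Return the limit the car should obey, or None if neither source knows."""
    if camera_limit_mph is not None:
        return camera_limit_mph  # freshest information wins
    return map_limit_mph

# Map says 55, but a temporary work-zone sign reads 35: obey the sign.
assert effective_speed_limit(55, 35) == 35
# No sign in view: fall back to the map.
assert effective_speed_limit(55, None) == 55
```

        The point of the double-check is exactly the failure mode described two comments down: when map data is wrong (a 25 zone recorded as 55), a camera reading is the only thing that can correct it.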

      • @[email protected]
        link
        fedilink
        English
        0
        edit-2
        3 months ago

        Yikes, there’s a 25 around here that shows up as a 55 in Google Maps.

        Also a 55 that goes down to I think 35 for just a moment when it joins up with a side road. I wonder what a Tesla would do if it was following that data.

    • @[email protected]
      link
      fedilink
      English
      23 months ago

      I think one of my favorite examples was using simple salt to trap them within the confines of white lines that they didn’t think they could cross over. I really appreciate the imagery of using salt circles to entrap the robotic demons …

    • @[email protected]
      link
      fedilink
      English
      63 months ago

      Teslas did this in the past. There was also the issue of thinking that the moon was a red light or something.