Mark Rober just set up one of the most interesting self-driving tests of 2025, and he did it by imitating Looney Tunes. The former NASA engineer and current YouTube mad scientist recreated the classic gag where Wile E. Coyote paints a tunnel onto a wall to fool the Road Runner.

Only this time, the test subject wasn’t a cartoon bird… it was a self-driving Tesla Model Y.

The result? A full-speed, 40 MPH impact straight into the wall. Watch the video and tell us what you think!

  • @[email protected]

    They should just program it to drive through the painted tunnel, but when another driver comes up behind you and tries to follow, they crash into it.

    • KayLeadfootOP

      The scientists in Ireland naming their data set for preventing this exact fucking thing “Coyote” sent me over the moon.

  • Guy Named ZERO

    YouTube mad scientist

    Okay come on, Styropyro fits the bill way better for “mad scientist”

    • @[email protected]

      Styropyro genuinely has a screw loose, and the explosions and fire also fit the bill. And he’s Australian.

    • SwizzleStick

      NurdRage ticks the box for me. Also NileRed before he moved out of a garage lab. Still cool though.

      • TXL

        Yes! Also Photonic Induction, maybe. Big Clive would be hard to label mad, though some of his experiments are in pretty interesting territory.

        • SwizzleStick

          Big Clive is great for interesting electronics/deathtraps 🙂

    • KayLeadfootOP

      Aw come on, I thought the lasers versus watermelons demo was a lot of fun, and if that isn’t mad science, I don’t know what is

  • @[email protected]

    There’s a very simple solution to autonomous driving vehicles plowing into walls, cars, or people:

    Congress will pass a law that makes NOBODY liable – as long as a human wasn’t involved in the decision-making process during the incident.

    This will be backed by car makers, software providers, and insurance companies, who will lobby hard for it. After all, no SINGLE person or company made the decision to swerve into oncoming traffic. Surely they can’t be held liable. 🤷🏻‍♂️

    Once that happens, Level 4 driving will come standard and likely be the default mode on most cars. Best of luck everyone else!

    • @[email protected]

      Kids already have experience playing hopscotch, so we can just have them jump between the roofs of moving cars in order to cross the street! It will be so much more efficient, and they can pretend that they are action heroes. The ones who survive will make for great athletes too.

      • @[email protected]

        There’s a reason Gen X trained on hopper. Too bad the newer generations don’t have something equivalent.

    • @[email protected]

      There is no way insurance companies would go for that. What is far more likely is that policies simply won’t cover accidents due to autonomous systems. I’m honestly surprised they would cover them now.

      • @[email protected]

        Not sure how it plays for Tesla, but for Waymo, their accidents per mile driven are WAY below those of human drivers. Insurance companies would LOVE to charge a surcharge for automated-driving coverage while paying out fewer claims.

      • P03 Locke

        What is far more likely is that policies simply won’t cover accidents due to autonomous systems.

        If the risk is that insurance companies won’t pay for accidents and put people on the hook for hundreds of thousands of dollars in medical bills, then people won’t use autonomous systems.

        This cannot go both ways. Either car makers are legally responsible for their AI systems, or insurance companies are legally responsible to pay for those damages. Somebody has to foot the bill, and if it’s the general public, they will avoid the risk.

        • @[email protected]

          I don’t know if I believe that people will avoid the risk. Humans are god awful at wrapping their heads around risk. If the system works well enough that it only crashes, let’s say, once in 100,000 miles, many people will probably find the added convenience to be worth the chance that they might be held liable for a collision.

          Edit: I almost forgot that I am stupid too.
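
          For a rough sense of scale, here is a back-of-the-envelope sketch. Both numbers are assumptions: the 1-in-100,000-miles failure rate is the hypothetical from the comment above, and 12,000 miles per year is an assumed typical annual mileage.

          ```python
          # Back-of-the-envelope only; both constants below are assumptions.
          CRASHES_PER_MILE = 1 / 100_000   # hypothetical "crashes once in 100,000 miles"
          ANNUAL_MILES = 12_000            # assumed miles driven per year by a typical driver

          crashes_per_year = CRASHES_PER_MILE * ANNUAL_MILES
          years_between_crashes = 1 / crashes_per_year

          print(f"Expected crashes per driver-year: {crashes_per_year:.2f}")       # ~0.12
          print(f"Average years between crashes:    {years_between_crashes:.1f}")  # ~8.3
          ```

          At that rate a typical driver would see roughly one such failure per decade of driving, which is exactly the kind of rare-but-real risk people tend to discount.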

      • @[email protected]

        If it’s a feature of the car when you bought it, and the insurance company insured the car, then anything the car does by design must be covered. The only way an insurance company will get out of this is by making the insured sign a statement that using the feature voids their policy, the same way they can with rideshare apps if you don’t disclose that you are driving for a rideshare. They can also refuse to insure the car unless the feature is disabled. I can see insurance companies in the future demanding that features be disabled before they’ll insure a car. They could say that the giant screens go blank, or that the displayed content be simplified, while the car is in motion too.

    • @[email protected]

      The day I heard that was the day I realized he’s a fucking idiot and I wanted nothing to do with his cars/tech.

      Judging by how things have turned out…damn was that a good decision lmao

      • Ulrich

        They pulled the RADAR from mine just before I took delivery, unbeknownst to me at the time. I received no sort of notification.

    • aname

      I tried watching it, but it forces a horrible dubbing over it, so I didn’t want to watch it. Apparently the only way to change it is to change my whole YouTube account language.

      • @[email protected]

        For the YouTube website interface, click on the gear wheel and you can select the audio track you want.

          • @[email protected]

            I don’t know if it’s somehow not available to everyone, but I am able to change the audio track on mobile.

    • KayLeadfootOP

      “But humans can do it with their eyes!” - says the man not selling a human brain to go with the optical sensors

      • @[email protected]

        “But humans can do it with their eyes!”

        That’s the best part: they kinda can’t.
        There are videos from before they pulled the sensors of some pretty cool stuff, where Teslas slammed the brakes before anything visibly happened, based on lidar sensors sensing trouble a couple of cars up the road, completely blocked from view.

        Super cool safety tech, and then they pulled it…

        One example here: https://www.youtube.com/watch?v=BIcC2ZMePKI

        • @[email protected]

          Pretty sure that wasn’t even lidar. It was radar, which is even cheaper, and which pretty much every other new car has if it doesn’t have lidar.

      • Ulrich

        “But humans can do it with their eyes!”

        The thing is, RADAR can see things humans can’t. There was a whole article a while back about a Model X that avoided an otherwise unavoidable accident by bouncing radar under the car in front of it and seeing the car beyond that one slam on the brakes.
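
        As a toy illustration of why a radar return from a vehicle you cannot see is useful, here is a purely hypothetical sketch; it is not how Tesla’s or anyone else’s actual stack works, and all numbers are invented.

        ```python
        # Toy sketch: estimate the deceleration of a hidden vehicle two cars ahead
        # from successive radar range readings (e.g., returns bounced under the lead
        # car), and flag a hazard before its brake lights are visible to the driver.
        # Assumes our own speed is roughly constant, so the change in closing rate
        # approximates the hidden car's own acceleration. Values are made up.

        def estimate_accel(ranges_m, dt_s):
            """Finite-difference estimate of the hidden car's acceleration (m/s^2)."""
            v = [(b - a) / dt_s for a, b in zip(ranges_m, ranges_m[1:])]
            acc = [(b - a) / dt_s for a, b in zip(v, v[1:])]
            return acc[-1] if acc else 0.0

        # Simulated range to the hidden car, sampled every 0.2 s, closing faster and faster.
        samples = [42.00, 41.48, 40.72, 39.72, 38.48]
        accel = estimate_accel(samples, dt_s=0.2)

        HARD_BRAKE = -4.0  # m/s^2, assumed threshold for "hard braking"
        if accel < HARD_BRAKE:
            print(f"Hidden vehicle decelerating at {accel:.1f} m/s^2 -> pre-emptive brake warning")
        ```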

        • Laurel Raven

          I will point out that if you (or your camera-only driver assist) can’t stop without hitting the car in front of you when they slam on the brakes, then you’re driving too close to them… You really shouldn’t ever put yourself in a position where the person in front of you could cause you to unavoidably hit them.

          That said… Yeah, radar/lidar are far better than camera alone and there’s no good reason not to include them in the sensor suite unless you value profits over lives.

          • Ulrich

            And I will point out that if the car in front of you isn’t paying attention and rams a stopped car in the middle of the road, you are fucked no matter what.

            • Laurel Raven

              Not if you have the following distance to stop, but point taken: a crash decelerates you faster than brakes can, and typical following distances assume braking distance, not hard sudden halts.

              So increase your following distance. It also has the benefit that it makes it easier to see what’s ahead of the car in front of you.

              There’s pretty much no accident that’s unavoidable (barring someone else plowing into you) if you drive defensively enough (assuming good traction and good brakes, but obviously you should increase your following distance or decrease your speed to compensate for that as well).
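
              For rough numbers, here is a minimal sketch (not traffic-engineering advice); the 7 m/s² deceleration and 1.5 s reaction time are assumed textbook-ish values, not figures from the thread.

              ```python
              # Compare total stopping distance with the distance covered by a 2-second
              # following gap. Assumed values: ~7 m/s^2 braking deceleration on dry
              # pavement and a 1.5 s driver reaction time.
              MPH_TO_MS = 0.44704

              def stopping_distance_m(speed_mph, decel=7.0, reaction_s=1.5):
                  v = speed_mph * MPH_TO_MS
                  return v * reaction_s + v**2 / (2 * decel)

              def gap_distance_m(speed_mph, gap_s=2.0):
                  return speed_mph * MPH_TO_MS * gap_s

              for mph in (30, 40, 60):
                  print(f"{mph} mph: stop in ~{stopping_distance_m(mph):.0f} m, "
                        f"2 s gap covers ~{gap_distance_m(mph):.0f} m")
              ```

              The 2-second gap always comes out shorter than the full stopping distance, which is the point being made above: normal following distances assume the lead car brakes, not that it stops dead against a wall.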

              • Ulrich

                Not if you have the following distance to stop

                Maintaining a stopping distance like that is nigh impossible in a dense urban area. You’d be constantly cut off and causing tons of traffic.

      • @[email protected]

        The thing is, yes, humans can do it with their eyes. But even with the brain’s enormous amount of processing power, they are still not great at it.

        So if the ultimate goal is the minimum/cheapest setup that is almost as good as a human, then yes, optical sensors alone are enough.

        If the goal is to prevent deaths and significantly reduce the number of accidents compared to human drivers, then lidar is the best option.

        • @[email protected]

          Very interesting!

          What’s the payoff period, I wonder, assuming everyone could afford optical-only before everyone could afford the better tech?

    • @[email protected]

      I’m kinda confident that even RADAR + cameras was good enough, but they started shipping cars without it and even shutting off the RADAR in existing cars.

      The main negative about LiDAR is the cost, but that’s quickly going down.

  • Possibly linux

    The question is, could this fool a human?

    Also, I went and watched the video, and he doesn’t even seem to use Full Self-Driving for the wall test.

      • Possibly linux

        Are you sure though?

        If you knew to expect a wall it is pretty obvious but if you aren’t expecting a wall it might prove confusing.

        I probably would stop either way.

        • @[email protected]

          I watched the video. The wall would not fool a human with object permanence.

          Anyone who is fooled is likely impaired enough that they are not legal to drive.

        • @[email protected]

          There’s no way the wall would look real as your perspective shifts while you move closer to it. Most humans would react to that by at least slowing down.

      • @[email protected]

        Many people tend to doze off so much that they would absolutely get fooled. I admit I might, too, especially if the wall is made of a material that needs no guy wires to prop it up. They either used digital effects or a very good color grading job; it’s uncanny.

        relevant

  • @[email protected]

    I tried Waymo when I was visiting LA a few months ago. Genuinely terrific stuff.

    I do not trust Teslas one bit though.

    • @[email protected]

      Waymos have almost hit me like three times, and if I were slower, they would have. You are part of the problem. Those are killing machines.

      • Dr. Moose

        Almost. A human driver would probably have hit you in similar scenarios.

        • @[email protected]

          Almost, because I dodged.

          If these vehicles don’t need to obey the procedural laws that are supposed to keep me safe, why should I refrain from smashing them with a hammer, or taking them apart and selling the pieces, or making cool sculptures out of them?

          Aside from being profoundly lazy, of course.

            • @[email protected]

              These companies do not provide data. They provide blatantly doctored ad copy.

              And the engineers who design them have said there’s no way to make them follow traffic rules perfectly, and that they do often sacrifice safety for efficiency when the choice has to be made.

              • Dr. Moose

                What are you even on about, tinfoil? Waymos are on the road - how is this data doctored? You can literally see that it’s safer than human driving in this exact environment.

          • @[email protected]

            Where were you that you had to dodge a Waymo? Were you walking down the middle of the road or something?

            • @[email protected]

              Yeah, it has to be the victim’s fault. They totally respect crosswalks. How much do they pay you?

              • @[email protected]

                We’ve heard one side of the story, about a very unlikely series of events. I do think that person is leaving out a few details.

        • @[email protected]

          Yeah, the only way it could have happened; it has to be the victim’s fault, because they totally always respect crosswalks.

          How much do they pay you?

            • @[email protected]

              You’re right; I was dressed kinda slutty at least one of those times, so yeah, maybe it was my fault. How about what you’re leaving out? How much do they pay you? You’re all over these responses with similar lines.

    • Ken Oh

      Tesla doesn’t use lidar for its sensing, living on a prayer that AI will just get good enough soon enough. Absolutely galaxy-brained decision.