Mark Rober just set up one of the most interesting self-driving tests of 2025, and he did it by imitating Looney Tunes. The former NASA engineer and current YouTube mad scientist recreated the classic gag where Wile E. Coyote paints a tunnel onto a wall to fool the Road Runner.

Only this time, the test subject wasn’t a cartoon bird… it was a self-driving Tesla Model Y.

The result? A full-speed, 40 MPH impact straight into the wall. Watch the video and tell us what you think!
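
Why does a painted wall defeat a camera-only car but not a lidar-equipped one? Here’s a toy sketch of the difference, purely illustrative and not based on Tesla’s or anyone else’s actual software: lidar measures distance directly, so the wall returns points at its true range no matter what’s painted on it, while a camera pipeline has to infer depth from pixels, and a convincing mural can stretch that estimate well past the wall.

```python
# Toy illustration only: made-up numbers and function names, not any real AV stack.

BRAKING_DISTANCE_M = 40.0  # assumed distance at which the car must start braking


def lidar_detects_wall(lidar_ranges_m: list[float]) -> bool:
    """Lidar measures range directly (time of flight), so a wall returns points
    at its true distance whether it is blank or painted to look like a road."""
    return min(lidar_ranges_m) < BRAKING_DISTANCE_M


def camera_detects_wall(estimated_depth_m: float) -> bool:
    """A camera-only stack has to infer depth from pixels; a convincing mural
    of an open road can push the estimated depth far beyond the real wall."""
    return estimated_depth_m < BRAKING_DISTANCE_M


# Painted-wall scenario: the wall is really about 35 m ahead.
lidar_returns = [35.2, 34.9, 35.0]   # direct range measurements off the wall
fooled_camera_depth = 150.0          # the mural "looks" like open road

print(lidar_detects_wall(lidar_returns))         # True  -> brake in time
print(camera_detects_wall(fooled_camera_depth))  # False -> full speed into the wall
```

Run it and the lidar check brakes in time while the fooled camera estimate sails straight on, which is essentially what happens in the video.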

  • @[email protected]
    link
    fedilink
    English
    21 month ago

    I tried Waymo when I was visiting LA a few months ago. Genuinely terrific stuff.

    I do not trust Teslas one bit though.

    • @[email protected]
      link
      fedilink
      English
      21 month ago

      Waymos have almost hit me like three times, and if I were slower, they would have. You are part of the problem. Those are killing machines.

      • Dr. Moose · 1 month ago

        Almost. A human driver would probably hit you in similar scenarios.

        • @[email protected]
          link
          fedilink
          English
          11 month ago

          Almost, because I dodged.

          If these vehicles don’t need to obey the procedural laws that are supposed to keep me safe, why should I refrain from smashing them with a hammer, or taking them apart and selling the pieces, or making cool sculptures out of them?

          Aside from my being profoundly lazy, of course.

          • @[email protected]
            link
            fedilink
            English
            11 month ago

            Where were you that you had to dodge a Waymo? Were you walking down the middle of the road or something?

            • @[email protected]
              link
              fedilink
              English
              01 month ago

              Yeah, it has to be the victim’s fault; they totally respect crosswalks. How much do they pay you?

              • @[email protected]
                link
                fedilink
                English
                11 month ago

                We’ve heard one side of the story, about a very unlikely series of events. I do think that person is leaving out a few details.

            • @[email protected]
              link
              fedilink
              English
              1
              edit-2
              1 month ago

              These companies do not provide data; they provide blatantly doctored ad copy.

              And the engineers who design them have said there’s no way to make them follow traffic rules perfectly, and that they often sacrifice safety for efficiency when the choice can be made.

              • Dr. Moose · 1 month ago

                What are you even on about, tin foil? Waymos are on the road - how is this data doctored? You can literally see that it’s safer than human driving in this exact environment.

        • @[email protected]
          link
          fedilink
          English
          01 month ago

          Yeah, that’s the only way it could have happened; it has to be the victim’s fault, because they totally always respect crosswalks.

          How much do they pay you?

            • @[email protected]
              link
              fedilink
              English
              1
              edit-2
              1 month ago

              You’re right; I was dressed kinda slutty at least one of those times, so yeah, maybe it was my fault. How about what you’re leaving out? How much do they pay you? You’re all over these responses with similar lines.

    • Ken Oh · 1 month ago

      Tesla doesn’t use lidar for its sensing; it’s living on the prayer that AI will just get good enough soon enough. Absolutely galaxy-brained decision.