Mark Rober just set up one of the most interesting self-driving tests of 2025, and he did it by imitating Looney Tunes. The former NASA engineer and current YouTube mad scientist recreated the classic gag where Wile E. Coyote paints a tunnel onto a wall to fool the Road Runner.

Only this time, the test subject wasn’t a cartoon bird… it was a self-driving Tesla Model Y.

The result? A full-speed, 40 MPH impact straight into the wall. Watch the video and tell us what you think!

  • fubarx@lemmy.world · 1 hour ago

    There’s a very simple solution to autonomous driving vehicles plowing into walls, cars, or people:

    Congress will pass a law that makes NOBODY liable – as long as a human wasn’t involved in the decision making process during the incident.

    This will be backed by car makers, software providers, and insurance companies, who will lobby hard for it. After all, no SINGLE person or company made the decision to swerve into oncoming traffic. Surely they can’t be held liable. 🤷🏻‍♂️

    Once that happens, Level 4 driving will come standard and likely be the default mode on most cars. Best of luck everyone else!

    • chilicheeselies@lemmy.world · 1 hour ago

      There is no way insurance companies would go for that. What is far more likely is that policies simply won't cover accidents due to autonomous systems. I'm honestly surprised they would cover them now.

      • P03 Locke@lemmy.dbzer0.com · 33 minutes ago

        > What is far more likely is that policies simply won't cover accidents due to autonomous systems.

        If the risk is that insurance companies won’t pay for accidents and put people on the hook for hundreds of thousands of dollars in medical bills, then people won’t use autonomous systems.

        This cannot go both ways. Either car makers are legally responsible for their AI systems, or insurance companies are legally responsible to pay for those damages. Somebody has to foot the bill, and if it’s the general public, they will avoid the risk.

    • Korhaka@sopuli.xyz · 44 minutes ago

      If no one is liable, then it's tempting to deliberately confuse them into crashing.

  • King3d@lemmy.world · 9 minutes ago

    This is like the crash on a San Francisco bridge caused by a Tesla entering a tunnel: going from bright daylight into darkness, it wasn't sure what to do, so it suddenly changed lanes, immediately stopped, and caused a multi-car pileup.

  • pjwestin@lemmy.world · 2 hours ago

    To be fair, the Road Runner it was following somehow successfully ran into the painting.

  • ikidd@lemmy.world · 1 hour ago

    To be fair, I’d be surprised if half the humans driving didn’t do the same.

  • Kane@femboys.biz · 2 hours ago

    Can this be solved with just cameras, or would this need additional hardware? I know they removed LIDAR, but I thought that would only be effective at short range, and would not be too helpful at 65 km/h.
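As a rough sense of how much range a sensor actually needs at that speed, here is a back-of-envelope stopping-distance sketch. The deceleration and reaction-time figures are illustrative assumptions, not numbers from the article or the thread:

```python
# Back-of-envelope check of the detection range a sensor needs at
# 65 km/h. The braking deceleration (~0.7 g, dry pavement) and the
# 0.5 s system reaction time are assumed example values.

G = 9.81  # gravitational acceleration, m/s^2

def stopping_distance_m(speed_kmh: float,
                        decel_g: float = 0.7,
                        reaction_s: float = 0.5) -> float:
    """Total distance covered from detection to standstill, in meters."""
    v = speed_kmh / 3.6                  # convert km/h to m/s
    reaction = v * reaction_s            # distance covered before braking starts
    braking = v * v / (2 * decel_g * G)  # kinematics: v^2 / (2a)
    return reaction + braking

print(round(stopping_distance_m(65), 1))  # ~33 m under these assumptions
```

So under these assumptions an obstacle needs to be detected roughly 33 m out, well within what either long-range LIDAR or cameras can cover in principle.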

    • bitchkat@lemmy.world · 59 minutes ago

      Teslas never had LIDAR. They did have ultrasonic sensors and radar before they went to this vision-only crap.

    • Overtheveloper@lemmy.world · 1 hour ago

      If for some bizarre reason you wanted to stick to cameras only, you could use two cameras and calculate the distance to various points based on the difference between the images. That's called stereoscopy, and it's precisely what gives our brains depth perception. The issue is that this process is computationally expensive, so I'd guess it would be cheaper to go back to LIDAR.
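The stereo geometry described above reduces to a one-line formula: depth equals focal length times camera baseline divided by the pixel disparity between the two images. A minimal sketch, where the focal length and baseline are made-up example values rather than any real camera's specs:

```python
# Sketch of depth from stereo disparity (stereoscopy): Z = f * B / d.
# Focal length and baseline below are assumed example values.

def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Distance (m) to a point whose image shifts disparity_px pixels
    between the left and right cameras."""
    if disparity_px <= 0:
        raise ValueError("non-positive disparity: point at or beyond infinity")
    return focal_px * baseline_m / disparity_px

# 1000 px focal length, 0.5 m baseline, 10 px disparity -> 50 m away.
print(depth_from_disparity(1000, 0.5, 10))  # 50.0
```

The computational cost the commenter mentions is not this formula but the matching step: finding which pixel in the right image corresponds to each pixel in the left image before the disparity can be measured.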

    • toddestan@lemm.ee · 2 hours ago

      Theoretically, yes. A human would be smart enough not to drive right into a painted wall, using only their eyeballs combined with their intelligence and sense of self-preservation. A smart enough vision system should be able to do the same.

      Using something like LIDAR to directly sense obstacles would be a lot more practical and reliable. LIDAR certainly has enough range (airplanes use it too), though I don't know about the systems Tesla used specifically.
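The "directly sense" part is what makes LIDAR attractive: it times a laser pulse's round trip and converts that straight into distance, with no scene interpretation involved. A sketch of the arithmetic:

```python
# Sketch of LIDAR time-of-flight ranging: distance = c * t / 2
# (halved because the pulse travels out and back).

C = 299_792_458.0  # speed of light in vacuum, m/s

def lidar_range_m(round_trip_s: float) -> float:
    """Distance (m) corresponding to a measured pulse round-trip time."""
    return C * round_trip_s / 2

# A 1 microsecond round trip corresponds to roughly 150 m:
print(round(lidar_range_m(1e-6), 1))  # 149.9
```

A painted wall returns a pulse exactly like a real wall does, which is why a time-of-flight sensor is immune to this particular trick.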

    • AA5B@lemmy.world · 2 hours ago

      Good question. I don't know if they'll succeed, but they have a point: humans do it with just vision, so why can't AI do at least as well? We'll see. I'm happy someone is trying a different approach. Maybe LIDAR is necessary, but until someone succeeds we won't know the best approach, so let's be happy there's at least one competing attempt.

      I gave it a try once and it was pretty amazing, but clearly not ready. Tesla is fantastic at "normal" driving, but the trial gave me a real appreciation of how driving is all edge cases. At this point I'm no longer confident that anyone will solve the problem adequately for general use.

      Plus there will be accidents. No matter how optimistic you may be, it will never be perfect. Are they ready for the liability and reputation hit? Can any company survive that, even if they are demonstrably better than human?

      • bitchkat@lemmy.world · 57 minutes ago

        It works pretty well as a highway assist. I never use it on city streets because it's so slow and hesitant, which is worse.

  • arankays@lemmy.ca · 5 hours ago

    I tried Waymo when I was visiting LA a few months ago. Genuinely terrific stuff.

    I do not trust Teslas one bit though.

  • humanspiral@lemmy.ca · 1 hour ago

    It is a very high bar to force FSD to deal with intentional sabotage.

      • humanspiral@lemmy.ca · 25 minutes ago

        The biggest problem will always be a backdoor that allows remote control of the car for the purpose of killing the driver or other people. The Wile E. Coyote attack is much more expensive and puts the attacker in jeopardy for the time involved in constructing the "trap".

    • nomy@lemmy.zip · 1 hour ago

      People will definitely fuck with autonomous cars, though, so you have to plan for it.