• melfie@lemmings.world · 11 minutes ago

    Self-driving in general has been overhyped by grifter tech bros like Elon and really shows the current limits of ML. Today, ML models are basically fuzzy, probabilistic functions that map inputs to outputs and are not capable of actual reasoning. There is a long tail of scenarios where a self-driving car will not generalize properly (i.e., will kill people). Throwing increasingly more data and compute at it won’t suddenly make it capable of reasoning like a human. Like other ML use cases, self-driving is a cool concept that can be put to good use under the right conditions, and can even operate mostly without human supervision. However, anyone claiming it’s safe to let today’s “self-driving” cars shuttle humans around at high speeds with no additional safeguards in place either has an unrealistic understanding of the tech or is a sociopath.
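That input-to-output framing can be made concrete. Here is a toy sketch (the model, weights, and numbers are all hypothetical, purely for illustration): a learned classifier just squashes weighted features into a probability, so a long-tail input it never trained on gets silently interpolated rather than reasoned about.

```python
import math

# Toy sketch, NOT a real perception stack: all weights and numbers are
# hypothetical. A trained classifier is a fuzzy mapping from input
# features to a probability; there is no reasoning step anywhere.
def obstacle_probability(brightness: float, contrast: float) -> float:
    # Pretend these weights were fit on training data; they encode only
    # the correlations that data happened to contain.
    w_brightness, w_contrast, bias = -0.8, 2.5, -0.4
    z = w_brightness * brightness + w_contrast * contrast + bias
    return 1.0 / (1.0 + math.exp(-z))  # logistic squash into [0, 1]

# In-distribution input: an obstacle that stands out from the background.
print(obstacle_probability(brightness=0.5, contrast=0.9))   # high

# Long-tail input: a painted wall matching the scene (near-zero contrast).
# The model has no concept of "wall"; it just interpolates the mapping,
# so the score drops and the obstacle is missed.
print(obstacle_probability(brightness=0.5, contrast=0.05))  # low
```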

  • Lukeazade@lemmy.world · 3 hours ago

    Coming here because I saw how downvoted this post was on Reddit lol. I love that it’s triggering the Elon fanboys.

  • slaacaa@lemmy.world · 5 hours ago

    Thank god it doesn’t have LIDAR sensors, much cheaper to repair the front this way

    /s

  • RickC137@lemmy.world · 5 hours ago

    I am not a fan of Tesla/Elon, but are you sure that no human driver would fall for this?

    • jj4211@lemmy.world · 5 minutes ago

      Let's assume, for the sake of argument, that a human driver would fall for it.

      Would that make it a good idea to potentially run over a kid just because a human would have as well, when we have a decent option to do better than human senses?

      • RickC137@lemmy.world · 1 hour ago

        What makes you assume that a vision based system performs worse than the average human? Or that it can’t be 20 times safer?

        I think the main reason to go vision-only is the software complexity of merging mixed sensor data. Radar or lidar alone also has its limitations.
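For what it's worth, "merging mixed sensor data" has a classic, simple core: weighting independent estimates by their confidence. A minimal sketch (the sensor readings and variances below are invented for illustration) of inverse-variance fusion, showing how a confident radar return can override a fooled camera:

```python
# Minimal sketch of inverse-variance sensor fusion. All readings and
# variances below are made up for illustration; real fusion stacks
# (e.g. Kalman filters) build on this same weighting idea.
def fuse(est_a: float, var_a: float, est_b: float, var_b: float):
    w_a = 1.0 / var_a          # confidence = inverse variance
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)  # fused estimate is more certain than either
    return fused, fused_var

# Camera thinks the obstacle is 80 m away (fooled by a painted backdrop)
# but with huge uncertainty; radar returns a close range with low variance.
camera_m, camera_var = 80.0, 400.0   # fooled, high uncertainty
radar_m, radar_var = 12.0, 1.0       # confident close return
dist, var = fuse(camera_m, camera_var, radar_m, radar_var)
print(dist)  # dominated by the confident radar reading
```

The cost the comment alludes to is real: deciding which sensor to trust when they disagree is exactly the hard part. But the arithmetic shows why a second modality helps at all.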

        I wish it was a different company or that Musk would sell Tesla. But I think they are the closest to reaching full autonomy. Let’s see how it goes when FSD launches this year.

    • undeffeined@lemmy.ml · 3 hours ago

      The road runner thing seems a bit far-fetched, yeah. But there were also tests with heavy rain and fog which the Tesla did not pass.

      • Ghostalmedia@lemmy.world (OP, mod) · 1 hour ago

        The road runner thing isn’t far-fetched. Teslas have a track record of t-boning semi trucks in overcast conditions, where the sky matches the color of the truck’s container.

      • RickC137@lemmy.world · 1 hour ago

        Should be fine if the car reduces speed to account for the conditions. Just like a human driver does.
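That rule of thumb can be made quantitative with the standard stopping-distance relation: reaction distance plus braking distance must fit inside the distance you can actually see. A rough sketch (the reaction time and deceleration constants are illustrative assumptions, not any vehicle's real spec):

```python
import math

# Rough sketch with illustrative constants: choose a speed v such that
# total stopping distance (reaction + braking) fits within visibility d:
#     v * t_r + v^2 / (2 * a) <= d
def max_safe_speed(visibility_m: float,
                   reaction_s: float = 1.5,
                   decel_mps2: float = 7.0) -> float:
    """Solve v^2/(2a) + v*t - d = 0 for v (take the positive root)."""
    a, t, d = decel_mps2, reaction_s, visibility_m
    return -a * t + math.sqrt((a * t) ** 2 + 2 * a * d)

# Thick fog, 20 m of visibility -> sharply reduced speed.
print(round(max_safe_speed(20.0) * 3.6, 1), "km/h")

# Clear day, 150 m of visibility -> ordinary highway speed.
print(round(max_safe_speed(150.0) * 3.6, 1), "km/h")
```

The point of the comment holds either way: whether driver or software, the safe speed collapses with visibility, so a system that keeps cruising blind in fog is violating a very basic rule of the road.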

      • oplkill@lemmy.world · 3 hours ago

        Isn’t there a rule that if the weather is very heavy and you can’t see, you must stop driving immediately?

        • undeffeined@lemmy.ml · 3 hours ago

          You mean a traffic rule? I can’t comment about the US but in Portugal I don’t recall such a rule when learning to drive. Also in Finland I have not experienced that since traffic keeps going even in heavy blizzards.

      • chetradley@lemm.ee · 14 hours ago

        Getting into a legal battle with an immensely popular YouTuber would probably cost them a lot more in bad publicity than they would reasonably make from a lawsuit. I guarantee someone at Disney is doing or already has done the calculations.

  • Fizz@lemmy.nz · 19 hours ago

    Insane that the Tesla drives into spaces it’s unsure of. So dangerous.

  • harryprayiv@infosec.pub · 13 hours ago

    I’ve been shit-talking Elon’s (absolutely boneheaded) decision to intentionally eschew system-redundancy in systems that are critically responsible for human life for years now. Since he never missed an opportunity to show off his swastikar in MANY of his previous videos, I had assumed Mark Rober was a sponsored member of the alt-right intellectual dark web. But I’m pleasantly surprised to see that this video is a solid (WELL-justified) smear. 👌

    • imvii@lemmy.ca · 2 hours ago

      I had assumed Mark Rober was a sponsored member of the alt-right intellectual dark web.

      He is.

  • sir_pronoun@lemmy.world · 18 hours ago

    What about the claims that he only used Autopilot, and not Tesla’s Full Self Driving?

    (Context: I hate Tesla, just curious for the sake of an honest argument)

    • Manalith@midwest.social · 5 hours ago

      Philip DeFranco had him on yesterday, and he said the reason he didn’t use FSD was that it required you to input an address, but that there isn’t any difference in terms of the sensors being used.

      Given that the other car didn’t appear to have a version of FSD either, I’m unclear as to why Autopilot wasn’t the correct move for the most accurate comparison.

    • Ulrich@feddit.org · 17 hours ago

      Not any tangible difference in this scenario. Both use vision only. And both use the same computers.

      • sir_pronoun@lemmy.world · 2 hours ago

        But do they use different software? Maybe FSD is more advanced than Autopilot and could have reacted better?

        Just playing devil’s advocate here.

        • undefinedValue@programming.dev · 9 minutes ago

          The software may change, but these tests show it’s the hardware that’s limiting them. If the Tesla can’t see a kid through fog, it doesn’t matter what software you pick: that kid is gonna die.

    • pelespirit@sh.itjust.works · 18 hours ago

    He was helping out Tesla by doing that. He was helping them get the wins they got instead of Tesla just massacring the kid every time. Note to self: as a pedestrian, if you see a Tesla, don’t cross the street.