A 2025 Tesla Model 3 in Full Self-Driving mode drives off a rural road, clips a tree, loses a tire, flips over, and comes to rest on its roof. Luckily, the driver is alive and well, and able to post about it on social media.

I just don’t see how this technology could possibly be ready to power an autonomous taxi service by the end of next week.

    • LePoisson@lemmy.world · ↑3 · 2 hours ago

      It’s fine, nothing at all wrong with using just camera vision for autonomous driving. Nothing wrong at all. So a few cars run off roads or don’t stop for pedestrians or drive off a cliff. So freaking what, that’s the price for progress my friend!

      I’d like to think this is unnecessary but just in case here’s a /s for y’all.

    • KayLeadfoot@fedia.io (OP) · ↑2 · 2 hours ago

      GPS data predicted the road would go straight as far as the horizon. Camera said the tree or shadow was an unexpected 90 degree bend in the road. So the only rational move was to turn 90 degrees, obviously! No notes, no whammies, flawless

  • Buffalox@lemmy.world · ↑18 · 5 hours ago

    The car made a fatal decision faster than any human could possibly correct it. Tesla’s idea that drivers can “supervise” these systems is, at this point, nothing more than a legal loophole.

    What I don’t get is how years of this false advertising haven’t bankrupted Tesla already.

    • Echo Dot@feddit.uk · ↑6 · 2 hours ago

      Because the US is an insane country where you can straight up just break the law and as long as you’re rich enough you don’t even get a slap on the wrist. If some small startup had done the same thing they’d have been shut down.

      What I don’t get is why Teslas aren’t banned all over the world for being so fundamentally unsafe.

      • Echo Dot@feddit.uk · ↑3 · 2 hours ago

        Those probably aren’t the actual odds, but a 1% failure rate is several thousand times higher than what NASA would consider an abort-risk condition.

        Let’s say the risk is only 0.01%; that’s still several thousand crashes per year. Even if we could guarantee that all of them would be non-fatal and would not involve any bystanders such as pedestrians, the cost of replacing those vehicles every time they crashed, plus repairing the things they crashed into (lamp posts, shop windows, etc.), would be so high that it would exceed any benefit of the technology.

        It wouldn’t be as bad if this were prototype technology that was constantly improving, but Tesla has made it very clear they’re never going to add lidar scanners, so it’s literally never going to get any better; it’s always going to be this bad.
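        The "several thousand crashes per year" estimate is easy to sanity-check. All numbers below are illustrative assumptions (fleet size and per-day risk are made up for the sketch, not Tesla's real figures):

```python
# Back-of-envelope: expected crashes per year for a fleet running the feature.
# Every number here is an assumption for illustration, not a measured figure.
fleet_size = 100_000          # cars with the feature enabled (assumed)
p_critical_per_day = 0.0001   # 0.01% chance of a critical failure per car-day (assumed)
days_per_year = 365

expected_crashes = fleet_size * p_critical_per_day * days_per_year
print(f"{expected_crashes:.0f} expected crashes/year")  # 3650
```

        Even at a risk four orders of magnitude below 1%, the assumed fleet still produces thousands of crashes annually.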

        • KayLeadfoot@fedia.io (OP) · ↑1 · 2 hours ago

          …it’s literally never going to get any better; it’s always going to be this bad.

          Hey now! That’s unfair. It is constantly changing. Software updates introduce new regressions all the time. So it will be this bad, or significantly worse, and you won’t know which until it tries to kill you in new and unexpected ways :j

      • ayyy@sh.itjust.works · ↑4 ↓1 · 3 hours ago

        To put your number into perspective: if it failed only once every hundred miles, it would kill you multiple times a week at the average commute distance.
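        A quick check of that claim (the commute figure is an assumption; ~41 miles round trip is a commonly cited US average):

```python
# Sanity check: failures per week at one critical failure per hundred miles.
# The commute distance is an assumed figure, not a measured one.
failure_rate_per_mile = 1 / 100      # one critical failure per hundred miles
commute_miles_per_day = 41           # round-trip commute (assumed)
commute_days_per_week = 5

failures_per_week = failure_rate_per_mile * commute_miles_per_day * commute_days_per_week
print(failures_per_week)  # about 2 per week
```

        So "multiple times a week" holds under those assumptions.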

        • KayLeadfoot@fedia.io (OP) · ↑2 · 2 hours ago

          Someone who doesn’t understand math downvoted you. This is the right framework for understanding autonomy: the failure rate needs to be astonishingly low for the product to have any non-negative value. So far, Tesla has not demonstrated non-negative value in a credible way.

        • Echo Dot@feddit.uk · ↑1 · 2 hours ago

          Even with the distances I drive (and I barely drive my car anywhere since COVID), I’d probably only last about a month before the damn thing killed me.

          Even ignoring fatalities and injuries, I would still have to deal with the fact that my car randomly wrecked itself, which has to be a financial headache.

    • KayLeadfoot@fedia.io (OP) · ↑1 · 2 hours ago

      Imagine if people treated airbags that way XD

      If Ford airbags just plain worked, and Tesla airbags worked 999 times out of 1,000, would the correct answer be to say “well, them’s the breaks; there is no room for improvement, because dangerously flawed airbags are way safer than no airbags at all”?

      Like, no. No, no, no. Cars get recalled for flaws that are SO MUCH less dangerous.

  • melsaskca@lemmy.ca · ↑13 · 11 hours ago

    I have visions of Elon sitting in his lair, stroking his cat, and using his laptop to cause this crash. /s

  • itisileclerk@lemmy.world · ↑11 · 13 hours ago

    Why would anyone be a passenger in a self-driving vehicle? Do they know they’re test subjects, part of a “car trial” (or whatever it should be called)? Self-driving is not reliable and not necessary. Too much money is invested in something that is low priority to have. There are perfectly fast and safe self-driving solutions already, like high-speed trains.

    • dan1101@lemm.ee · ↑1 · 5 hours ago

      I have no idea, I guess they have a lot more confidence in self driving (ESPECIALLY Tesla) than I do.

  • tfm@europe.pub · ↑8 · 13 hours ago

    “I’m confident that Safe Full Self-Driving (SFSD) will be ready next year”

    • TeddE@lemmy.world · ↑5 · 4 hours ago

      Look, I respect where you’re coming from. May I presume your line of reasoning is in the vein of “Elon Musk sucks, and thus anyone who buys his stuff is a Nazi and should die”? But that is far, far too loose a chain of logic to justify sending a man to his death alone. Perhaps if you said that they should be held accountable with the death penalty on the table? But c’mon, are you really the callous monster your comment paints you as?

      • ayyy@sh.itjust.works · ↑2 ↓1 · 3 hours ago

        These aren’t passive victims, they are operating harmfully dangerous machines at high speeds on roads shared with the rest of us.

        • Echo Dot@feddit.uk · ↑2 ↓2 · 2 hours ago

          Right, but they believe that the car is safe, because of the advertising and because the product is legally sold.

          If anyone is to blame here it’s not the owner of the car, it’s the regulators who allow such a dangerous vehicle to exist and to be sold.

          • KayLeadfoot@fedia.io (OP) · ↑1 · 2 hours ago

            Yeah, this subthread is morally ass.

            I don’t think it’s morally wrong to be a sucker. If you fall for the lie, you think you’re actually doing a good thing by using FSD and making the road both safer today and potentially radically safer into the future.

            Problem is, it’s a lie. Regulators exist to sort that shit out for you, car accidents are rare enough that the risk is hard to evaluate as a lone-gun human out here. The regulators biffed this one about as hard as an obvious danger can be biffed.

  • orca@orcas.enjoying.yachts · ↑82 ↓1 · 21 hours ago

    The worst part is that this problem has already been solved by using LIDAR. Vegas had fully self-driving cars that I saw perform flawlessly, because they were manufactured by a company that doesn’t skimp on tech and rip people off.

    • NotMyOldRedditName@lemmy.world · ↑3 · edited · 4 hours ago

      I wouldn’t really call it a solved problem when Waymo, with lidar, is crashing into physical objects:

      https://www.msn.com/en-us/autos/news/waymo-recalls-1200-robotaxis-after-cars-crash-into-chains-gates-and-utility-poles/ar-AA1EMVTF

      NHTSA stated that the crashes “involved collisions with clearly visible objects that a competent driver would be expected to avoid.” The agency is continuing its investigation.

      It’d probably be better to say that lidar is the path to solving these problems, or a tool that can help solve them. But not solved.

      Just because you see a car working perfectly, doesn’t mean it always is working perfectly.

      • ayyy@sh.itjust.works · ↑1 · 3 hours ago

        The same is true when you put a cone in front of a human driver’s vision. I don’t understand why “haha I blocked the vision of a driver and they stopped driving” is a gotcha.

      • KayLeadfoot@fedia.io (OP) · ↑19 · 13 hours ago

        Probably Zoox, but conceptually similar, LiDAR backed.

        You can immobilize them by setting anything large on them. Your purse, a traffic cone, a person :)

        Probably makes sense to be a little cautious with the gas pedal when there’s anything on top of the vehicle.

        • SynopsisTantilize@lemm.ee · ↑5 · 13 hours ago

          That, and if you just put your toddler on the roof or trunk of the car for a quick second to grab something from your pocket…VROooOMMM, baby gone.