A 2025 Tesla Model 3 in Full Self-Driving mode drives off a rural road, clips a tree, loses a tire, flips over, and comes to rest on its roof. Luckily, the driver is alive and well, able to post about it on social media.
I just don’t see how this technology could possibly be ready to power an autonomous taxi service by the end of next week.
The car made a fatal decision faster than any human could possibly correct it. Tesla’s idea that drivers can “supervise” these systems is, at this point, nothing more than a legal loophole.
What I don’t get is how years of this false advertising haven’t bankrupted Tesla already.
Well, because 99% of the time, it’s fairly decent. That 1%'ll getchya tho.
To put your number into perspective, if it only failed 1 time in every hundred miles, it would kill you multiple times a week with the average commute distance.
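As a quick sanity check on that arithmetic, here’s a minimal sketch. The ~40-mile round-trip commute figure is my assumption, not from the thread:

```python
# Back-of-envelope check of the "1% failure" claim above.
# Assumption (not from the thread): average round-trip commute ~40 miles/day.
miles_per_day = 40          # assumed round-trip commute
failure_rate = 1 / 100      # one failure per 100 miles (the comment's hypothetical)
workdays_per_week = 5

failures_per_week = miles_per_day * workdays_per_week * failure_rate
print(failures_per_week)    # 2.0 failures per week at that rate
```

Even a rate that sounds tiny per mile compounds into multiple incidents per week of ordinary driving.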
Typical piss poor quality from Leon Hitler. F Tesla.
It got the most recent update, and thought a tunnel was a wall.
… and a tree was a painting.
Took me a second to get it, but that’s brilliant.
I wonder if there might even be some truth to it?
HAL9000 had Oh Clementine!
Has Tesla been training their AI with the lumberjack song?
It’s full self-driving; it doesn’t need roads. They just put it in the wrong car.
Don’t drive Tesla
I have visions of Elon sitting in his lair, stroking his cat, and using his laptop to cause this crash. /s
That tree cast shade on his brand.
It had to go.
Why would you inflict that guy on a poor innocent kitty?
No serious injuries
How unfortunate
Look, I respect where you’re coming from. May I presume your line of reasoning is in the vein of “Elon Musk sucks, and thus anyone who buys his stuff is a Nazi and should die”? That is far, far too loose a chain of logic to justify sentencing a man to death on its own. Perhaps if you said that they should be held accountable with the death penalty on the table? But c’mon - are you really the callous monster your comment paints you as?
These aren’t passive victims; they are operating dangerous machines at high speed on roads shared with the rest of us.
Why would someone be a passenger in a self-driving vehicle? Do they know they are test subjects, part of a “car trial” (or whatever it should be called)? Self-driving is not reliable and not necessary. Too much money is invested in something that is low priority to have. There are perfectly fast and safe self-driving solutions already, like high-speed trains.
I have no idea, I guess they have a lot more confidence in self driving (ESPECIALLY Tesla) than I do.
“I’m confident that Safe Full Self-Driving (SFSD) will be ready next year”
The worst part is that this problem has already been solved by using LIDAR. Vegas had fully self-driving cars that I saw perform flawlessly, because they were manufactured by a company that doesn’t skimp on tech and rip people off.
I wouldn’t really call it a solved problem when Waymo, with lidar, is crashing into physical objects.
NHTSA stated that the crashes “involved collisions with clearly visible objects that a competent driver would be expected to avoid.” The agency is continuing its investigation.
It’d probably be better to say that Lidar is the path to solving these problems, or a tool that can help solve it. But not solved.
Just because you see a car working perfectly, doesn’t mean it always is working perfectly.
Are those the ones that you can completely immobilize with a traffic cone?
The same is true when you put a cone in front of a human driver’s vision. I don’t understand why “haha I blocked the vision of a driver and they stopped driving” is a gotcha.
You say that like it’s a bad thing, lol. If it kept going, that cone would fly off and hit somebody.
Probably Zoox, but conceptually similar, LiDAR backed.
You can immobilize them by setting anything large on them. Your purse, a traffic cone, a person :)
Probably makes sense to be a little cautious with the gas pedal when there is anything on top of the vehicle.
That, and if you just set your toddler on the roof or trunk of the car for a quick second to grab something from your pocket… VROooOMMM, baby gone.
Yes lol
I mean, if Elon was my dad, I’d probably have some suicidal tendencies too.
More like the abusive step-father
I use autopilot all the time on my boat. No way in hell I’d trust it in a car. They all occasionally get suicidal. Mine likes to lull you into a sense of false security, then take a sharp turn into a channel marker or cargo ship at the last second.
They have auto pilot on boats? I never even thought about that existing. Makes sense, just never heard of it until just now!
They’ve technically had autopilots for over a century; the first was on the oil tanker J.A. Moffett in 1920. Though the main purpose of it is to keep the vessel going dead straight, as otherwise wind and currents turn it, so using modern car terms I think it would be more accurate to say they have lane assist? Commercial ones can often do waypoint navigation, following a set route on a map, but I don’t think that’s very common on personal vessels.
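The “keep it going dead straight” behavior described above can be sketched as a tiny proportional heading-hold loop. This is purely illustrative; the function names, gain, and rudder limit are all my assumptions, not any real marine autopilot API:

```python
# Minimal sketch of a heading-hold autopilot loop, as described above:
# it only corrects drift from a set heading (like lane assist), nothing more.
# All names and gains here are illustrative assumptions, not a real marine API.

def heading_error(target_deg: float, actual_deg: float) -> float:
    """Smallest signed angle from actual to target, in degrees (-180..180]."""
    return (target_deg - actual_deg + 180) % 360 - 180

def rudder_command(target_deg: float, actual_deg: float, gain: float = 0.5) -> float:
    """Proportional controller: rudder angle opposing the heading error,
    clamped to a +/-30 degree rudder limit."""
    cmd = gain * heading_error(target_deg, actual_deg)
    return max(-30.0, min(30.0, cmd))

# Wind pushes the bow from 090 to 095; the autopilot steers back.
print(rudder_command(90.0, 95.0))   # -2.5 (rudder to port)
```

A real unit adds integral and derivative terms so it doesn’t oscillate in a cross-current, but the core idea is just this: measure drift, steer against it.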
They’ve had it forever. Tie a rope to the wheel. Presto. Autopilot.
I’ll point this post out to Wall Street Bets, Maersk stock will pop 10%+ overnight.
That’s not how boats (outside of Hollywood) work, tho.
Exactly. My car doesn’t have AP, but it does have a shed load of sensors, and sometimes it just freaks out about stuff being too close to the car for no discernible reason. Really freaks me out, as I’m like, what do you see, bro? We’re just driving down the motorway.
For mine, it’s the radar seeing the retro-reflective stripes on utility poles being brighter than it expects.
Isn’t there a plane whose autopilot famously keeps trying to crash into the ground? The general advice is to just not let it do that: whenever it looks like it’s about to crash into the ground, pull up instead.
All the other answers here are wrong. It was the Boeing 737 MAX.
They fitted bigger, more fuel-efficient engines that changed its flight characteristics compared to previous 737s. And so rather than have pilots recertify on it as a new model (lots of flight hours, can’t switch back), they designed software to basically make the aircraft seem to behave like the old model.
And so a bug in the cheaper version of the software, combined with a faulty sensor, would cause the system to take over, override the pilots, and push the nose down instead of up. Two crashes happened within five months, to aircraft that were pretty much brand new.
It was grounded for a while as Boeing fixed the software and hardware issues, and, more importantly, updated all the training and reference materials for pilots so that they were aware of this basically secret setting that could kill everyone.
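The failure mode described above — trim logic keyed off a single sensor — can be sketched in a few lines. This is an illustrative toy, not Boeing’s actual MCAS code; the threshold and names are invented:

```python
# Illustrative sketch (NOT Boeing's code) of the single-sensor failure mode
# described above: trim logic keyed off ONE angle-of-attack reading has no
# way to tell a real stall from a broken sensor.

AOA_THRESHOLD_DEG = 15.0   # assumed trigger angle, purely illustrative

def trim_command(aoa_sensor_deg: float) -> str:
    # Only one sensor is consulted -- a stuck-high reading looks like a stall.
    if aoa_sensor_deg > AOA_THRESHOLD_DEG:
        return "nose down"   # the software "helps" by pushing the nose down
    return "neutral"

print(trim_command(8.0))    # neutral: normal flight
print(trim_command(74.5))   # nose down: could be a stall -- or a faulty sensor
```

With no second sensor to cross-check, the code above cannot distinguish the two cases, which is exactly why a single bad reading could repeatedly command nose-down trim.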
The Boeing 737 MAX did that when the sensor got faulty, and there was no redundancy for the sensors because that was part of an optional add-on package.
Even worse, the pilots and the airlines didn’t even know the sensor or associated software control existed and could do that.
Pretty sure that’s the Boeing 777 and they discovered that after a crash off Brazil.
“It crashed!”
“Yes but it did it all by itself!”
Except for the last 0.05 seconds before the crash where the human was put in control. Therefore, the human caused the crash.