It’s funny how some of Elongated Muskrat’s testing and experiments involve the subjects dying.
Monkeys died in the Neuralink experiments, and humans are dying in these autopilot tests!
FSD, maybe. But autopilot operates fine and is no different than what most major manufacturers offer.
Edit: Lots of people that have never used Tesla’s or other manufacturers’ lane-keeping systems, I see.
No.
I own a Model 3 and a 2022 Palisade with lane assist, and used to own a Subaru with lane assist.
The Model 3’s Autosteer, exit-to-exit EAP, and auto lane change are very different from the simple lane assist that either of those other cars offers. Honestly, after using EAP for five years, while I do use AP under specific circumstances, I’ve come to the opinion that it is not ready for prime time. It has some major issues, especially the auto lane changing, that should have been worked out before release, and I still never use that feature.
Given my background in embedded software, I honestly think the way they rolled out and advertised these features was reckless.
My vehicle can do almost all the same stuff as “autopilot,” but it turns the autosteering and cruise off if I don’t touch the wheel every 30 seconds. It’s all the same types of sensors, etc. And mine isn’t even a luxury brand, just the higher-end trim package of a budget vehicle.
Edit: actually, it’s just 10 seconds before the warning and another 5 or so before it disables lane-keeping.
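For what it’s worth, that behavior is basically just a watchdog timer on wheel input. Here’s a minimal sketch of the logic as I described it, using my car’s rough timings; this isn’t any manufacturer’s actual code, and the state names are made up:

```python
import time

WARN_AFTER_S = 10      # warn if no wheel touch for ~10 s (my car's timing)
DISABLE_AFTER_S = 15   # lane-keeping cuts out ~5 s after the warning

def attention_watchdog(last_touch_time: float, now: float) -> str:
    """Return the assist state based on seconds since the last wheel touch."""
    idle = now - last_touch_time
    if idle >= DISABLE_AFTER_S:
        return "lane_keeping_disabled"
    if idle >= WARN_AFTER_S:
        return "hands_on_warning"
    return "assist_active"

# Toy usage: driver last touched the wheel 12 seconds ago.
print(attention_watchdog(time.time() - 12, time.time()))  # hands_on_warning
```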
Autopilot also shuts off with no driver input. Faster than 30 seconds too.
What is your point?
I made my point in my comment (not that it was anything earth shattering.)
What’s yours?
Nevermind, I don’t give a fuck.
I made my point in my comment
I don’t understand. You just replied to this person to brag about your car?
What’s yours?
I didn’t make a point, I asked a question.
Last time I tried autopilot was 4 years ago, so I imagine things have gotten better. That said, on a test drive, on a rainy day, auto lane change did some frightening stuff. It thought the lane was clear, learned it wasn’t, then violently ripped the car back into the original lane in conditions that were prime for hydroplaning.
My wife and I were scared shitless, and the woman from Tesla, who was also in the car, tried to reassure us by saying, “it’s ok, this is normal.”
Then we returned the car to the parking lot, and auto park almost took out a kid in an enclosed parking structure.
I imagine it’s gotten better in 4 years, but how that was street legal baffled me.
None of what you mentioned is in basic autopilot. Autopilot is lane keep and traffic aware cruise control only.
Let’s not get pedantic. They are part of the “enhanced autopilot” package.
If these were called “cruise control”, “adaptive cruise control”, and “Blue Cruise” would it matter if the article said “cruise control” but was referring to “Blue Cruise”?
Tesla’s names for these things are “Autopilot”, “Enhanced Autopilot”, and “FSD Beta”.
At the very least, the names matter so that we can all agree we’re talking about the same things.
Auto lane change is not a function of Autopilot
Yes, it is. There are two tiers of Autopilot functionality, Basic and Advanced. This is part of the Advanced Autopilot tier.
https://www.tesla.com/support/autopilot
Tesla refers to those features as “autopilot,” and this former employee is referring to those features as “autopilot” in his whistleblower claims.
It’s called “Enhanced Autopilot” and is distinctly different from “Autopilot”.
This is like arguing that an iPhone Pro isn’t an “iPhone,” it’s an “iPhone Pro.”
Call it whatever you want. This whistleblower, the press, and this comment thread are all referring to unsafe features of Tesla’s L2 automation that are currently available to the public.
This is like arguing that an iPhone Pro isn’t an “iPhone,” it’s an “iPhone Pro.”
Yes it is, and in certain contexts (such as this one) it is very important. Especially considering that Autopilot has been installed on every vehicle made in the last several years and Enhanced Autopilot will be in practically zero.
This whistleblower, the press, and this comment thread are all referring to unsafe features of Tesla’s L2 automation that are currently available to the public.
According to whom? Nothing in the OP title, the OP article or the BBC piece they robbed the story from indicates any of that.
Enhanced Autopilot is very popular. All the hardware is already installed on the car; it just needs to be unlocked by purchasing the subscription in the app. The Full Self Driving package is also unlockable via a software subscription. FSD will be out of beta soon, but Enhanced Autopilot has been a popular purchase for many years. It’s one of the main reasons people buy a Tesla. It is most definitely not on “practically zero” Teslas.
As for “according to whom” - you replied to my comment about my experience with autopilot. So according to me.
Advanced autopilot did some frightening stuff during the little time I spent driving a Model 3. I really wanted to like the Model 3 and was expecting to whip out my checkbook, but that test drive scared the shit out of my wife and me. It made some very dangerous lane changes, and the autonomous parking stuff almost hit a kid in a parking lot. The latter is definitely widely reported; I’m not the only person to have experienced that problem.
Random question I’ve always wondered about in case anyone is more familiar with these systems than me. My understanding is that autopilot relies on optical sensors exclusively. And image recognition tends to rely on getting loads of data to recognize particular objects. But what if there’s an object not in the training data, like a boulder in a weird shape? Can autopilot tell anything is there at all?
Yeah obstructions can be generalized to a road being blocked. Object recognition includes recognizing the shape of an object via curves, shadows, depth, etc. You don’t need to know it’s a boulder to know a large object is in the road.
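To make that concrete: you don’t need a classifier hit at all, just a generic “something close in my lane” check on estimated depth. Here’s a minimal sketch of that idea; this is not Tesla’s actual pipeline, it assumes you already have a per-pixel depth estimate (e.g. from a monocular depth network), and the thresholds are made up:

```python
import numpy as np

def corridor_obstacle(depth_m: np.ndarray, corridor_mask: np.ndarray,
                      stop_distance_m: float = 25.0,
                      min_pixels: int = 200) -> bool:
    """Flag a generic obstruction in the driving corridor.

    depth_m:       (H, W) per-pixel depth estimate in meters.
    corridor_mask: (H, W) boolean mask of the lane region ahead.
    Returns True if enough corridor pixels are closer than the stop
    distance -- no need to know *what* the object is, only that it's there.
    """
    near = (depth_m < stop_distance_m) & corridor_mask
    return int(near.sum()) >= min_pixels

# Toy usage: an unrecognized object ("weird boulder") 10 m ahead in the lane.
depth = np.full((480, 640), 60.0)           # open road, ~60 m of clearance
depth[200:300, 280:360] = 10.0              # unknown object at 10 m
corridor = np.zeros((480, 640), dtype=bool)
corridor[150:480, 220:420] = True           # rough lane region in the image
print(corridor_obstacle(depth, corridor))   # True -> brake / alert
```

The point is that the decision is distance-based, not class-based, so a never-before-seen shape still trips it as long as the depth estimate is sane.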
What about something like snow? Then the vehicle has no idea what to do or where to go.
deleted by creator
I’ve put like 10k miles on AutoPilot. It’s not an experiment.
Edit: before anyone goes and reads all of this, I’ll sum it up:
This is a textbook argument from anecdote. They presented their anecdotal evidence and a conclusion implying that because they’ve done 10k miles, Autopilot accidents shouldn’t be this big of a deal (blown out of proportion). A scalding hot garbage take. I got a little emotionally off topic about causes of death in a fallacious appeal to emotion.
Ah yes, “I’ve personally done this and I’m the most important, therefore it’s not true. It’s the experts and engineers who are wrong.”
Besides “everybody look at me!!”, what is your point?
It’s not that I know better than them. It’s that they are completely blowing things out of proportion.
This is the best summary I could come up with:
“In late 2021, Lukasz realised that—even as a service technician—he had access to a shockingly wide range of internal data at Tesla,” the group’s prize announcement said.
Krupski was also featured last month in a New York Times article titled, “Man vs. Musk: A Whistleblower Creates Headaches for Tesla.”
But Krupski now says that “he was harassed, threatened and eventually fired after complaining about what he considered grave safety problems at his workplace near Oslo,” the NYT report said.
Krupski “was part of a crew that helped prepare Teslas for buyers but became so frustrated with the company that last year he handed over reams of data from the carmaker’s computer system to Handelsblatt, a German business newspaper,” the report said.
The data Krupski leaked included lists of employees and personal information, as well as “thousands of accident reports and other internal Tesla communications.”
Krupski told the NYT that he was interviewed by the NHTSA several times, and has provided information to the US Securities and Exchange Commission about Tesla’s accounting practices.
The original article contains 705 words, the summary contains 172 words. Saved 76%. I’m a bot and I’m open source!
I always assumed that was the case. We are still experimenting with human drivers.
Non-consensual human experimentation is a war crime.
It’s consensual if you buy it though.
Calling it a war crime is slightly extreme.
If you hit another motorist or pedestrian, it’s no longer consensual.
War crime is a tad much sure. Let’s just make it a felony.
Except the other drivers on the road aren’t all in Teslas, yet they are non-consensually and possibly even unknowingly a part of this experiment.
It’s peace time though so it doesn’t qualify
/s