Assuming the news is true that nobody sat (before the crash) in the driver seat, why not start with the autopilot being potentially faulty?
We don’t have to get into what they call it:
- Tesla is selling “Full Self-Driving Capability” today.
- They began rolling out “Full Self-Driving” as a beta months ago.
- Your tweeting overlord has stated “We could have gamed an LA/NY Autopilot journey last year, but when we do it this year, everyone with Tesla Full Self-Driving will be able to do it too”
- …and gone on record saying “I think Autopilot’s getting good enough that you won’t need to drive most of the time unless you really want to.”
- Tesla drivers (or rather, those being driven) are putting out videos that show Full Self-Driving with no hands on the wheel today. EDIT: Better yet, see this one on Youtube.
They’re making people believe (even though they do state otherwise in the fine print) that it’s safe and ready today - practically, if not in the eyes of regulators.
…which is exactly what the police are saying: it’s dangerous.
In the end, without getting into technicalities, it’s pretty simple:
- Tesla and Musk are touting their Autopilot and autonomous driving features as having practically achieved full self-driving today
- while covering themselves only in the fine print and/or small technical nuances (e.g. claiming they’re only selectively enabling self-driving “features”)
- they allow their vehicles to drive autonomously in conditions that might not be appropriate
- while not enforcing a hands-on-the-wheel policy strictly enough.
An interesting first-responder’s point of view regarding the rumors about the incident, but also the new challenges of dealing with an EV on fire.
The bigger picture:
I’m sure, and I don’t think anyone questions this, that even with the best FSD imaginable there will still be accidents, and even fatal ones. The question is whether there will be fewer of them than today without FSD, and I think we can agree on that.
I think a few forum members are guilty of this negative mindset. “This ridiculous ‘autopilot’, such an incorrect name, puts people in danger. Look, people are dying because of it. Tesla is bad! Who needs this much horsepower? This big screen is a huge distraction!” And so on.
The goal is to save lives, and what other way is there to get there than through extensive beta testing and improvement? So far there have been no reports of fatal accidents involving the FSD beta, which is fantastic.
Well, “autopilot” is an incorrect name. We can discuss the rest.
…
Let’s discuss it: I can imagine that the average driver is not as informed as the average pilot about the limitations of the system.
Please refer to my comment above or the original Wikipedia article. Please inform yourself about the origin of the word autopilot (it comes from aviation) and what it does in real life. Just because you interpret it in your own way doesn’t mean the term in itself is misleading.
Worth reading too: Hypothesis of the Houston Tesla crash.
https://www.facebook.com/1648385819/posts/10222348601377739/?d=n
If you let off the accelerator, the car begins to brake using the motor, slowing down and regenerating electricity.
Actually, there are 3 settings: full regen, reduced regen and no regen. You can turn regen braking off. A fully charged battery will also disable regen, and so will cold weather. It can make for an unpleasant ride once you’re used to regen and then one day it doesn’t work. I normally don’t use the brake thanks to regen. Otherwise I don’t think this post brings much to the story, just some thoughts of a guy.
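The conditions above can be sketched as a small decision function. To be clear, this is a hypothetical illustration, not Tesla’s actual logic; the setting names and the temperature threshold are assumptions for the sake of the example.

```python
# Hypothetical sketch of when regenerative braking can engage,
# based on the conditions described above. NOT Tesla's real code.

def regen_available(setting: str, battery_pct: float, battery_temp_c: float) -> bool:
    """Return True if regenerative braking can engage.

    setting:        one of "full", "reduced", "off" (the 3 settings mentioned)
    battery_pct:    state of charge, 0-100
    battery_temp_c: battery temperature in degrees Celsius
    """
    if setting == "off":
        return False  # driver disabled regen entirely
    if battery_pct >= 100:
        return False  # a full battery cannot absorb more charge
    if battery_temp_c < 0:
        return False  # cold battery limits regen (threshold is an assumption)
    return True
```

The point is just that regen is conditional: a driver used to one-pedal driving can be surprised when any of these conditions quietly turns it off.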
You’re right, it doesn’t add anything to the story. I just put it in to highlight the fact that the average driver may not understand the functionality and/or has incorrect expectations, such as holding the accelerator down and crashing.
Pilots’ and drivers’ training levels are extremely different. Side note: most drivers are not professional drivers.
Side note: all pilots are licensed pilots and all drivers are licensed drivers. OK, controlling autopilot is not part of the driving exam, and probably never will be, since we’ll reach full autonomy first. But who knows, maybe during the few decades where human-driven and self-driven cars co-exist, drivers will have to show that they understand the smart features of the car.
But aren’t we talking about some edge cases here? Someone somewhere is always bound to fatally misinterpret how things work. It doesn’t make World news every time. Someone walks into an empty elevator shaft, someone smokes a cigarette at a gas station. This will maybe get some local coverage. But in case of bad Tesla news, this provides validation to all Tesla haters, so it’s news that is really sought after.
I’d be interested in the opinion of someone who has experience as a Tesla driver and as a plane pilot.
And there’s also much less stuff happening in the air. No dog or person whatsoever will jump in front of your plane once you’re at cruising altitude…
I was about to craft a reply with my usual style of multiple quotes, but I discarded it (edit: and it’s not aimed at you personally, SwissTeslaBull).
Whatever the heck you call the autonomous and assistance features in play or not here:
FSD or no FSD? Autopilot or no Autopilot? What does it actually do, and how are they advertising it to their customers? Am I biased, or are the Tesla haters, and just looking for confirmation?
You can try spinning all that however you want.
But ultimately, we really don’t need to get sidetracked by such technicalities here.
The simple fact is: According to media reports and police statements the vehicle crashed at high speed, driving over grass, with (apparently) no one in the driver seat.
If these reports are indeed true, any vehicle allowing for such behaviour is dangerous.
I gave up discussing Autopilot with people that didn’t even bother to test drive a Tesla ever. Mostly just envious, sad folks, upset that their legacy car sucks.
The driver was not found in the driver’s seat after the crash. Have you ever seen crash tests (or actual accidents) without seatbelts? Bodies are flying around.
You are so prejudiced, it’s sick. You can put a brick on the pedal of a car and it will also drive without anyone in the driver’s seat. Let’s ban kitchen knives, they can be used to kill! Let’s sue McDonalds for not writing that the tea is hot and can cause skin burn!
Are we going to treat people as responsible beings or are we going to ban everything that could be used incorrectly?
These are isolated cases; we hear about every deadly Tesla crash, whether or not Autopilot was involved. Now, if there were hundreds of documented crashes caused by Autopilot abuse, it would be reasonable to say: OK, we need to control it better. But that’s not the case. You only enjoy news like this because it provides you with psychological validation of your dislike towards Tesla.
For the record, I don’t drive a legacy car, nor a newfangled Tesla.
Neither do I plan to. So I’m not envious about Tesla cars or any other cars.
…usually they don’t magically end up in the passenger and rear seats.
Yes, you’re probably right there.
And I’ll tell you why.
I don’t like too much hyperbole, empty promises and plain …corporate bullshit.
Especially if others or the public are taking it at face value.
I also don’t like corporate irresponsibility and duplicitousness.
They’re blurring the line between driver-assistance features and fully autonomous driving. They’re claiming again and again that they’ve practically achieved the latter and that it’s ready to go, going so far as to sell it to customers today, while backtracking from such claims in the “fine print”. They’re thereby endangering their (non-professional driving) customers and others, irrespective of whether their system saves lives “elsewhere”, on other occasions.
And I believe they’re doing it deliberately, to push sales of their cars and prop up their stock price.
And that’s why I don’t like them.
I’m not against autonomous driving or assistance features. I wouldn’t be bothered by them taking the automotive industry from their Asian and European competitors or anything like that, if they behaved like a responsible company and/or CEO.


