(no title)
frabjoused|1 year ago
When I did the trial on my Tesla, I also noted these kinds of things and felt like I had to take control.
But at the end of the day, only the numbers matter.
timabdulla|1 year ago
Even if it is true that the data show that with FSD (not Autopilot) enabled, drivers are in fewer crashes, I would be worried about other confounding factors.
For instance, I would assume that drivers are more likely to engage FSD in situations of lower complexity (less traffic, little construction or other impediments, simpler traffic control overall, etc.). I also believe that, at least initially, Tesla only released FSD to drivers with high safety scores relative to their total driver base, another obvious confounding factor.
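To make the confounding concern concrete, here is a minimal sketch with entirely invented numbers (the road types, mileage split, and per-mile crash rates are assumptions for illustration, not Tesla data). It shows how the mileage mix alone can make FSD look safer in aggregate even when, by construction, it is no safer on any given road type:

    # Illustrative only: all numbers are invented to show how a confounder
    # (road type) can skew an aggregate crash-rate comparison.

    # Hypothetical crashes per million miles, identical for FSD and manual
    # driving on the same road type (i.e. FSD adds no safety benefit here).
    crash_rate = {"highway": 0.2, "city": 1.0}

    # Hypothetical mileage mix (millions of miles): FSD gets engaged mostly
    # on easy highway miles, manual driving covers more complex city miles.
    miles = {
        "fsd":    {"highway": 9.0, "city": 1.0},
        "manual": {"highway": 4.0, "city": 6.0},
    }

    def aggregate_rate(mix):
        crashes = sum(crash_rate[road] * m for road, m in mix.items())
        return crashes / sum(mix.values())

    print(f"FSD:    {aggregate_rate(miles['fsd']):.2f} crashes per million miles")
    print(f"Manual: {aggregate_rate(miles['manual']):.2f} crashes per million miles")

With these made-up inputs the aggregate rates come out to roughly 0.28 vs 0.68 crashes per million miles, a ~2.4x apparent advantage produced purely by where the system gets used, not by how safe it is.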
Happy to be proven wrong though if you have a link to a recent study that goes through all of this.
valval|1 year ago
[deleted]
rvnx|1 year ago
You have a Mercedes, Mercedes takes responsibility.
You have a Tesla, you take responsibility.
Says a lot.
sebzim4500|1 year ago
Technically, that is the easiest way to get a perfect safety record and journalists will seemingly just go along with the charade.
_ea1k|1 year ago
Maybe that could be a problem with future versions, but I don't see it happening with 12.3.x. I've also heard that driver attention monitoring is pretty good in the later versions, but I have no first hand experience yet.
valval|1 year ago
I wonder how things are inside your head. Are you ignorant or affected by some strong bias?
kelnos|1 year ago
How often does an autonomous driving system get the driver into a dicey situation, but the driver notices the bad behavior, takes control, and avoids a crash? I don't think we have publicly-available data on that at all.
You admit that you ran into some of these sorts of situations during your trial. Those situations are unacceptable. An autonomous driving system should be safer than a human driver, and should not make mistakes that a human driver would not make.
Despite all the YouTube videos out there of people doing unsafe things with Tesla FSD, I expect that most people that use it are pretty responsible, are paying attention, and are ready to take over if they notice FSD doing something wrong. But if people need to do that, it's not a safe, successful autonomous driving system. Safety means everyone can watch TV, mess around on their phone, or even take a nap, and we still end up with a lower crash rate than with human drivers.
The numbers that are available can't tell us if that would be the case. My belief is that we're absolutely not there.
gamblor956|1 year ago
Also, Tesla is known to disable self-driving features right before collisions to give the appearance of driver fault.
And the coup de grace: if Tesla's own data showed that FSD was actually safer, they'd be shouting it from the moon, using that data to get self-driving permits in CA, and offering to assume liability if FSD actually caused an accident (like Mercedes does with its self driving system).
lawn|1 year ago
Oh? Who is presenting the numbers?
Is a crash that fails to trigger the airbags still not counted as a crash?
What about the car turning off FSD right before a crash?
How about adjusting for factors such as age of driver and the type of miles driven?
The numbers don't hold up because they aren't apples-to-apples comparisons; they're framed to make Tesla look good.
johnneville|1 year ago
For whatever data does exist, it is also easy to imagine how it could be misleading. For instance, I've disengaged FSD when I noticed I was about to be in an accident. If I couldn't recover in time, the accident would not have happened while FSD was on and, depending on the metric, would not be reported as an FSD-induced accident.
kybernetikos|1 year ago
Are these the numbers reported by tesla, or by some third party?
concordDance|1 year ago
"To ensure our statistics are conservative, we count any crash in which Autopilot was deactivated within 5 seconds before impact"
https://www.tesla.com/en_gb/VehicleSafetyReport
Situations which inevitably cause a crash more than 5 seconds later seem like they would be extremely rare.
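As a rough sketch of how the quoted counting rule behaves (this is a hypothetical reconstruction of the stated policy, not Tesla's actual code or data schema; the function name and timestamps are invented for illustration):

    # A crash is attributed to the driver-assistance system if it was still
    # active at impact, or was deactivated within the 5 seconds before impact.
    ATTRIBUTION_WINDOW_S = 5.0

    def counted_as_autopilot_crash(deactivation_time_s, impact_time_s):
        """True if the system was active at impact (deactivation_time_s is None)
        or was switched off within the attribution window before impact."""
        if deactivation_time_s is None:
            return True
        return (impact_time_s - deactivation_time_s) <= ATTRIBUTION_WINDOW_S

    # Still engaged at impact: counted.
    print(counted_as_autopilot_crash(None, 100.0))                  # True
    # Driver disengages 2 s before impact: still counted against the system.
    print(counted_as_autopilot_crash(98.0, impact_time_s=100.0))    # True
    # Driver disengages 8 s before impact: falls outside the window,
    # counted as a manual-driving crash.
    print(counted_as_autopilot_crash(92.0, impact_time_s=100.0))    # False

Only a disengagement more than five seconds before impact, as in the last case, would fall outside the window.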