angusb|8 months ago
re. extrapolation: I agree with that, but remember there's sampling error. The crashes and failures go viral, while the lives saved get zero exposure or headlines. I don't think that means you can ignore issues like this, but it does mean it's sensible to augment the data point of this video by imagining the scenarios where the self-driving car performs more safely than the average human driver.
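A back-of-envelope sketch of that exposure effect, in Python; every number below is made up purely to illustrate the shape of the argument, not taken from any real crash or media data:

    # All values hypothetical: chosen to show the sampling/exposure bias,
    # not sourced from Tesla, NHTSA, or anyone else.
    MILES = 100_000_000         # miles driven per fleet (hypothetical)
    HUMAN_CRASH_RATE = 2e-6     # human crashes per mile (hypothetical)
    FSD_CRASH_RATE = 1e-6       # suppose FSD is actually 2x safer (hypothetical)
    P_HEADLINE_HUMAN = 0.001    # fraction of human crashes that make the news
    P_HEADLINE_FSD = 0.5        # fraction of FSD crashes that go viral

    human_crashes = MILES * HUMAN_CRASH_RATE              # 200
    fsd_crashes = MILES * FSD_CRASH_RATE                  # 100
    human_headlines = human_crashes * P_HEADLINE_HUMAN    # 0.2
    fsd_headlines = fsd_crashes * P_HEADLINE_FSD          # 50

    print(f"human: {human_crashes:.0f} crashes -> {human_headlines:.1f} headlines")
    print(f"fsd:   {fsd_crashes:.0f} crashes -> {fsd_headlines:.1f} headlines")
    # Under these assumptions the safer fleet produces ~250x the headlines,
    # because headline counts track newsworthiness, not underlying risk.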
fabian2k|8 months ago
This is a major failure; failing to observe a stop sign and a parked school bus are critical mistakes. If you can't manage those, you're not ready to be on the road without a safety driver yet. There was nothing particularly difficult about this situation; these are the basics you must handle reliably before we even get to all the trickier situations those cars will encounter in the real world at scale.
angusb|8 months ago
The thing I'm trying to reconcile is that even very safe drivers make critical mistakes, just extremely rarely, so the threshold at which FSD is safer than even the top 10% of human drivers likely still includes some nonzero rate of critical mistakes. Right now several people are mining Tesla's FSD for any place it makes critical mistakes, and these are well publicised, so I think we get an inflated sense of how common they are. This is speculation, but if true it leaves open the possibility of the system being significantly safer than the median driver while videos like this still proliferate (rough numbers sketched below).
I do wish Tesla released all stats for interventions/near misses/crashes so we could have a better and non-speculative discussion about this!
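To put rough numbers on the threshold point, here's a sketch where both rates are entirely hypothetical (neither the top-decile mistake rate nor the fleet mileage comes from any published source):

    # Fleet-scale arithmetic with assumed rates; the values are illustrative.
    MISTAKES_PER_MILE = 1 / 500_000   # top-decile human: one critical mistake per 500k miles (assumed)
    FLEET_MILES_PER_DAY = 5_000_000   # daily FSD fleet mileage (assumed)

    daily_mistakes = MISTAKES_PER_MILE * FLEET_MILES_PER_DAY
    print(f"expected critical mistakes per day across the fleet: {daily_mistakes:.0f}")
    # Roughly 10 per day even at top-decile-human safety. With people
    # actively hunting for failures and filming every one, videos like this
    # can proliferate while the system is still safer than the median driver,
    # which is exactly why released intervention/crash stats would settle it.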
reaperducer|8 months ago
Ma'am, we're sorry your little girl got splattered all over the road by a billionaire's toy. But, hey, sampling errors happen.
sokoloff|8 months ago
If the Tesla can't stop for the bus (not the kid) in 12 car lengths, that's not p-hacking, that's Tesla FSD being both unlawful and obviously unsafe.
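For scale, here's the stopping-distance kinematics; the car length, deceleration, latency, and speeds below are assumptions, not measurements from the video:

    # Standard stopping-distance kinematics; all inputs are assumptions.
    CAR_LENGTH_M = 4.5          # typical sedan length (assumed)
    GAP_M = 12 * CAR_LENGTH_M   # "12 car lengths" is roughly 54 m
    DECEL_MS2 = 7.0             # hard braking on dry pavement (assumed)
    REACTION_S = 0.5            # generous detection/actuation latency (assumed)

    def stopping_distance_m(speed_mph):
        v = speed_mph * 0.44704                    # mph to m/s
        return v * REACTION_S + v * v / (2 * DECEL_MS2)

    for mph in (25, 35, 45):
        d = stopping_distance_m(mph)
        verdict = "stops in time" if d <= GAP_M else "cannot stop"
        print(f"{mph} mph: needs {d:.0f} m of {GAP_M:.0f} m -> {verdict}")
    # About 15 m at 25 mph, 25 m at 35 mph, 39 m at 45 mph, all well inside
    # ~54 m, so failing to stop is a perception/decision failure, not physics.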