top | item 44288392

angusb | 8 months ago

re. extrapolation: I agree with that, but remember there's sampling error. The crashes and failures go viral, while the lives saved get zero exposure or headlines. I don't think that means you can just ignore issues like this, but I think it does mean it's sensible to try to augment the data point of this video by imagining the scenarios where the self-driving car performs more safely than the average human driver.

fabian2k | 8 months ago

I absolutely do think that self-driving cars will save many lives in the long run. But I also think it is entirely fair to focus on the big, visible mistakes right now.

This is a major failure: failing to observe a stop sign and a parked school bus are critical mistakes. If you can't manage those, you're not ready to be on the road without a safety driver yet. There was nothing particularly difficult about this situation; these are the basics you must handle reliably before we even get to all the trickier situations those cars will encounter in the real world at scale.

angusb | 8 months ago

I agree it's a major mistake and should get a lot of focus from the FSD team. I'm just unsure whether that directly translates into prohibiting a robotaxi rollout (though I'm open to the possibility that it should).

I guess the thing I'm trying to reconcile is that even very safe drivers make critical mistakes, just extremely rarely, so the threshold at which FSD is safer than even the top 10% of human drivers likely still includes some nonzero rate of critical mistakes. Right now several people are mining Tesla's FSD for any place it makes critical mistakes, and these are well publicised, so I think we get an inflated sense of how common they are. This is speculation, but if true it leaves open the possibility that FSD is significantly safer than the median driver while still allowing videos like this to proliferate.
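A quick back-of-envelope sketch of that argument. Every number below is hypothetical, chosen only to make the arithmetic concrete (these are not real Tesla or NHTSA figures):

```python
# Illustration of the base-rate point above: a system that is genuinely
# safer in aggregate can still produce plenty of filmable failures at
# fleet scale. All rates and mileages are HYPOTHETICAL.

HUMAN_CRASH_RATE = 1 / 500_000           # hypothetical: 1 critical mistake per 500k miles
FSD_CRASH_RATE = HUMAN_CRASH_RATE / 2    # hypothetical: FSD is twice as safe
FLEET_MILES = 100_000_000                # hypothetical annual fleet mileage

human_crashes = HUMAN_CRASH_RATE * FLEET_MILES   # expected critical mistakes, human fleet
fsd_crashes = FSD_CRASH_RATE * FLEET_MILES       # expected critical mistakes, FSD fleet

# Even at half the human rate, ~100 incidents a year is ample raw
# material for a steady stream of alarming videos.
print(f"human-driven fleet: {human_crashes:.0f} expected critical mistakes")
print(f"FSD fleet:          {fsd_crashes:.0f} expected critical mistakes")
```

The point is only that "safer on average" and "viral failure videos exist" are compatible; it says nothing about whether FSD actually is safer.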

I do wish Tesla released all stats for interventions/near misses/crashes so we could have a better and non-speculative discussion about this!

locococo | 8 months ago

No! Ignoring a stop sign is such a basic driving standard that it's an automatic disqualification. A driver who misses a stop sign would not have my kids in their car. They could be the safest driver on the racetrack; it does not matter at that point.

reaperducer | 8 months ago

I agree with that, but remember there's sampling error.

Ma'am, we're sorry your little girl got splattered all over the road by a billionaire's toy. But, hey, sampling errors happen.

dzhiurgis | 8 months ago

Also, they've repeatedly tested closer and closer distances until the Tesla failed, aka p-hacking.

sokoloff | 8 months ago

In the video (starting at ~13 seconds), the Tesla is at least 16 and probably 20 car lengths from the back of the bus, with the bus's red flashing lights on the entire time.

If the Tesla can't stop for the bus (not the kid) in 12 car lengths, that's not p-hacking; that's Tesla FSD being both unlawful and obviously unsafe.
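A rough stopping-distance check supports this. The inputs below are assumptions for illustration, not measurements from the video: a 25 mph approach speed, 0.7 g braking, a generous 1.5 s reaction delay, and a 4.5 m car length.

```python
# Rough check: how many car lengths does a full stop actually need under
# the ASSUMED conditions above? (25 mph, 0.7 g braking, 1.5 s reaction,
# 4.5 m car length -- illustrative values, not data from the video.)

MPH_TO_MS = 0.44704
G = 9.81                                  # gravitational acceleration, m/s^2

speed_ms = 25 * MPH_TO_MS                 # ~11.2 m/s
reaction_m = 1.5 * speed_ms               # distance covered before braking starts
braking_m = speed_ms**2 / (2 * 0.7 * G)   # braking distance: v^2 / (2a)
total_m = reaction_m + braking_m          # ~26 m total stopping distance

car_lengths = total_m / 4.5               # ~5.7 car lengths
print(f"total stopping distance: {total_m:.1f} m (~{car_lengths:.1f} car lengths)")
```

Under those assumptions the full stop takes under 6 car lengths, so 12 car lengths (roughly 54 m) is about double the margin even an unhurried stop requires.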