top | item 44288624


angusb | 8 months ago

I agree it's a major mistake + should get a lot of focus from the FSD team. I'm just unsure whether that directly translates to prohibiting a robotaxi rollout (I'm open to the possibility it should though).

I guess the thing I'm trying to reconcile is that even very safe drivers make critical mistakes extremely rarely, so the threshold at which FSD is safer than even the top 10% of human drivers likely includes some nonzero level of critical mistakes. Right now Tesla has several people mining FSD for any place it makes critical mistakes, and these are well publicised, so I think we get an inflated sense of how common they are. This is speculation, but if true it leaves some possibility of it being significantly safer than the median driver while still allowing videos like this to proliferate.

I do wish Tesla released all stats for interventions/near misses/crashes so we could have a better and non-speculative discussion about this!


ethbr1 | 8 months ago

Zero systemic, reproducible mistakes is the only acceptable criterion.

Do you really want to trust a heartless, profit-motivated corporation with 'better than human is good enough'?

What happens when Tesla decides they don't want to invest in additional mistake mitigation, because it's incompatible with their next product release?

Reubachi | 8 months ago

Caveat/preface to prevent trolls: FSD is a sham and a money grab at best, a death trap at worst, etc.

But I've read through your chain of replies to OP, and maybe I can help with my POV.

OP is replying in good faith, showing that "this sampling incident is out of scope of production testing/cars for several reasons, all of which greatly skew the testing from this known bad-actor source."

And you reply with "Zero systemic, reproducible mistakes is the only acceptable criterion."

Well then, you should know: that is the current situation. In Tesla's testing, they achieve this. The "test" in this article, which OP is pointing out, is not a standardized test run by Tesla on current platforms. So be careful with your ultimatums, or you might give the corporation a green light to say "look! we tested it!"

I am not a Tesla fan. However, I am also aware that yesterday, thousands of people across the world were mowed down by human operators.

If I put out a test video showing that a human runs over another human with minimum circumstances met (i.e. rain, distraction, tires, density, etc.), would you call for a halt on all human driving? Of course not; you'd investigate the root cause, which is, most of the time, distracted or impaired driving.

angusb | 8 months ago

> Do you really want to trust...

No, but the regulator helps here - they do their own independent evaluation.

> What happens when Tesla decides...

The regulator should pressure them for improvements, and suspend licenses for self-driving services that don't improve.

spwa4 | 8 months ago

It should get a lot of focus from the regulator, not "the FSD team".

ethbr1 | 8 months ago

Tesla and this administration operate under the Boeing model: surely the manufacturer knows best.

angusb | 8 months ago

Agreed - I said "FSD team" to distinguish from "the crowds", but this was the wrong wording; it should be the regulator too.