top | item 45332643

SillyUsername|5 months ago

A lot of apologists say that "a human would have hit that".

That's kind of irrelevant; this technology is meant to be safer and held to a higher standard.

Comparing to a human is not a valid excuse...

littlecranky67|5 months ago

> That's kind of irrelevant, this technology is meant to be safer and held to a higher standard

I don't think that is the case. We will judge FSD by whether it causes more or fewer accidents than humans, not necessarily in the same situations. The computer is allowed to make mistakes a human wouldn't if, in return, it makes far fewer mistakes in situations where humans would.

Given that >90% of accidents are easily avoidable (speeding, not keeping enough safety distance, drunk/tired driving, distraction due to smartphone usage), I think we will see FSD be safer on average very quickly.

breve|5 months ago

> I don't think that is the case.

It's the standard Tesla set for themselves.

In 2016 Tesla claimed every Tesla car being produced had "the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver": https://web.archive.org/web/20161020091022/https://tesla.com...

Wasn't true then, still isn't true now.

sumeno|5 months ago

> I think we will see FSD be safer on average very quickly.

This is what Musk has been claiming for almost a decade at this point, and yet here we are.

piva00|5 months ago

> The computer is allowed to make mistakes that a human wouldn't, if in reverse the computer makes a lot less mistakes in situations where humans would.

This subverts all of the accumulated experience other road users have about what a car will do. Everyone is used to the potential issues caused by humans; on top of that, other road users would now have to learn the quirks of FSD and keep an eye out for abnormalities in behaviour?

That's just unrealistic: not only will people have to deal with whatever other drivers can throw at them (e.g. veering out of their lane due to inattention), they will also have to be careful around Teslas, which can phantom-brake out of nowhere, fail to avoid debris (shooting it along unpredictable paths), etc.

I don't think we should accept new failure modes on the road for FSD. Requiring everyone else to learn them and stay on alert is just a lot more cognitive load...

ACCount37|5 months ago

That's the main advantage self-driving has over humans now.

A self-driving car of today still underperforms a top-of-the-line human driver - but it sure outperforms the "0.1% worst case": the dumbest, most inebriated, sleep-deprived, and distracted reckless driver who is responsible for the vast majority of severe road accidents.

Statistics show it plain and clear: self-driving cars already get into fewer accidents than humans, and the accidents they do get into are much less severe too. Their performance is consistently mediocre. Being unable to drink and drive is a big part of where their safety edge comes from.

cosmicgadget|5 months ago

I don't think the decision should be or will be made based on a single axis.

Lionga|5 months ago

A human would not have hit that; the two guys saw it coming from a long way off and would have stopped or changed lanes.

jjav|5 months ago

> That's kind of irrelevant, this technology is meant to be safer and held to a higher standard.

True, but not even relevant to this specific example. The humans clearly saw it and would not have hit it, so we have a very clear example where Tesla is far inferior to humans.

keyle|5 months ago

Indeed... You can see the driver reaching for the wheel; presumably he saw it coming and would have hit the brakes. He left the car to do its thing, thinking it knows better than him... maybe.

jjav|5 months ago

> presumably he saw it coming

Not presumably; we know for sure, since they were talking about it for a long time before impact.

The point of the experiment was to let the car drive so they let it drive and crash, but we know the humans saw it.

HarHarVeryFunny|5 months ago

Personally if the road was empty as here, I'd have steered around it.

These were really the best possible driving conditions - bright day, straight dry road, no other cars around - and still it either failed to see the debris or chose to run over it rather than steering around it or stopping. Of all the random things that could happen on the road, encountering a bit of debris under ideal driving conditions seems like the sort of thing it should handle better.

And yet Tesla is rolling out robo taxis with issues like this still present.