Human-driven cars cost people their lives multiple times every day, though, so I don't think the calculation can be quite that simple. As self-driving cars are rolled out, I think each incident like this needs to be studied to see how avoidable it was, whether a human would have been able to resolve it, and what changes can be made.
There are always going to be fuck ups at some level. The question is whether we’re moving from a world of more fuckups to fewer or not.
The argument that self-driving cars are a win as long as they cause fewer incidents than human drivers has to go, because that only works if the statistics of the environment are stationary.
But in the case of self driving cars, who do we find at fault? Have we even answered that question? I mean did the Waymo car even get a ticket for blocking the ambulance?
If you are to believe Waymo’s safety stats, they have fewer accidents and injuries per mile driven.
But whether reducing injuries at a statistical level outweighs the downside of autonomous vehicles causing accidents (even at lower rates) is a bit of a dilemma.
The human side of those stats, whenever I've seen them presented next to self-driving car stats, has always been an aggregate of all human driving, a vast amount of which is in environments or conditions that Waymo doesn't operate in.
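A toy calculation makes this concrete. All the numbers below are made up purely for illustration (they are not real Waymo or human-driving statistics), but they show how a pooled per-mile rate can flip the comparison when one driver population skips the hard environments:

```python
# Hypothetical incident counts per driving environment.
# "miles_m" = millions of miles driven; numbers are invented for illustration.
human = {
    "easy_urban": {"miles_m": 100, "incidents": 200},  # 2.0 per M miles
    "hard_rural": {"miles_m": 100, "incidents": 600},  # 6.0 per M miles
}
av = {
    # The AV fleet only operates in the easy environment.
    "easy_urban": {"miles_m": 50, "incidents": 125},   # 2.5 per M miles
}

def rate_per_million_miles(stats):
    miles = sum(s["miles_m"] for s in stats.values())
    incidents = sum(s["incidents"] for s in stats.values())
    return incidents / miles

# Pooled human rate mixes easy and hard driving: 800 / 200 = 4.0 per M miles.
# The AV's pooled rate is 125 / 50 = 2.5 per M miles, which looks better,
# yet in the one environment both actually drive, humans are safer (2.0 < 2.5).
print(rate_per_million_miles(human))  # 4.0
print(rate_per_million_miles(av))     # 2.5
```

The point is only that an aggregate-vs-aggregate comparison is apples to oranges unless the mileage is stratified by the conditions each fleet actually drives in.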
I think the style of incidents and the circumstances around them probably are neglected in those comparisons. But even if they're not, I think there are other reasons we notice Waymo issues more, akin to how nuclear power and air travel are safer than coal and car travel: that may be true, but when something does go wrong in those fields, we notice.
Now there will be a single company to sue instead of lots of individuals. If you want to be rich, start a law firm that focuses on autonomous vehicle accidents, like all the truck crash firms out there.
No, we’re finding edge cases that come up once every million or so miles these things are putting on the road. Which means they are pretty damn good given how many are on the road right now.
These "edge cases" were required knowledge to get a license in my home country. You make room for any emergency vehicles, you don't try to score an ultra kill when passing a school bus and you certainly don't drive on rail tracks.