top | item 23163628

Guest0918231 | 5 years ago

I don't think it's a fair comparison.

There's a difference between someone crashing their own Volvo and someone sitting in the backseat of a Waymo that misinterprets the lane markers and crashes into a barrier. One is not news; the other is front-page news.

I guarantee the first Waymo death is going to be publicized everywhere. It doesn't need to be a trend, and as I said, it doesn't mean the cars are more dangerous than human drivers. However, it's going to be news. There are going to be all sorts of moral debates when an algorithm decides to drive over a child instead of turning into an oncoming car. People will want answers. How does Waymo rank the value of different lives? I imagine it'll be news for the next decade until there are thousands of deaths and it becomes normal.

It's an industry that's going to be full of "firsts", and those firsts are going to be the news. Waymo drives over dog while auto-piloting to a parking lot. Waymo mistakes grocery cart for stroller and swerves into elderly man. Waymo kills cyclist when poor weather disrupts sensors. It doesn't matter if they ship something safe. Get enough cars on the road and these things will happen, and people will be talking about them.

tialaramex | 5 years ago

> There are going to be all sorts of moral debates when an algorithm decides to drive over a child instead of turning into an oncoming car.

No. That's a trolley problem. It's an interesting intellectual exercise, and you can maybe win a debate-team trophy for a rousing defence of one choice or the other, but these moral decisions aren't actually ones that drivers make, whether they're humans or a powerful AI.

People keep acting as though this is an unprecedented situation and invoking weird moral beliefs about thinking machines, when it's actually utterly routine. Let's try another exercise:

How many headlines have you read about a specific brand of elevator decapitating a child? Is it none? Do you see anybody pushing for the big elevator manufacturers to have to reveal how they "rank the value of different lives"? No?

That's not because nobody dies this way. It's because we say: oh, that's just a machine. Obviously if things go wrong you can get seriously injured, and the machine doesn't know whether you're a nun or a basketball champion. It isn't trying to kill, or not kill, anybody in particular. It's just a machine.

Humans are often tempted to try "escape manoeuvres", and these almost invariably go wrong. We don't teach machines to try such manoeuvres, because the machines are trained on real performance data, not on someone's model of themselves as an immortal superhero.

One of the first Waymo crashes involved somebody trying an escape manoeuvre. They found themselves in a potential collision, and rather than doing the correct thing (brake to reduce speed and hit the thing in front of you, because it's too close to avoid), they tried an abrupt swerve, lost control of course, crossed a median, and smashed into an unrelated oncoming Waymo car at high speed, writing off both vehicles. Humans do stuff like that; you can try training them not to, but they won't listen. The machines don't have that problem, so it's less "Should I kill the nun, the pregnant woman, or the Olympic champion?" and more "Despite maximum braking effort, a collision has become inevitable. Preparing safety systems for impact."