top | item 39377594

Mageek | 2 years ago

The article says it: “We determined that due to the persistent orientation mismatch of the towed pickup truck and tow truck combination, the Waymo AV incorrectly predicted the future motion of the towed vehicle.” It was detected, but it predicted the truck would move in a way that it didn’t end up moving.

PH95VuimJjqBqy | 2 years ago

I don't find that acceptable in any way. No human driver is going to do that, and by that I mean no human driver is going to drive into something just because it moved in a way they didn't expect. They're going to slam on the brakes, and the only way a collision still happens is if their momentum is too high.

I understand we have to have explanations or we can't fix these systems, but it's just as important to understand that this should never have happened even WITH the described failure.

If I had to guess, there's code to avoid stopping at every little thing, and that code took precedence (otherwise rides would not be enjoyable). I get the competing interests here, but there must be a comparison to human performance when these incidents happen.

duped | 2 years ago

> no human driver is going to drive into something just because it moved in a way they didn't expect

I would actually put money on this being the cause of most crashes involving multiple moving cars. Hell, a friend of mine got into an accident two weeks ago where they t-boned somebody who turned onto a median when they didn't expect it.

dmd | 2 years ago

> no human driver is going to drive into something just because it moved in a way they didn't expect.

This is literally the cause of almost every human accident.

Imagine you're driving. There's a car in front of you, also driving, at the same speed as you. Do you immediately slam on the brakes? No, because you EXPECT them to keep driving. That is how driving works.

If, suddenly, they do something unexpected, like slamming on the brakes, that might cause an accident. Because... they moved in an unexpected way.

I honestly can't even figure out what you meant to say.

BHSPitMonkey | 2 years ago

> I don't find that acceptable in any way

Well obviously even Waymo agrees, given that they're recalling vehicles to mitigate the issue.

peddling-brink | 2 years ago

If I have to choose between driving next to the nitwit texting or the software that might get tripped up in really unusual situations, I’m going with the software.

piombisallow | 2 years ago

"no human driver"? Really? Ever? Are you willing to bet on that assertion? Even if the human driver downs a bottle of vodka before driving?

pests | 2 years ago

How do you drive into a solid wall if you have lidar, though? To say nothing of predictions, the object is where it's at at that moment. You don't need to predict where it's at now, because you know where it's at.

bobsomers | 2 years ago

You can't drive if you only use the current "frame" of data as the basis for your decision. Imagine driving on the highway, a comfortable distance behind a lead vehicle.

The planning software would want to slam on the brakes without predicting that the blob of sensor data in front of you is going to continue moving forward at highway speeds. That motion prediction enables the planning software to know that the space in front of your vehicle will be unoccupied by the time you reach it.
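The highway-following point above can be sketched as a toy constant-velocity check. Everything here is hypothetical (the function name, the 3-second horizon, the round speeds); it just shows why a planner reasons about where the gap will be, not where it is:

```python
# Hedged sketch (hypothetical names and parameters): why a planner needs
# motion prediction. Treat the lead vehicle as a tracked blob with a position
# and a velocity, and ask whether the space ahead will still be occupied
# when the ego vehicle arrives, extrapolating both at constant speed.

def will_space_be_occupied(gap_m, ego_speed_mps, lead_speed_mps, horizon_s=3.0):
    """Return True if the ego closes the gap within the prediction horizon."""
    closing_speed = ego_speed_mps - lead_speed_mps
    if closing_speed <= 0:
        return False  # gap is constant or growing; no conflict predicted
    time_to_close = gap_m / closing_speed
    return time_to_close <= horizon_s

# Comfortable following: 30 m gap, both vehicles at 30 m/s.
print(will_space_be_occupied(30.0, 30.0, 30.0))   # False: space predicted clear
# Lead vehicle suddenly slows to 10 m/s: gap closes in 1.5 s.
print(will_space_be_occupied(30.0, 30.0, 10.0))   # True: time to brake
```

Without the prediction step (i.e. treating the lead vehicle as a static obstacle 30 m ahead), the first case would also demand braking, which is exactly the "slam on the brakes every frame" behavior the comment describes.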

A similar prediction error was the reason Cruise rear-ended the bendy bus in SF a while back. The system segmented the front and rear halves of the articulated bus as two separate entities rather than one connected vehicle, and mispredicted the motion of the rear half.

avalys | 2 years ago

It wasn’t a solid wall, it was another vehicle.

The brakes don’t respond immediately; you need to be able to detect that a collision is imminent several seconds before it actually occurs.

This means you have to also successfully exclude all the scenarios where you are very close to another car, but a collision is not imminent because the car will be out of the way by the time you get there.

Yes, at some point before impact the Waymo probably figured out that it was about to collide. But not soon enough to do anything about it.
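The lead-time argument above can be made concrete with a crude time-to-collision heuristic. The numbers and names here are illustrative assumptions, not Waymo's actual parameters:

```python
# Hedged sketch: why imminent-collision detection needs lead time.
# Compares naive time-to-collision against the time needed to actuate the
# brakes and shed the closing speed. A crude heuristic, not a real AEB design.

def must_brake_now(gap_m, closing_speed_mps,
                   actuation_delay_s=0.3, max_decel_mps2=8.0):
    """Return True if braking must start now to avoid closing the gap."""
    if closing_speed_mps <= 0:
        return False  # not closing; the other car will be out of the way
    time_to_collision = gap_m / closing_speed_mps
    time_to_stop = actuation_delay_s + closing_speed_mps / max_decel_mps2
    return time_to_collision <= time_to_stop

# Closing at 15 m/s with 40 m of space: ~2.7 s TTC vs ~2.2 s needed.
print(must_brake_now(40.0, 15.0))   # False: not yet imminent
# Same closing speed with only 25 m: ~1.7 s TTC vs ~2.2 s needed.
print(must_brake_now(25.0, 15.0))   # True: must brake immediately
```

This is also why the false-positive side matters: every near pass where the other car clears the gap in time has to fall on the `False` side, or the vehicle brakes constantly.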

lazide | 2 years ago

You can’t drive more than a few MPH unless you’re reacting based on the expected future, rather than the current one.

It’s why it’s so difficult to do (actually), and the ability to do it well is just as much about the risk appetite of whoever is responsible as anything else, because knowing whether a car is likely to pull out at the light into traffic, or how likely someone is to be hiding in a bush, is really hard. But that is what humans deal with all the time while driving.

Because no one can actually know the future, and predicting the future is fundamentally risky. And knowing when to hold ‘em, and when to fold ‘em is really more of an AGI type thing.

sokoloff | 2 years ago

In self-driving, you are making predictions about where the object is right now based on the synthesis of data from your sensors (and often filtering information from past estimates of the object position). These might be high-precision, high-accuracy predictions, but they're predictions nonetheless.

(It's been quite some years since I worked on vision-based self-driving, so my experience is non-zero but also quite dated.)
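The "filtering information from past estimates" idea above can be sketched with a tiny alpha-beta tracker, one of the simplest recursive filters. The gains and step size are illustrative assumptions, not values from any real perception stack:

```python
# Hedged sketch of estimating "where the object is right now" by fusing each
# noisy position measurement with a prediction from the previous estimate.
# A 1-D alpha-beta tracker; gains are illustrative, not tuned for any sensor.

def alpha_beta_track(measurements, dt=0.1, alpha=0.5, beta=0.2):
    """Return smoothed position estimates for a 1-D track of measurements."""
    pos, vel = measurements[0], 0.0
    estimates = []
    for z in measurements[1:]:
        # Predict forward one step at the current velocity estimate...
        pos_pred = pos + vel * dt
        # ...then correct prediction and velocity using the measurement residual.
        residual = z - pos_pred
        pos = pos_pred + alpha * residual
        vel = vel + (beta / dt) * residual
        estimates.append(pos)
    return estimates

# An object moving at a steady 1 unit per step: the filter starts with zero
# velocity, then converges so its estimates track the true motion.
smoothed = alpha_beta_track([float(i) for i in range(50)], dt=1.0)
```

Even this toy version shows the point: the output at each step is a prediction informed by history, not a raw sensor reading, so it can be precise and still be wrong when the object's behavior breaks the model, as with the towed truck.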