Mageek|2 years ago
The article says it: “We determined that due to the persistent orientation mismatch of the towed pickup truck and tow truck combination, the Waymo AV incorrectly predicted the future motion of the towed vehicle.”
It was detected, but it predicted the truck would move in a way that it didn’t end up moving.
PH95VuimJjqBqy|2 years ago
I understand we have to have explanations or we can't fix them, but it's just as important to understand this should never have happened even WITH the described failure.
If I had to guess, there's code to avoid stopping at every little thing, and that code took precedence (otherwise rides would not be enjoyable). I get the competing interests here, but when these incidents happen there must be a comparison against how a human driver would have handled the same situation.
duped|2 years ago
I would actually put money on this being the cause of most crashes involving multiple moving cars. Hell, a friend of mine got into an accident two weeks ago where they t-boned somebody who turned across the median when they didn't expect it.
dmd|2 years ago
This is literally the cause of almost every human accident.
Imagine you're driving. There's a car in front of you, also driving, at the same speed as you. Do you immediately slam on the brakes? No, because you EXPECT them to keep driving. That is how driving works.
If they suddenly do something unexpected, like slamming on the brakes, that might cause an accident. Because ... they moved in an unexpected way.
I honestly can't even figure out what you meant to say.
BHSPitMonkey|2 years ago
Well obviously even Waymo agrees, given that they're recalling vehicles to mitigate the issue.
bobsomers|2 years ago
Without a prediction that the blob of sensor data in front of you is going to continue moving forward at highway speeds, the planning software would want to slam on the brakes. That motion prediction is what lets the planner know the space in front of your vehicle will be unoccupied by the time you reach it.
A similar prediction error was the reason Cruise rear ended the bendy bus in SF a while back. It segmented the front and rear halves of the bus as two separate entities rather than a connected one, and mispredicted the motion of the rear half of the bus.
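A minimal 1-D sketch of that occupancy check (my own illustration, not any company's planner; all names and numbers are made up). The planner only avoids braking behind a moving lead vehicle because the prediction says the gap stays open over the horizon:

```python
def space_clear_when_ego_arrives(ego_speed, lead_pos, lead_speed,
                                 horizon_s=5.0, dt=0.1):
    """Roll a constant-velocity prediction of the lead object forward and
    check the gap ahead of the ego vehicle stays positive (1-D toy model)."""
    t = 0.0
    while t <= horizon_s:
        gap = (lead_pos + lead_speed * t) - (ego_speed * t)
        if gap <= 0:
            return False  # predicted occupancy conflict -> plan to brake
        t += dt
    return True

# Lead car 30 m ahead, matching highway speed: no conflict, no braking.
assert space_clear_when_ego_arrives(ego_speed=30.0, lead_pos=30.0, lead_speed=30.0)
# Prediction says the lead is stopped: conflict inside the horizon, so brake.
assert not space_clear_when_ego_arrives(ego_speed=30.0, lead_pos=30.0, lead_speed=0.0)
```

With a mispredicted motion for the lead object (the wrongly segmented bus half, or the misoriented towed truck), this check returns the wrong answer and the planner keeps driving into occupied space.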
avalys|2 years ago
The brakes don't respond immediately: you need to be able to detect that a collision is imminent several seconds before it actually occurs.
This means you have to also successfully exclude all the scenarios where you are very close to another car, but a collision is not imminent because the car will be out of the way by the time you get there.
Yes, at some point before impact the Waymo probably figured out that it was about to collide. But not soon enough to do anything about it.
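The arithmetic behind "not soon enough" is simple to sketch (hypothetical deceleration and latency figures, chosen only for illustration): compare time-to-collision against the time needed to actually stop.

```python
def seconds_needed_to_stop(speed_mps, decel_mps2=6.0, actuation_delay_s=0.3):
    """Time to brake from speed to zero at a constant deceleration,
    plus brake actuation latency (illustrative numbers)."""
    return actuation_delay_s + speed_mps / decel_mps2

def time_to_collision(gap_m, closing_speed_mps):
    """Seconds until impact at the current closing speed."""
    if closing_speed_mps <= 0:
        return float('inf')
    return gap_m / closing_speed_mps

# Closing at 15 m/s on an object 20 m ahead:
ttc = time_to_collision(20.0, 15.0)    # ~1.33 s until impact
needed = seconds_needed_to_stop(15.0)  # ~2.8 s to come to a full stop
# Detecting the conflict only now means braking reduces, but can't avoid, the impact.
```

So even a late, correct detection (as the Waymo likely made just before impact) only lets the vehicle scrub off some speed.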
lazide|2 years ago
It's why this is so difficult to actually do, and why doing it well is just as much about the risk appetite of whoever is responsible as anything else: knowing whether a car at the light is likely to pull out into traffic, or how likely someone is to be hiding in a bush, is really hard. But that is what humans deal with all the time while driving.
No one can actually know the future, and predicting it is fundamentally risky. Knowing when to hold 'em and when to fold 'em is really more of an AGI-type thing.
sokoloff|2 years ago
(It's been quite some years since I worked on vision-based self-driving, so my experience is non-zero but also quite dated.)