
Waymo recalls software after two self-driving cars hit the same truck

212 points | reteltech | 2 years ago | cnn.com

79 comments


Fricken|2 years ago

A pick-up truck being towed crooked and backwards. Both vehicles failed to read the situation in the same manner.

Autonomous vehicles have various redundant systems built-in that can take priority and override false positives.

I was previously under the assumption that one of the really important reasons for Lidar is that it can get you closer to an absolute truth about whether something is a solid object, and where that hypothetically solid object is relative to the position of the vehicle, regardless of what the classifier thinks it is seeing.

So did the lidar fail to read the solid object? Was it de-prioritized? Or was it simply not available as a fallback?

Presumably Radar and proximity sensors were also involved. What were they doing?

This is a fascinating edge case, and I hope to hear the real reason for the two incidents.

Mageek|2 years ago

The article says it: “We determined that due to the persistent orientation mismatch of the towed pickup truck and tow truck combination, the Waymo AV incorrectly predicted the future motion of the towed vehicle.” It was detected, but it predicted the truck would move in a way that it didn’t end up moving.
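The failure mode described in that quote can be sketched in a few lines. This is a toy illustration (not Waymo's actual predictor; the numbers and function names are made up): if a motion predictor extrapolates a vehicle along its own body heading, then a pickup whose body is angled off the lane while being dragged straight down it will be predicted to drift somewhere it never goes.

```python
import math

def predict_from_heading(x, y, heading_deg, speed, dt, steps):
    """Naive predictor: assume the vehicle travels along its own body heading."""
    heading = math.radians(heading_deg)
    return [(x + speed * t * dt * math.cos(heading),
             y + speed * t * dt * math.sin(heading))
            for t in range(1, steps + 1)]

# Hypothetical towed pickup: its body points 30 degrees off the lane,
# but the tow truck drags it straight along the lane (the x axis).
speed, dt, steps = 10.0, 0.5, 4                  # m/s, seconds, horizon
predicted = predict_from_heading(0, 0, 30, speed, dt, steps)
actual = [(speed * t * dt, 0.0) for t in range(1, steps + 1)]

# The heading-based prediction diverges badly within a couple of seconds.
err = math.dist(predicted[-1], actual[-1])
print(f"prediction error after {steps * dt:.0f}s: {err:.1f} m")
```

The persistent "orientation mismatch" in the quote is exactly this gap between where the body points and where the vehicle actually tracks.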

carbonatedmilk|2 years ago

One of the best things I've learnt recently is how to apply the zero-blame, process-improvement approach that (many) air safety regulators take to my own teams.

I'd sat through 'five whys' style postmortems before, but it was reading air safety investigation reports that finally got me to understand it and make it a useful part of how we get better at our jobs.

By comparison, the way we're investigating and responding to self-driving safety incidents still seems very primitive. Why is that?

cameldrv|2 years ago

My guess is that Waymo does do this internally.

One difference in the public perception/discussion, though: in the 1960s, say, air safety was poor compared to today, yet there was no question of eliminating air travel altogether over safety issues. Today there is definitely an anti-self-driving contingent that would like to hype up every accident to get the self-driving companies shut down entirely.

extua|2 years ago

Another parallel with air safety: there, the disaster-risk threshold is strict enough that vehicles with suspected faults or flaws get grounded.

In this case two self-driving cars crashed into another road vehicle because they failed to recognise (in time) which direction it was moving. Waymo should be commended for having voluntarily issued a software recall, but this problem is severe enough that the decision shouldn't really be up to Waymo's good judgement.

esafak|2 years ago

Where did you learn about that approach, and how is it different?

Klaus23|2 years ago

Sounds like they were relying solely on their neural network path prediction, which failed when the truck was dragged at an odd angle.

A simple lidar moving-object segmentation, which doesn't even know what it's looking at but can still spit out reasonable path predictions, would probably have saved them.
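For concreteness, here is a minimal sketch of that idea (all names and numbers are invented for illustration, not from any shipping stack): track a lidar cluster's centroid across two scans and extrapolate at constant velocity. It never classifies the object or reasons about its heading, so an oddly angled towed truck can't confuse it.

```python
def centroid(points):
    """Mean position of a 2D point cluster."""
    xs, ys = zip(*points)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def predict_path(scan_t0, scan_t1, dt, horizon_steps):
    """Class-agnostic predictor: difference the centroid of the same
    cluster across two scans and extrapolate at constant velocity."""
    c0, c1 = centroid(scan_t0), centroid(scan_t1)
    vx, vy = (c1[0] - c0[0]) / dt, (c1[1] - c0[1]) / dt
    return [(c1[0] + vx * k * dt, c1[1] + vy * k * dt)
            for k in range(1, horizon_steps + 1)]

# Toy cluster for a towed pickup: the points sit at an odd angle, but
# the cluster as a whole slides straight along x at 8 m/s.
cluster = [(0.0, 0.0), (1.0, 2.0), (2.0, 4.0)]
moved = [(x + 0.8, y) for x, y in cluster]      # same cluster 0.1 s later
path = predict_path(cluster, moved, dt=0.1, horizon_steps=3)
print(path)  # extrapolates straight along x, ignoring the odd orientation
```

The point of the sketch is the design choice: the prediction comes from observed displacement, not from the object's apparent orientation.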

I think Mobileye is doing something like this, but they release so little data, and what they do release is so full of marketing bullshit, that it's hard to know what exactly they are working on.

digitallis42|2 years ago

It's unlikely to be neural-network based. This sounds like a model prediction failure. You take a mathematical model of car motion: the rear wheels generally don't steer, and the front steered wheels can cause the car to drive along an arc. If you want to predict the arc that will be driven, you take the initial heading of the vehicle and project forward in time based on your understanding of the vehicle's steering angle. For most "driving in lane at velocity" cases, you would generally assume that the vehicle has very little steering angle input.

We're now getting to see where autonomy needs to develop a "spider sense": the scene in front of me feels wrong because some element isn't following expected behavior, maybe in ways that can't really be reasoned about, so we'll become much more conservative/defensive when dealing with it.

pinkmuffinere|2 years ago

I wish there were a picture of the strange towing configuration. I wonder if I would be confused as well, though my guess is that I'd read the situation correctly.

midasuni|2 years ago

I’ve been confused occasionally by road situations. Somehow I’ve never managed to drive into a multi-ton object.

adrians1|2 years ago

This is what people don't appreciate when quoting those statistics about how self-driving cars are safer than humans: when a human driver causes an accident, it's because that particular person did something wrong. When a self-driving car handles a situation wrongly, that's a big issue, because all the self-driving cars run the same software.

83|2 years ago

On the other hand when a human driver causes an accident one driver learns a lesson (maybe). When a self driving car causes an accident all cars get to learn from it.

skybrian|2 years ago

Yes, but when the bug was fixed, it was fixed everywhere.

im3w1l|2 years ago

This is my biggest fear with self-driving cars: correlated failures. As a society we are extremely good at dealing with independent accidents. We can calculate very precisely how many people will die in traffic in a given year and we can account for it; we can have insurance, and we can decide exactly how much we are willing to spend to save a life on the margin.

But if everything is fine, everything is fine, everything is fine, and then all hell breaks loose? We are not as good at dealing with that.

brk|2 years ago

My fear is similar, but more along the lines of adversarial attacks as various weaknesses are exposed. Imagine people taking advantage of zero-day exploits that cause self driving cars to veer off the road, stop suddenly, collide, etc. It is really not that far fetched. This technology is very far away from maturity, IMO.

nicbou|2 years ago

On the other hand, the cars were recalled, a team will study what happened, the problem will get fixed, simulations and tests will be created.

When I make a dumb mistake, the other drivers rarely learn from it.

leoh|2 years ago

This will have to be priced in to insurance.

add-sub-mul-div|2 years ago

Yeah, there's nothing akin to a software update that would cause the entire fleet of human drivers to start driving badly or unexpectedly all at once.

We also know how to hold individuals accountable for independent accidents. We know we won't get justice when people will inevitably start to get killed by standard corporate greed, incompetence, enshittification.

rkagerer|2 years ago

That poor tow truck driver, must have felt like Skynet was really out to get him.

andreareina|2 years ago

Do we have a picture of the truck? I'm having difficulty imagining it given that surely the tow truck would want the towed vehicle in-line to make driving go smoothly?

rainbowzootsuit|2 years ago

The towed vehicle has its rear (driven) wheels up on a hook-type tow truck, and it sounds like the steering wheel was locked while turned, which would lead to the angled tracking of the front wheels.

This would be common for a debt recovery, or when a city impounds a vehicle and takes it without the owner's cooperation.

https://external-content.duckduckgo.com/iu/?u=https%3A%2F%2F...

Terr_|2 years ago

[Recycled from an older submission] Well, I feel kinda vindicated by this news, after previously noting:

> People worry that ways and times [self-driving cars] are unsafe (separate from overall rates) will be unusual, less-predictable, or involve a novel risk-profile.

In this example, having a secretly cursed vehicle configuration is something that we don't normally think of as a risk-factor from human drivers.

_______

As an exaggerated thought experiment, imagine that autonomous driving achieves a miraculous reduction in overall accident/injury rate down to just 10% of when humans were in charge... However of the accidents that still happen, half are spooky events where every car on the road targets the same victim for no discernible reason.

From the perspective of short-term utilitarianism, an unqualified success; but it's easy to see why it would be a cause for concern that could block adoption.

camel_gopher|2 years ago

Human drivers just have secretly cursed driver configurations.