top | item 16645950

aecs99 | 8 years ago

I currently work full-time in the self-driving vehicle industry. I am part of a team that builds perception algorithms for autonomous navigation. I have been working exclusively with LiDAR systems for over 1.5 years.

Like a lot of folks here, my first question was: "How did the LiDAR not spot this?" I've been extremely interested in this and have kept going over images and videos from Uber to understand what the issue could be.

To reliably sense a moving object is a challenging task. To understand/perceive that object (i.e., shape, size, classification, position estimate, etc.) is even more challenging. Take a look at this video (set the playback speed to 0.25): https://youtu.be/WCkkhlxYNwE?t=191

Observe the pedestrian on the sidewalk to the left, and keep a close eye on the laptop screen (held by the passenger on the right) at the bottom right. Scrub back and forth +/- 3 seconds between these two points. You'll notice that the height of the pedestrian varies quite a bit.

This variation in pedestrian height and bounding box happens at different points in the same video. For example, at the 3:45 mark, the height estimate for the person on the right wearing a brown hoodie keeps varying. At the 2:04 mark, the bounding box estimate for the pedestrian on the right appears unreliable. At the 1:39 mark, the estimate for the blue (Chrysler?) car turning right jumps around quite a bit.
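One common way to damp this kind of frame-to-frame jitter is to smooth each tracked object's estimates over time. Below is a minimal, purely illustrative sketch using an exponential moving average; real perception stacks (and certainly whatever Uber ran) would use full tracking filters such as Kalman filters over position, size, and velocity, so the function name and numbers here are made up:

```python
# Illustrative sketch: damping frame-to-frame jitter in a tracked
# object's height estimate with an exponential moving average (EMA).
# Real systems use proper tracking filters; this only shows the idea.

def smooth_heights(raw_heights, alpha=0.2):
    """Return EMA-smoothed height estimates for one tracked object."""
    smoothed = []
    est = raw_heights[0]
    for h in raw_heights:
        # Blend the new measurement into the running estimate.
        est = alpha * h + (1 - alpha) * est
        smoothed.append(est)
    return smoothed

# Noisy per-frame height measurements (meters) for one pedestrian,
# jumping around the way the boxes in the video do:
raw = [1.7, 1.1, 1.8, 1.65, 0.9, 1.75, 1.7]
print(smooth_heights(raw))
```

The smoothed series swings far less than the raw one, at the cost of some lag in responding to genuine changes, which is exactly the trade-off a tracker has to tune.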

This makes me believe that their perception software isn't robust enough to handle the exact scenario in which the accident occurred in Tempe, AZ.

I think we'll know more technical details in the upcoming days/weeks. These are merely my observations.

noobermin|8 years ago

Alright, so given your observations, which I don't doubt, here's a question I have: why have a pilot on public roads?

If Uber's software wasn't robust, why "test in production" when production could kill people?

Slartie|8 years ago

> If uber's software wasn't robust, why "test in production" when production could kill people?

Because it's cheap. And Arizona lawmakers apparently aren't doing their job of protecting their citizens against a reckless company pulling the classic "privatize profits, socialize losses" move. The "profits" are the improvements to their so-called self-driving car technology; the "losses" are the random people endangered and killed while they alpha-test and debug that technology in this nice testbed we call a "city", which conveniently comes complete with irrationally acting humans you don't even have to pay to serve as actors in your life-threatening test scenarios.

rkangel|8 years ago

Disclaimer: I am playing Devil's Advocate and I don't necessarily subscribe to the following argument, but:

Surely it's a question of balancing against the long term benefit from widely adopted autonomous driving?

If self-driving cars in their current state are at least close to as safe as human drivers, then you could argue that a small short-term increase in the casualty rate, in exchange for a faster development rate, is a reasonable cost. The earlier proper autonomous driving is widely adopted, the better for overall safety.

More realistically, if we think that current autonomous driving prototypes are approximately as safe as the average human, then it's definitely worthwhile: the same casualty rate as current drivers (i.e. no cost), with the promise of a much reduced rate in the future.

Surely "zero accidents" isn't the threshold here (although it should be the goal)? Surely "improvement on current level of safety" is the threshold?

aecs99|8 years ago

I have the same questions as well. My best guess is that they probably have permission to drive at non-highway speeds late at night/early in the morning (which is when this accident occurred, at 10 PM).

My first reaction when I watched that video was that my Subaru with EyeSight+RADAR would have stopped/swerved. Even the news articles state something similar (from this article: https://www.forbes.com/sites/samabuelsamid/2018/03/21/uber-c...)

>The Volvo was travelling at 38 mph, a speed from which it should have been easily able to stop in no more than 60-70 feet. At least it should have been able to steer around Herzberg to the left without hitting her.
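The stopping-distance claim in that quote checks out with basic kinematics. A quick sanity check, assuming roughly 0.8 g of braking deceleration (typical for dry pavement) and ignoring reaction time, which an automated braking system should keep near zero:

```python
# Sanity-check the quoted claim: a car at 38 mph should stop in ~60-70 ft.
# Assumes ~0.8 g deceleration on dry pavement; ignores reaction time.

MPH_TO_FTPS = 5280 / 3600   # 1 mph = ~1.467 ft/s
G_FTPS2 = 32.2              # gravitational acceleration, ft/s^2

def braking_distance_ft(speed_mph, decel_g=0.8):
    v = speed_mph * MPH_TO_FTPS   # speed in ft/s
    a = decel_g * G_FTPS2         # deceleration in ft/s^2
    return v * v / (2 * a)        # d = v^2 / (2a)

print(round(braking_distance_ft(38), 1))  # ~60 ft, matching the article
```

So 60-70 feet is about right for an ideal full-braking stop from 38 mph; any detection that fires with less roadway than that left is already too late.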

As far as why test this, I'm guessing peer pressure(?). Waymo is way ahead in this race and Uber probably doesn't wanna feel left out, maybe?

Once again, all of this is speculation. Let's see what the NTSB says in the near future.

InclinedPlane|8 years ago

Because Uber wanted that.

Other self-driving car companies (like Google, or whatever they renamed it to) have put a lot more work into their systems and done a much greater degree of due diligence in proving their systems are safe enough to drive on public roads. Uber has not, which is why they've been kicked out of several cities where they were trying to run tests. But Tempe, and Arizona generally, are practically a lawless wasteland in this regard, willing to let Uber run amok on their roads in the hope that it'll help the city financially somehow.

wklauss|8 years ago

I'm assuming LiDAR is not the only sensor installed in self-driving cars. Isn't that the case? And in this scenario, the software didn't have much to process: the road was empty, and the pedestrian was walking, bike in hand, perpendicular to road traffic...

Even if the detection box changed in size, it should have detected something. Tall or short, wide or narrow, static or moving: at the very least it should have applied the brakes to avoid a collision.
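That "brake on anything in the path, classified or not" logic can be sketched as a simple corridor check over detected objects. This is purely illustrative; every name and threshold below is invented, and a real planner would reason over predicted trajectories, not static positions:

```python
# Illustrative sketch of classification-agnostic collision avoidance:
# if ANY detected object, whatever its size or label, sits inside the
# vehicle's forward corridor within braking distance, command a brake.
# All names and thresholds here are invented for illustration.

def should_brake(detections, corridor_half_width_m=1.5,
                 braking_distance_m=25.0):
    """detections: list of (forward_m, lateral_m) object centroids
    in the vehicle frame (x forward, y left)."""
    for forward, lateral in detections:
        in_corridor = abs(lateral) <= corridor_half_width_m
        in_range = 0.0 < forward <= braking_distance_m
        if in_corridor and in_range:
            return True
    return False

# A pedestrian 20 m ahead, roughly centered in the lane:
print(should_brake([(20.0, 0.4)]))   # True
# The same object well off to the side of the corridor:
print(should_brake([(20.0, 5.0)]))   # False
```

The point of the sketch is that nothing here depends on whether the object was labeled "pedestrian", "bicycle", or "unknown"; presence in the corridor alone is enough to trigger braking.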

IanWambai|8 years ago

I'm really surprised that we're even talking about the pedestrian's clothes or lighting or even the driver. Isn't the entire point of sensors like LiDAR to detect things human beings can't? The engineering is clearly off.

jiri|8 years ago

Is it possible for the car to do some kind of calibration to determine its current "sensor visibility", like a human would in fog? Is it common practice to use that information to reduce or otherwise alter the speed of the car?

aecs99|8 years ago

Great question. At least in our algorithms we do this: we adjust the driving speed based on the conditions (e.g., visibility or perception capabilities).

At the end of the day, you can only drive as fast as your perception allows. A good example of that is how much more slowly humans perceive when influenced by drugs/alcohol/medication versus when sober.

What is baffling is the fact that the car was driving at 38 mph in a 35 mph zone. That should not happen regardless of how good or poor your sensing/perception capabilities are.
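The idea of capping speed by perception range can be made concrete: never drive faster than a speed from which you can stop within the distance your sensors currently cover reliably. A hypothetical sketch (not any shipping system's logic; the deceleration value and ranges are assumptions):

```python
import math

# Hypothetical sketch: cap vehicle speed so the stopping distance fits
# within the range at which perception is currently reliable.
# Assumes constant braking deceleration (~6 m/s^2, moderate braking);
# a real system would also budget for detection and actuator latency.

def max_safe_speed_mps(perception_range_m, decel_mps2=6.0):
    """Highest speed (m/s) from which the car can stop within the
    given range, from d = v^2 / (2a)  =>  v = sqrt(2 * a * d)."""
    return math.sqrt(2 * decel_mps2 * perception_range_m)

# Degraded visibility (night, fog, sensor trouble) shrinks the
# reliable range, which should directly lower the speed cap:
for rng in (80, 40, 20):  # meters
    v = max_safe_speed_mps(rng)
    print(f"range {rng:3d} m -> cap {v:4.1f} m/s ({v * 2.237:4.1f} mph)")
```

Under this rule a shrinking perception range forces the cap down automatically, which is the software analogue of a human slowing down in fog.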

whiskyant|8 years ago

Maybe the question isn't why the LiDAR didn't spot it. I feel it's more likely that it did spot it, but the car couldn't make the correct decision.

aecs99|8 years ago

You summed up all my speculations in one sentence.