top | item 16652572

Video suggests huge problems with Uber’s driverless car program

211 points | yohui | 8 years ago | arstechnica.com

231 comments


phyller|8 years ago

"Uber recently placed an order for 24,000 Volvo cars that will be modified for driverless operation. Delivery is scheduled to start next year."

Wow. The other driverless car players should be all over this lobbying to shut Uber down. If Uber massively deploys a commercial service with subpar quality in order to "win", and then those cars start getting into accidents, the entire field is going to be delayed by 10 years. The general public is not going to just think "Uber is bad", they are going to think "self-driving cars are bad". Politicians will jump all over it and we'll see very restrictive laws that no one will have the guts to replace for a long time.

And honestly, if that happens, that's probably what we would need anyway. If the industry doesn't want to be handcuffed, it needs to figure out some really good standardized regulations on sharing data with law enforcement, how to determine fault for self-driving vehicles, and what the penalties should be, ones that are both fair and strict.

selestify|8 years ago

Could Uber be doing this on purpose? They're way behind on self-driving tech, so maybe if they can't have it, no one can.

pdkl95|8 years ago

> "We don't need redundant brakes & steering or a fancy new car; we need better software," wrote engineer Anthony Levandowski

Any engineer with this attitude needs to learn the lesson of the Therac-25. The issues in the Ars article are very similar to section 4 "Causal Factors" of the report[1].

> To get to that better software faster we should deploy the first 1000 cars asap.

Is that an admission that they do not have the "better software" and intend to deploy 1000 cars using "lesser software"? That's treading dangerously close to potential manslaughter charges if this willful contempt for safety were ever proven in court.

[1] http://sunnyday.mit.edu/papers/therac.pdf

phlo|8 years ago

> willful contempt for safety

To play devil's advocate for Kalanick: he might be arguing for more real-world data collection. Tesla famously equipped most of their cars with more sensors than were required at the time of delivery, using the data to drive development of the Autopilot function that was later added to the cars.

PinkMilkshake|8 years ago

Agile methodology applied to safety-critical systems engineering, god help us.

IshKebab|8 years ago

> "We don't need redundant brakes & steering or a fancy new car; we need better software," wrote engineer Anthony Levandowski

He is clearly right about that. Human-driven cars are safety critical and already do fine without redundant brakes and steering. How many crashes are due to brake or steering failure? I'm guessing it's well under 10%.

Most human crashes are due to bad driving, and for computers it will be the same. I mean, even this fatal crash probably could have been prevented with better software. It's not like the brakes failed. They just weren't applied.

> To get to that better software faster we should deploy the first 1000 cars asap.

This is where he is totally mad.

iClaudiusX|8 years ago

I get the feeling based on comments here that there is a severe lack of ethical and critical thinking among engineers and developers. I recognize that this is only a vocal minority but the constant mantra of "move fast and break things", where getting rich at any cost is seen as a virtue, has made me extremely disillusioned with this brand of startup culture. Doubly so when people are trading stock tips on how to profit from tragedy by supporting the worst actors in the field.

wdr1|8 years ago

Is it better for self driving cars to have a flawless record & low adoption, or to have a 100x improvement over human drivers & broad adoption?

Creating the perfect self-driving car, with redundant systems, safety everything & so on, will certainly help its safety record.

But it will also drive up the cost.

And put it out of reach for a lot of people.

If the goal is to save lives, the bar self-driving cars should be held to is what humans do driving today, not perfection.

JohnJamesRambo|8 years ago

"Uber announced that it had driven 2 million miles by December 2017 and is probably up to around 3 million miles today. If you do the math, that means that Uber's cars have killed people at roughly 25 times the rate of a typical human-driven car in the United States."

Wow there goes that "safer than human drivers" argument.
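The arithmetic behind that quote can be sanity-checked in a few lines. This is a rough sketch: the 37,461 deaths figure is the 2016 US total cited elsewhere in this thread, and ~3.2 trillion vehicle miles is the approximate 2016 US total; both are ballpark inputs, not exact.

```python
# Rough sanity check of the "roughly 25 times" claim.
us_deaths = 37_461            # US traffic deaths, 2016
us_miles = 3.2e12             # US vehicle miles traveled, 2016 (approx.)
human_rate = us_deaths / us_miles          # deaths per mile

uber_deaths = 1
uber_miles = 3e6              # Uber's autonomous miles (approx., per the article)
uber_rate = uber_deaths / uber_miles

print(f"human rate: {human_rate * 1e8:.2f} deaths per 100M miles")
print(f"uber rate is roughly {uber_rate / human_rate:.0f}x the human rate")
```

Depending on the mileage estimates used, this lands in the 25-30x range, consistent with the article's figure.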

mabbo|8 years ago

With one data point, you can't extrapolate much. This is misuse of statistics.

Consider if there was a new lottery and you weren't sure what the odds of winning were. You play it three weeks in a row and the third time you win a million dollars. Conveniently, no one else tries the new lottery yet.

Does it follow then that the odds of winning a million dollars are 1 in 3? Or should you play it a few more times before you declare to all that one in three plays will make one a millionaire?

imh|8 years ago

Driverless cars are also being tested in relatively nice driving conditions. People, on the other hand, drive in all sorts of conditions. X deaths per Y easy driving miles is going to translate to many more than X deaths per Y representative driving miles.

gtm1260|8 years ago

I think focusing on the safety statistics is somewhat of a red herring. Uber wins when you get hung up on how well its self-driving cars drive, because that's something they can improve. Instead, I think we should object because these dangerous machines are being operated by chronically irresponsible companies and because cars in general have issues, not because we expect them to be less safe than human drivers.

corny|8 years ago

Any comparison between self-driven miles and typical human-driven miles has to take into account all the times a safety driver took over driving to prevent an accident. Those self-driven miles have a huge asterisk.

mozumder|8 years ago

It's much, much worse than that, since that 37,461 deaths number includes all deaths, including motorcycle/truck/SUV deaths, which have higher death rates than passenger cars, perhaps 5x-10x higher.

A proper comparison in this case would be between passenger car death rates.

And then you need to factor in other conditions, such as the fact that the weather was clear, and that you should be comparing pedestrian/bicyclist deaths specifically; then you see that this single incident already throws the autonomous-vehicle death rate out of whack.

username223|8 years ago

Given exponentially-distributed distance between fatalities, this would have a 3% chance of happening if Uber cars were as safe as humans. So it's unlikely.
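That 3% can be reproduced under the stated model: treat fatalities as a Poisson process at the human-driver rate, so the probability of at least one fatality in d miles is 1 - exp(-lambda * d). The rate below is an approximation built from the 2016 US figures quoted elsewhere in the thread.

```python
import math

lam = 37_461 / 3.2e12   # deaths per mile at the human-driver rate (approx.)
d = 3e6                 # Uber's autonomous miles (approx.)

# P(at least one fatality in d miles) under a Poisson arrival model
p = 1 - math.exp(-lam * d)
print(f"{p:.1%}")       # roughly 3%, matching the parent comment
```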

ben174|8 years ago

Well maybe not currently safer than human drivers. I don't think any sound-minded person would claim that they will never be.

reedx8|8 years ago

You can't just look at Uber to draw a sweeping conclusion about all autonomous cars on the road. How many miles have Tesla, VW, Volvo, Waymo, Google, Ford, and Apple driven?

akkat|8 years ago

I don't know how to say it kindly, but there is a difference in what type of person the car killed. If the fatality was another rule-abiding driver on the road, or a pedestrian crossing at a crosswalk, that would be really bad. However, if it was someone not following the safety rules by jaywalking, then that person accepted a higher probability of being in an accident. When making laws, for the most part, they are for the benefit of law-abiding people.

vcanales|8 years ago

/s ?

It's a pretty unfair comparison, with 1 death on one side and over 30k on the other...

buildbot|8 years ago

I don't understand how there isn't a non-ML piece of code that looks at moving radar and lidar returns and performs an emergency brake, light flash, horn, or dodge if that vector would intersect ours with any confidence. Even slowing down to 20 MPH can turn a fatal accident into an injury.
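A minimal version of what this commenter describes is a time-to-collision (TTC) check over fused sensor tracks. The sketch below is entirely hypothetical (invented names and thresholds) and ignores sensor noise, which is exactly the false-positive problem other commenters raise.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Track:
    """A fused radar/lidar track in the vehicle frame (meters, m/s)."""
    x: float   # longitudinal offset, forward positive
    y: float   # lateral offset, left positive
    vx: float  # longitudinal velocity (negative = closing on us)
    vy: float  # lateral velocity

def time_to_collision(t: Track, lane_half_width: float = 1.5) -> Optional[float]:
    """Seconds until the track reaches us inside our lane corridor, or None."""
    if t.vx >= 0:                 # not closing
        return None
    ttc = -t.x / t.vx             # time for the longitudinal gap to close
    y_at_impact = t.y + t.vy * ttc
    return ttc if abs(y_at_impact) <= lane_half_width else None

def emergency_action(tracks: List[Track], brake_threshold_s: float = 2.0) -> str:
    """Hard-brake if any track is on a collision course within the threshold."""
    ttcs = [ttc for t in tracks if (ttc := time_to_collision(t)) is not None]
    return "BRAKE" if ttcs and min(ttcs) < brake_threshold_s else "CONTINUE"
```

For example, a pedestrian 20 m ahead and 3 m to the left, closing at 15 m/s while drifting into the lane, would trigger a brake under these made-up thresholds.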

KKKKkkkk1|8 years ago

What if it has nothing to do with ML? You see a point cloud that's moving toward your lane at a speed estimate of say 2 mph. If that's below the sensor noise threshold, you might classify the cloud as a stationary object on the other lane (say a stranded car). In that case, by the time you realize that this stationary object has somehow moved itself into your own lane, it is already too late.

oldgradstudent|8 years ago

Because it would result in too many false positives, which could be just as bad as not stopping.

For example, unnecessarily stopping in the middle of a highway is extremely dangerous, especially if visibility is limited or roads are slippery.

sitkack|8 years ago

> "We don't need redundant brakes & steering or a fancy new car; we need better software," wrote engineer Anthony Levandowski to Alphabet CEO Larry Page in January 2016.

Looks like Uber has attracted Levandowski due to his cultural fit.

comex|8 years ago

Hmm, but wouldn’t his priorities be correct in the context of this crash? There hasn’t been any suggestion (so far) that the crash occurred because some hardware component stopped working; rather, it seems like the software failed to identify the pedestrian in time. So better software seems precisely what was needed. Though I can imagine that better sensors might also have helped…

etimberg|8 years ago

Stuff like this is why P.Eng. licenses should be required in certain software fields.

diggernet|8 years ago

I'd upvote you to the top of the page if I could.

matte_black|8 years ago

Why don’t we require software engineers who work on self-driving car software to go through licensing and certification?

And then, if their code results in a death, they are liable and can have their license completely revoked, and they would be unable to work on self driving cars again.

superfrank|8 years ago

- Expecting engineers to always write perfect code is insane. Mistakes happen.

- If bad code makes it into production, that is a systemic failure, not an individual one. (Why didn't the bug get caught in code review, QA, etc.?)

- No one is going to want to work on a project where a single failure can taint their career.

- What if I use a 3rd-party lib and that is where the bug is? Who is at fault then? What if the code isn't buggy, but I'm using it in an unexpected way because of a miscommunication? If I am only allowed to use code that I, or someone certified, have written, development is going to move at a snail's pace.

- What if I consult with an engineer who doesn't have a certification on a design decision and the failure is there, who is at fault?

- What if the best engineer on the project makes a mistake and ends up banned? Does he/she leave the project and take all their tribal knowledge with them, or are they still allowed to consult? If they can consult, what stops them from developing by proxy by telling other engineers what to write?

Not to be a dick, but this is an awful idea that would basically kill the self driving car.

harshbutfair|8 years ago

After many years developing safety-critical software, I reckon culture and processes are more important than certification. There are various standards for developing safety systems in other industries (defence, aviation, etc.), and these standards exist for a reason. Has Uber applied any standard to their automation software? Or equivalent development processes? "Move fast and break things" is fine for an app, but not for controlling a vehicle.

lhorie|8 years ago

My guess is that it's because the field is so new that there aren't really any experts that can define what are reasonable rules for said licensing and certifications

PinguTS|8 years ago

We have this requirement, at least in Europe. There are even ISO standards to follow; the relevant one is ISO 26262. But it seems this does not apply to the permits issued to Uber for these cars.

mr_toad|8 years ago

Engineers aren’t in charge. Unlike lawyers, surgeons, even dental hygienists, they aren’t making the calls.

namelost|8 years ago

Obligatory: https://www.fastcompany.com/28121/they-write-right-stuff

If you want error-free software you need a blameless culture based around process, not individual ownership of code. It should not even be possible for an error to be one individual's mistake, because by the time it hits the road it should have gone through endless code review and testing cycles.

ubernostrum|8 years ago

Uber will relocate the engineering team to a jurisdiction without those regulations.

Just like it relocated its testing to get a more "business friendly regulatory environment".

KKKKkkkk1|8 years ago

There are existing laws on the books for this. Please google negligent homicide. Licensing and certification serve a different purpose.

SamReidHughes|8 years ago

Then we would never have self-driving cars.

aylons|8 years ago

"Move fast and break things" is exactly the opposite of what a responsible driver must do.

dmix|8 years ago

Not sure how that could be the philosophy of any self-driving car company?

That'd be extremely foolish. And regardless of the dumb things the previous Uber CEO has done in the past and the big deal people are making over a $150 license, they have still hired some of the best engineers in the world.

You basically have to find the brightest-of-the-brightest to build AI... and Uber pays very well and puts plenty of effort into recruiting that talent.

Not to mention the massive PR and monetary risks inherent in killing people with your products. That would make any company highly risk-averse.

telchar|8 years ago

I've been joking for at least a year that Uber's motto is "move fast and break people". I'm saddened that this has come to pass but not surprised.

sureaboutthis|8 years ago

I have two problems with this article.

1) They make it appear that Uber is a car manufacturer.

2) Even though Uber has not been determined to be at fault, the author seems to want to make it that way anyway.

TillE|8 years ago

Every engineer on this project at Uber knows very well that their car completely failed in one of its most basic expected functions. It's incredibly obvious, and a number of independent experts have said as much.

I'd be fairly surprised if there's any real appetite at Uber to continue with this now. It was never anywhere near their core competency.

CydeWeys|8 years ago

The article addresses your second point:

"Indeed, it's entirely possible to imagine a self-driving car system that always follows the letter of the law—and hence never does anything that would lead to legal finding of fault—but is nevertheless way more dangerous than the average human driver. Indeed, such a system might behave a lot like Uber's cars do today."

It doesn't matter if Uber makes cars that are technically not at fault, if they're mowing over pedestrians at a rate significantly higher than human drivers then they should never be allowed on public roads. People mess up occasionally. The solution is not an instant death sentence administered by algorithm.

vamin|8 years ago

The author is making a distinction between whether Uber was legally at fault (as stated in the article, likely not) versus whether the accident was avoidable. I agree with the author's position that the accident was likely avoidable.

hndamien|8 years ago

I think the standards are different in this case. The pedestrian definitely should not have been where they were, and if this had been an incident with a human driver, you would probably say the driver was not at fault. But I think this is slightly different.

They are on the road under these conditions because what they are doing is still somewhat experimental. There is a safety driver for a reason, and that driver did not respond. A human driver may have collided but would have responded and potentially avoided a fatality (if not a collision). The benefits of autonomous driving completely failed on all counts in this case, which implies that being on a public road is far too early for Uber, suggesting some fault lies with Uber or the regulators.

saas_co_de|8 years ago

The other missing part is that it is the human driver who is responsible. This is a test vehicle, and the driver's job is to be ready to take over at any time, as if they were driving.

It seems unlikely that the police will find any fault, because they probably don't want to have to file a criminal charge against the driver, but that is who it would go against if there were fault.

Tobba_|8 years ago

Yeah I'd be fairly concerned about them lying to or simply bribing the police too.

joejerryronnie|8 years ago

Why is everyone considering it a foregone conclusion that self-driving cars will quickly become much, much safer than human-driven cars? Yes, lots of people die every year in human-driven car accidents. But it is equally true that our most sophisticated AI/ML can only really operate within very narrowly defined parameters (at least compared to the huge sets of uncertain parameters humans deal with every day in the real world). Driving is perhaps one of the most unpredictable activities we engage in, anecdotally supported by my daily commute. What if our self-driving software never becomes good enough? How many more deaths are we willing to accept to find out?

speedplane|8 years ago

I was at SXSW a few weeks ago and went to an Uber driverless-car talk. They spent the first half of the talk discussing driver safety; it felt incredibly hollow.

If you really cared about safety, there are far more immediate and impactful solutions than spending billions on self-driving cars. If they had come out and said that they were doing it to make money or make driving easier, it would have carried more weight. But you just can't trust a word this company says.

RcouF1uZ4gsC|8 years ago

Are you surprised? Uber is a company that:

* Flouted Taxi regulations

* Lived in legal gray zones with regard to contractors vs. employees

* Designed a system to avoid law enforcement

* Performed shady tactics with its competitors

* Illegally obtained the private medical records of a rape victim

* Created a workplace where sexual harassment was routine

* Illegally tested self-driving cars on public roads in California without obtaining the required state licenses.

* Possibly stole a LIDAR design from a competitor

Now their vehicle has killed a pedestrian in a situation that self-driving vehicles should be much better at than humans (LIDAR can see in the dark, and the reaction time of a computer is much better than a human's).

Uber has exhausted their "benefit of the doubt" reserve. Maybe, they need to be made an example of with massive losses to investors and venture capitalists as an object lesson that ethics really do matter, and that bad ethics will eventually hurt your bank account.

dsfyu404ed|8 years ago

>"One of my big concerns about this incident is that people are going to conflate an on-the-spot binary assignment of fault with a broader evaluation of the performance of the automated driving system, the safety driver, and Uber's testing program generally,"

Self-driving cars are currently in that state where they're always in accidents but never technically at fault. When individuals have this behavior pattern, their insurance company drops them, because if they're so frequently present when shit hits the fan, they're a time bomb from a risk perspective.

Edit: meant to reply to parent, oh well.

Stanleyc23|8 years ago

if they are at fault they should be punished, but you do realize that the expectation for self driving vehicles is not to eliminate all car related deaths, right?

edit: wow this triggered some people. somehow 'if they are at fault they should be punished' got interpreted as 'they are not at fault and should not be punished'

throwaway010718|8 years ago

Machine learning and AI are data-hungry algorithms, and the concern is there isn't enough "emergency situation" data. Also, a detector cannot have both a 100% probability of detection and a 0% probability of false alarm. You have to sacrifice one for the other, and the tradeoff is usually influenced by weighted probabilities and priorities (e.g., a smooth ride).
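The tradeoff can be seen with a toy thresholded detector (all scores below are made up): raising the threshold cuts false alarms but also misses real obstacles, and vice versa.

```python
# Toy detector: each sample is (confidence score, is_real_obstacle).
samples = [(0.95, True), (0.80, True), (0.60, True), (0.40, True),
           (0.70, False), (0.50, False), (0.30, False), (0.10, False)]

def rates(threshold):
    """Return (detection rate, false-alarm rate) at a given score threshold."""
    tp = sum(1 for s, real in samples if s >= threshold and real)
    fp = sum(1 for s, real in samples if s >= threshold and not real)
    pos = sum(1 for _, real in samples if real)
    neg = len(samples) - pos
    return tp / pos, fp / neg

for th in (0.9, 0.5, 0.2):
    det, fa = rates(th)
    print(f"threshold={th:.1f}: detection={det:.0%}, false alarms={fa:.0%}")
```

Sweeping the threshold traces out the detector's ROC curve; no threshold gives 100% detection with 0% false alarms unless the score distributions are perfectly separated.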

d--b|8 years ago

Uber's culture is bad for anything really...

kristianov|8 years ago

I hope on-road testing could be more like human drug testing. After all, both affect human lives.

oldgradstudent|8 years ago

Drugs are tested under informed consent, not on unwilling third parties.

Maybe testing of autonomous vehicles should be done off public roads (at least at this stage of development).

ghfbjdhhv|8 years ago

This event has me thinking about the job of the behind-the-wheel backup driver. They get an easier job than a real driver, at the cost of potentially taking the fall if an accident occurs. I wonder if the pay is better.

TylerE|8 years ago

I actually don't think it's really easier. Continuous attention is easier to maintain than hours of boredom followed by having to react out of nowhere... maybe.

icc97|8 years ago

It's an equivalent job to a train driver

icc97|8 years ago

I thought the speed limit was 35mph, but the article claims 40mph.

bambax|8 years ago

"Testing" of driverless cars seems to be the wrong way around. Software should try to learn from human drivers: watch them instead of being watched by them.

The way it would work: the human drives while the software, at the same time, watches the driver and figures out the action it would take. Every time the driver's and the software's behavior differ, the event is logged and analyzed to figure out why there was a difference and who judged better.

But the way testing currently works, it seems millions of miles are wasted where nothing happens and nothing is learned.
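What bambax describes is usually called "shadow mode": the software plans in parallel while the human drives, and only the disagreements are logged for offline analysis. A minimal sketch, with invented names and tolerances:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Action:
    steer: float   # steering angle, radians
    brake: float   # brake command, 0..1

def diverges(human: Action, planned: Action,
             steer_tol: float = 0.05, brake_tol: float = 0.2) -> bool:
    """True when the software's plan differs materially from the human's action."""
    return (abs(human.steer - planned.steer) > steer_tol
            or abs(human.brake - planned.brake) > brake_tol)

def shadow_step(t: float, human: Action, planned: Action,
                log: List[Tuple[float, Action, Action]]) -> None:
    """Log only the disagreements; uneventful miles produce no data to review."""
    if diverges(human, planned):
        log.append((t, human, planned))
```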

Animats|8 years ago

No, what that gets you is smooth normal driving and poor handling of emergency situations. People have tried using supervised learning for that - vision and human actions for training, steering and speed out. Works fine, until it works really badly, because it has no model of what to do in trouble.

c06n|8 years ago

> Software should try to learn from human drivers

Yeah, that doesn't work though. Basically because you would need to have an excellent situation representation to really understand the drivers' reactions to outside events. But that does not exist.

Perception and situation representation are key to mastering the driving task, and they both differ greatly between humans and machines.

stouset|8 years ago

At some point you still need to switch roles. We’re at that point.

carlsborg|8 years ago

Agree. Uber should have fitted their taxis with data devices.

aaroninsf|8 years ago

Surprising no one.

"win-at-any-cost" and "second place is first looser" (sic) do not cohere with safety.