item 11788041

Andrew Ng calls Tesla irresponsible for shipping an imperfect autopilot

51 points | impish19 | 9 years ago | facebook.com

61 comments

[+] xpda|9 years ago|reply
There will be better autopilots, but never perfect ones. Aircraft have been using imperfect autopilots since their inception. Most planes have a prominent red button you can press to disable the autopilot whenever it misbehaves.

I'm not sure why this post routes through facebook, but here is a link to the article: http://www.cnet.com/roadshow/news/model-s-on-autopilot-crash...

[+] erobbins|9 years ago|reply
The difference is when a plane's autopilot goes wonky you typically have minutes to realize it and take over manually. In a car you often have less than a second. Pilots also are highly trained and rehearse emergencies. Drivers put on makeup.

Anything less than a truly 100% autonomous car autopilot is 1 massive lawsuit away from failure.

[+] acqq|9 years ago|reply
Yes, the link should be replaced with yours.

I believe an autopilot that you have to override at a critical moment is worse than driving assistance that doesn't step in until things become critical.

In the former case, if you turn on an autopilot that you know does more than just keep the speed constant, you will tend not to stay fully concentrated on the road. And you'll seldom be able to estimate when the autopilot will work and when it won't.

In the latter case, you must remain aware of the situation and you are "in the loop" for the common cases, but if you've misjudged something, the clever radars, lidars, and computers have a chance to react better than you can.

[+] alistairSH|9 years ago|reply
Part of the problem is that Tesla's "autopilot" isn't designed to be fully automatic. It's a driver's assistance package. As a comment in the Facebook thread mentioned, I too wonder if the problem is partial autopilot in general. People will tend to put their trust in the system if it appears to be automatic. We probably shouldn't be shipping partial autopilot at all. Fully automatic or bust.
[+] akira2501|9 years ago|reply
> Most planes have a prominent red button you can press to disable the autopilot whenever it misbehaves.

Not really, or at least some pedantry: the autopilot (heading and altitude) is disabled with a double-click on a small button on either the yoke or the side-stick (Boeing and Airbus, respectively). Autothrottle (entirely separate) is disabled with a double-click on a small button on the side of the throttle levers.

There is also a separate mechanism to disable it entirely, but it's usually a small button with an indistinct label (like "A/P" or "A/T") mixed in with a bunch of other buttons on the instrument cluster. Sometimes there is also a slightly larger "AP Disengage" button on some planes.

Most pilots I've seen will disable it using the first method so they can be immediately "hands-on" when the system relinquishes control.

[+] argonaut|9 years ago|reply
One glaring difference is that aircraft pilots have substantially higher training requirements than car drivers. And the pilots of the large aircraft that typically use autopilot are all trained professionals.
[+] bargl|9 years ago|reply
First, so we all know he has the chops to say something like this: Andrew Ng is a co-founder of Coursera and helped teach a few machine learning classes there. That's how I recognized his name; a little more digging shows he really does have a strong ML/AI background.

Second, he isn't saying the problem is an imperfect autopilot. He's saying that a 1-in-1,000 error rate is NOT acceptable. So the title is misleading, unless he edited his post.

Third, he is correct. Lay people who see this will immediately think that autopilot is WAY too hard and that software devs will never get it right. I'm not saying they're right, but I am saying it's terrible PR for AI to release a system this buggy and assume zero responsibility.
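To see why a 1-in-1,000 error rate looks unacceptable at fleet scale, here is a back-of-envelope sketch; the trip count is a hypothetical illustration, not a real Tesla figure:

```python
# Back-of-envelope: what a 1-in-1,000 failure rate means at fleet scale.
# Both numbers are hypothetical, for illustration only.
failure_rate = 1 / 1000      # chance a given autopilot trip misbehaves
trips_per_day = 100_000      # hypothetical fleet-wide autopilot trips per day

expected_failures_per_day = failure_rate * trips_per_day
print(expected_failures_per_day)  # 100.0 misbehaving trips per day
```

A rate that sounds tiny per trip still produces a steady stream of incidents once enough cars are running it every day.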

[+] suprgeek|9 years ago|reply
Tesla is innovating rapidly in the Autonomous driving space AND releasing those inventions to the customer, which is great.

Typically these things used to take 5-7 years to "trickle down" to the point where a reasonably priced car (not that $80k is reasonable) might have them available for end users.

However, Tesla absolutely NEEDS to do more to keep this technology-push model of theirs from tainting the whole autonomous driving trend. Expecting that a driver will read the f'ing manual about

"....Traffic-aware cruise control may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h).... Depending on Traffic-Aware Cruise Control to avoid a collision can result in serious injury or death."

is insane.

My suggestion: enforce a must-watch video tutorial with questions (as is common for, say, a driving test) before enabling the partial AutoDrive. It really is a matter of life or death - you have a very short time to react on the road. With their distribution model, Tesla definitely has the resources to pull this off, and it might serve as a model for the rest of the industry.

[+] vonklaus|9 years ago|reply
My guess is, without any insider information, that they were (or at least would be) willing to deal with the negative PR for the amount of data this program is likely to get.

While Google (and many others) have been working on self-driving cars and testing systems, Tesla is a car company with many active vehicles. They benefit from a huge fleet (compared to other autonomous-car companies, as opposed to car manufacturers) on which to test. So releasing this product, with potentially 50-100K people using it daily and millions of miles logged, could be worth it on net.

They probably assumed (and likely calculated) that the failure modes would be non-lethal - minimal dents and very small risk to human life. So they let the tech out early and hoped that a lot of this data would help improve the vehicles.

It isn't PC to say, but this is how humans have learned most things: getting something to work 90% of the way, then getting the last 10 percent by learning what not to do, which often comes at a cost as high as human life. For example, the Challenger mission was a tragedy, and we, and very unfortunately some brave men and women, paid a price to learn how to protect future astronauts.

[+] vhold|9 years ago|reply
I kind of wonder if Tesla figures "Hey, once a few crashes happen people will figure it out."
[+] robbrown451|9 years ago|reply
I agree it is irresponsible. I think Google is correct in saying that self-driving cars probably shouldn't even have steering wheels... it is absolutely unrealistic to think that people will hang out for an hour while the car does things correctly, and then be alert the moment it doesn't.

Humans don't work like that.

[+] raverbashing|9 years ago|reply
From the article:

"However, as YouTube commenter Shaimach points out, the Model S manual calls out this exact situation as something drivers need to be aware of:

Warning: Traffic-aware cruise control may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead. Always pay attention to the road ahead and stay prepared to take immediate corrective action. Depending on Traffic-Aware Cruise Control to avoid a collision can result in serious injury or death."

[+] sksixk|9 years ago|reply
I can't get my head around a car and a human driver sharing the driving duties at the same time. If you can't trust the car to do everything correctly, then what am I supposed to do as a driver?

Sit there tensely with my right foot and hands hovering over the brake pedal and steering wheel?

[+] mikeash|9 years ago|reply
You watch.

By not having to devote attention to keeping your car within a foot of the lane's center, you have more attention to devote to the bigger picture of what's going on.

For example, have you ever tried to change lanes, done a quick shoulder check to see if the spot is clear, and then looked back ahead only to find that the car in front of you started braking heavily and you're approaching it rapidly? With Autopilot this is not a concern, because you can let the car monitor the car ahead while you look to the side.

For the first couple of hours with Autopilot, drivers typically sit there tensely, ready to take over. But with experience, you learn how the system works, where it behaves well and where it fails, and you learn to relax, pay attention to the big picture, and let your strengths and the car's strengths complement each other.

[+] chadgeidel|9 years ago|reply
How is this different than the automatic transmission? Antilock brakes? Traction control? Lane departure/encroachment warnings? Adaptive cruise control?

Or even older tech like "drive by wire" (your foot isn't directly connected to the throttle bodies) or manually adjusting the choke in carbureted vehicles?

These are innovations that have happened in my lifetime. I'm sure there are even older "car and driver sharing duties" examples others can come up with.

[+] henrikschroder|9 years ago|reply
It lessens the cognitive load in easy driving situations, such as sitting in a queue on the highway. A hand on the wheel, your foot wherever you want, eyes on the road is enough. Just knowing that the car will take care of the small things is a massive offload.

Also, you learn what your car is good at and bad at, and plan accordingly. I know my car is bad at handling drivers changing into my lane too close to me, so I take over in those situations.

[+] archildress|9 years ago|reply
This is just the cost of progress. If every imperfect piece of technology didn't ship, we'd have no technology at all.
[+] dman|9 years ago|reply
One mishap that gets a lot of public scrutiny could set the field back by decades. When it comes to products with public safety implications, it's better to emulate the airline industry and its approach to safety rather than the "move fast and break things" mantra. For an example of how a few catastrophic events can destroy an entire industry/approach forever, we just have to look at the nuclear industry (Chernobyl/Three Mile Island) and blimps (the Hindenburg).
[+] riffraff|9 years ago|reply
not all technologies are created equal.

If my macbook's battery dies when it rains, it's not the same as if my car crashes into another one injuring multiple people.

[+] fucking_tragedy|9 years ago|reply
If your wife or child was injured because of this, would that still be just the cost of progress?

Technology can be refined and rigorously tested before it's pushed to consumers and that is still progress.

[+] nas|9 years ago|reply
That's not much of an argument. There has to be a standard where you consider the new and imperfect technology "good enough". Is Tesla's autopilot good enough? I don't know. My perception is that since Tesla is an underdog of the auto industry, they are more willing to take a risk and put the technology out there. If it turns out to be too unsafe, their company will take a huge hit. If not, they gain market share.
[+] drivingmenuts|9 years ago|reply
Well, seeing how Tesla just handed every auto insurance company in the world a free get-out-of-paying card for accidents dealing with that car, I'd say Tesla screwed up. Seeing as how Tesla has potentially exposed everyone involved in the supply chain to lawsuits, then yes, they screwed up.

If I were a well-backed lawyer, right about now, I'd be girding up to file requests for colonoscopies on the people who assembled the circuit boards in those cars.

Seriously, right about now, some lawyer is figuring out how to suck money out of Tesla until it's a dry, twitching husk in the noonday sun. Tesla handed that guy a gift.

[+] kbenson|9 years ago|reply
Crash? That was an accident, and it was definitely the auto-pilot's fault, but to me that looks like a fender-bender. Let's resist the media's attempt to use more extreme wording to sell stories.

Yes, I know the driver prevented it from being worse, but that's one of the conditions Tesla's auto-pilot is supposed to be working under: a driver paying attention. There are problems with this model, but that doesn't mean we should ignore what actually happened in favor of what could have happened when writing a title.

[+] 27182818284|9 years ago|reply
There has actually been a movement since at least 2000 to stop calling accidents "accidents", because the word was found to make it seem like nothing could have been done about them. I can recall driving instructors dropping the word "accident" back then, at least. It might even go back further.

Collision is better, but has a lot of letters so you condense it to crash.

[+] shogun21|9 years ago|reply
The autopilot is still in beta and warns drivers to still pay attention to the road.

As long as drivers have control, they are responsible for the vehicle. Only after we remove steering wheels, the accelerator, and brakes; and have a fully autonomous system should the car manufacturers (or software manufacturers) be blamed.

[+] mdorazio|9 years ago|reply
While that might be true purely from a legal standpoint, it just doesn't make much sense logically. Let's compare it to something like a gas stove. If you turn the stove on with a bunch of combustible stuff around it and then walk away for ten minutes and your house burns down as a result, that's pretty obviously your own fault. However, if you turn the stove on with just a normal pot of water on top, walk away for ten minutes, and it self destructs and burns your house down, what would you think? Is it your fault for not standing there with a giant fire extinguisher, or is it the stove company's fault for making a poorly QA'd product? I think Tesla's Autopilot feature is a lot more like the latter example than the former.
[+] qbrass|9 years ago|reply
They should be blamed simply for putting beta software in cars on public roads.
[+] sunstone|9 years ago|reply
This 'imperfection' probably needs to be weighed against the fact that Tesla drivers apparently drive 2.5 million miles a DAY on autopilot. Cars have other imperfections that lead to accidents as well, like obstructed vision and blind spots, for example. The real question is: is the autopilot, imperfections included, statistically a better driver than a human driver, who also has imperfections? One or two fender benders does not make the case.
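One way to make "statistically a better driver" concrete is to treat crashes as a Poisson process and ask how surprising the observed autopilot crash count would be if autopilot merely matched a human baseline. A minimal sketch; every figure here is hypothetical except the 2.5 million miles/day from the comment above, and the human baseline rate is an assumed number, not a real statistic:

```python
import math

# 2.5M autopilot miles/day is from the comment above; the baseline of
# 1 crash per million miles and the 2 observed crashes are hypothetical.
autopilot_crashes = 2
autopilot_miles = 2_500_000
human_rate_per_million = 1.0

# Expected crashes if autopilot merely matched the human baseline.
expected = human_rate_per_million * autopilot_miles / 1_000_000  # 2.5

def poisson_sf(k, lam):
    """P(X >= k) for X ~ Poisson(lam)."""
    return 1 - sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(k))

# Probability of seeing at least this many crashes by chance alone.
p = poisson_sf(autopilot_crashes, expected)
print(round(p, 3))  # 0.713
```

With these made-up numbers, seeing two crashes would be entirely unremarkable even if autopilot were exactly as safe as a human, which is the commenter's point: one or two fender benders carry almost no statistical signal either way.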
[+] moron4hire|9 years ago|reply
Because the regular pilots are so good at the job?
[+] coralreef|9 years ago|reply
Tesla isn't responsible for how a driver drives their car.

But they are responsible for the software they ship.

[+] pj_mukh|9 years ago|reply
I don't understand; the ML systems he designs are not deterministic by design! He's been letting that stuff fly (literally) for over a decade now [1]!

P.S: I realize a "shipping" scenario is different.

[1]https://www.youtube.com/watch?v=0JL04JJjocc

[+] freddealmeida|9 years ago|reply
I'm just wondering if Tesla has responded to this? Do we know it was the Autopilot? Do we have evidence? I know Tesla captures a great deal of information from the car.
[+] 27182818284|9 years ago|reply
Cruise control can also lull you into not paying attention. "Autopilot" is simply a marketing term for advanced cruise control, which is what this really is right now.
[+] rasz_pl|9 years ago|reply
Tesla is NOT shipping an autopilot in the first place. They are selling a LANE-FOLLOWING driver-assistance system and CALLING it an autonomous AI "Autopilot".

This is the main problem.

[+] WalterSear|9 years ago|reply
An imperfect autopilot, running on extra cpu cycles from the automatic door lock processor.
[+] lacker|9 years ago|reply
If an imperfect autopilot is better than a human pilot, we should use it!
[+] hayd|9 years ago|reply
The collision itself looks surprisingly minor.