timothya | 10 years ago

> For some time, Google has been convinced that the semiautonomous systems that others champion (which include various features like collision prevention, self-parking, and lane control on highways) are actually more dangerous than the so-called Level Four degree of control, where the car needs no human intervention. The company is convinced that with cars that almost but don’t drive themselves, humans will be lulled into devoting attention elsewhere and unable to take quick control in an emergency.

I think this is a really good perspective. Considering how often drivers already use smartphones behind the wheel of non-self-driving cars, that sort of behavior would only be magnified by partial autonomy - which is very dangerous! Humans get distracted or bored easily, especially when completing routine tasks. I'm glad that Google is choosing to build a car that never needs human intervention rather than rushing to market with a partial solution.

Here's a video where you can see what distracted teen drivers look like. Terrifying. http://youtu.be/SDWmwxQ_NnY

ghaff | 10 years ago

On the one hand, they're of course correct. As with automation generally (whether cars, airplanes, or software deployment), once you get to a certain level of automation, you pretty much have to be all in because humans can't act quickly enough or with enough throughput.

On the other hand, it's easy to see why auto manufacturers and others are uninterested in an all-or-nothing goal that is likely decades away: they want incremental features they can sell in the interim.

Of course, their challenge is figuring out which incremental approaches work, given that humans stop paying attention once you reach a certain level of automation. Perhaps you enable full automation only under scenarios where it works reliably--say, freeways in certain weather conditions--and where it's legally allowed. (Though I suspect the first step is that people will use "autopilots" and go ahead and play with their phones--even though they're not supposed to--given that many already do that today.)

rconti | 10 years ago

I concur. Here's a comment I made on another thread about self-driving cars, and my experience with even rudimentary assistance features on my 2016 VW (coming from a car with no such features whatsoever):

https://news.ycombinator.com/item?id=10735434

ZeroGravitas | 10 years ago

I have some sympathy with this view; however, I don't see how self-parking and collision avoidance fit in. These seem like ideal places for computer drivers to show their value without affecting most of the driving experience.

oska | 10 years ago

I've seen discussion of this issue in the domain of commercial airline pilots. The suggestion was made that because so much of flying is now done on autopilot, pilots' ability to react quickly and appropriately in a real emergency, when control is handed back to them, has significantly declined. And that we may soon move to completely pilotless airliners that are taken over by ground control in an emergency. (This would also have the side benefit of significantly reducing the risk of hijacking.)

ghaff | 10 years ago

I doubt it. Pilots are a pretty trivial cost in airline operations, and there are a lot of reasons to have a human who is definitively in charge aboard the aircraft.

cbhl | 10 years ago

How does this reduce the risk of hijacking? An attacker would just hijack ground control instead.