Was discussing self-driving cars with a few colleagues on a drive the other day, and a question came up:
Does anyone have a sense of how the cars handle situations where traffic is being directed by a human, with hand signals, etc.? I.e. a cop standing in the middle of the road, waving cars through an intersection where there's been an accident, etc.?
We pondered a few scenarios -- are they reading the hand signals? Are they judging other cars' movement? What if it's not a cop but a crazy random person who jumps into traffic and decides to play cop? We humans are remarkably good at determining "soft" things like "that guy looks crazy - let's get out of here" vs "that's a cop, I better pay attention", etc. The rabbit hole gets quite deep here...
> What if it's not a cop but a crazy random person who jumps into traffic and decides to play cop?
Most jurisdictions tell you to obey any random person directing traffic.
That's important, because there are good reasons why you'd want random passersby to be able to direct traffic and be obeyed (e.g. a big accident that just happened, but is hard to see). And obeying traffic directions, even if you very much want to, only really works when you can assume that the other people on the road are following them as well.
Of course, if after the fact it turns out that the random guy didn't have a good reason for directing traffic, the law can get him. But you'd still have to obey him in the first place.
(A bit like a commander giving (bad) orders in the military.)
I actually ran into a simple example of a challenge like this. The power went out in Mountain View a few weeks ago and a bunch of stop lights stopped working. I pulled up behind a self-driving car that had stopped at the disabled light. Instead of treating the light like a stop sign, the car remained stopped. After five or so seconds stopped at the light (there were no other cars at the intersection), I honked. I assume someone took manual control, because it started to move right after.
While this particular situation could probably be addressed, it is a great example of the challenges these cars may come across.
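For what it's worth, the rule itself is simple (in most US states a dark signal is treated as an all-way stop); the hard part is perceiving the state reliably. A toy sketch of the decision logic, with all names and thresholds made up rather than anything from Google's actual stack:

    # Toy sketch: how a planner might treat a dark (powered-off) traffic signal.
    # In most US states a dark signal legally becomes an all-way stop.
    # Everything here is hypothetical, not Google's actual stack.

    from enum import Enum, auto

    class SignalState(Enum):
        GREEN = auto()
        YELLOW = auto()
        RED = auto()
        DARK = auto()      # no lit lamp detected
        UNKNOWN = auto()   # occluded, low confidence, etc.

    def intersection_action(signal: SignalState, is_clear: bool, stopped_for_s: float) -> str:
        """Decide what to do at a signalized intersection."""
        if signal == SignalState.GREEN:
            return "proceed"
        if signal in (SignalState.RED, SignalState.YELLOW):
            return "stop"
        if signal == SignalState.DARK:
            # Treat it like a stop sign: stop, wait briefly, yield, go when clear.
            if stopped_for_s >= 2.0 and is_clear:
                return "proceed_with_caution"
            return "stop"
        # UNKNOWN: stay conservative; after a long wait, ask for remote/manual help.
        return "stop_and_request_assist" if stopped_for_s > 10.0 else "stop"

    # Dark light, empty intersection, stopped for 5 seconds -> proceed_with_caution
    print(intersection_action(SignalState.DARK, is_clear=True, stopped_for_s=5.0))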
Maybe we're good at differentiating humans with authority vs. humans without. But there's a much simpler social-engineering tactic for rerouting traffic that both humans and autonomous cars will fall for. I'll summarize it like this: road pylons are $8 at Home Depot.
The car could have a sensor platform that a remote driver could hook into and control over the air. In the 1% of cases that the car can't handle and gives up on, a human can pop in and guide it out of the situation before returning control to the car.
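Roughly, a confidence-triggered handoff loop; something like the sketch below, where every name and threshold is invented for illustration:

    # Rough sketch of a handoff between the onboard planner and a remote human
    # operator. Names, thresholds, and the "confidence" signal are all made up.

    import random
    import time

    CONFIDENCE_FLOOR = 0.4  # below this, the car asks for help

    def planner_step() -> float:
        """Stand-in planner: returns its confidence in the current plan (0..1)."""
        return random.random()

    def remote_operator_drive() -> None:
        """Placeholder for a human tele-operating via the car's sensor feed."""
        print("remote operator guiding the car past the obstruction...")
        time.sleep(0.1)

    def drive_loop(steps: int = 10) -> None:
        for _ in range(steps):
            confidence = planner_step()
            if confidence < CONFIDENCE_FLOOR:
                # Car gives up: hold position, stream sensors, hand off control.
                print(f"confidence {confidence:.2f} too low; requesting remote assist")
                remote_operator_drive()
                print("control returned to the autonomous planner")
            else:
                print(f"autonomous step, confidence {confidence:.2f}")

    drive_loop()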
Is it just me, or does this incident seem like it was made worse by the autonomous mode?
A Google Lexus model AV was traveling northbound on El Camino Real in autonomous mode when another vehicle traveling westbound on View Street failed to come to a stop at the stop sign at the intersection of El Camino and View Street. The other vehicle rolled through the stop sign and struck the right rear quarter panel and right rear wheel of the Google AV. Prior to the collision, the Google AV’s autonomous technology began applying the brakes in response to its detection of the other vehicle’s speed and trajectory. Just before the collision, the driver of the Google AV disengaged autonomous mode and took manual control of the vehicle in response to the application of the brakes by the Google AV’s autonomous technology.
It looks like the Google Vehicle (GV) was traveling on El Camino (I'm assuming it had no stop sign); the computer saw the other vehicle approaching (what should have been a stop) and hit the brakes. The driver took over (right before the collision) to stop the brakes from being applied, and the other car hit the rear of the GV. Without the computer controlling the speed, I wonder if the GV would have already cleared the intersection?
Then again, depending on when each event happened, maybe if the driver hadn't taken over, the GV wouldn't have been hit? Hard to tell.
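Back-of-envelope, with completely made-up numbers: if the GV was doing about 35 mph and braked for the last couple of seconds, it covers noticeably less ground than it would at constant speed, so it's plausible it was still in the intersection when it got clipped:

    # Back-of-envelope only; the speed, braking rate, and timing are guesses,
    # not anything from the accident report.

    def distance_constant(v0_mps: float, t_s: float) -> float:
        return v0_mps * t_s

    def distance_braking(v0_mps: float, t_s: float, decel_mps2: float) -> float:
        # Distance under constant deceleration, without going below zero speed.
        t = min(t_s, v0_mps / decel_mps2)
        return v0_mps * t - 0.5 * decel_mps2 * t * t

    v0 = 35 * 0.44704  # ~35 mph in m/s (hypothetical)
    t = 2.0            # last two seconds before impact (hypothetical)
    a = 4.0            # moderate braking in m/s^2 (hypothetical)

    print(f"constant speed: {distance_constant(v0, t):.1f} m")   # ~31 m
    print(f"braking:        {distance_braking(v0, t, a):.1f} m") # ~23 m
    # Braking leaves the car roughly 8 m short of where it would otherwise be.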
One thing I've noticed seeing the Lexus cars drive around Mountain View is that they're incredibly cautious.
I wonder what will happen once they become more commonplace, and human drivers realize they can be incredibly aggressive around the self-driving cars - cut them off, etc. - and the self-driving cars will happily accommodate the human driver, except with better reaction time and precision than a human driver could ever have.
It'll probably make a lot more sense once a majority of cars can act as a swarm rather than having to tolerate irrational and unpredictable human drivers. I can imagine there will eventually be some sort of standard for near-field communication between autonomous drivers that can give hints about behavior rather than needing to infer it. It doesn't even need to be trusted or authenticated; a driver can just make more detailed signals than the human-readable yellow and red lights we currently use. Even just being able to say that you'll be turning in 1 km would make things significantly more efficient to navigate around. It extends even further if vehicles like ambulances and fire trucks get "sudo" powers: autonomous drivers can get themselves into a formation to let the vehicle through before it is even visible over the horizon.
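Even an unauthenticated "here's what I intend to do" beacon would go a long way. Purely as an illustration (none of this is a real standard), a hint message might carry something like:

    # Made-up example of an unauthenticated vehicle-to-vehicle "intent hint".
    # The fields and format are invented; no real V2V standard is implied.

    import json
    import time
    from dataclasses import dataclass, asdict

    @dataclass
    class IntentHint:
        vehicle_id: str           # ephemeral ID, not tied to identity
        timestamp: float
        speed_mps: float
        heading_deg: float
        intent: str               # e.g. "turn_left", "lane_change_right", "stopping"
        distance_to_action_m: float
        priority: str = "normal"  # an ambulance could advertise "emergency"

        def to_wire(self) -> bytes:
            return json.dumps(asdict(self)).encode()

    # A car announcing it will turn left in 1 km:
    hint = IntentHint(
        vehicle_id="veh-3f2a",
        timestamp=time.time(),
        speed_mps=22.0,
        heading_deg=270.0,
        intent="turn_left",
        distance_to_action_m=1000.0,
    )
    print(hint.to_wire())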
>human drivers realize they can be incredibly aggressive around the self-driving cars - cut them off, etc - and the self driving cars will happily accommodate the human driver
Well, as my driving school instructor used to say: "Do yield to the moron."
My assumption is that this is intended to make a good first impression with the public - even if making the cars more risk-taking would result in a lower accident rate, an accident where the driverless car is technically "at-fault" can be very damaging to the overall project. Googlers working on the driverless car have explicitly talked about cautiousness as a dial that they've turned all the way up - not as an inherent requirement of the technology, but as a conscious decision.
Yeah, case in point: I was jogging on Cowper St (a wide, two-lane residential street) Tuesday morning and noticed a self-driving Lexus had come to a stop behind a Palo Alto garbage truck that was making its usual stop-go-stop way down the street. After several long seconds, another car came up behind and stopped. More seconds. The second car pulled out and passed both the Google car and the garbage truck. I jogged on and never saw whether the Google car turned off or continued to tailgate the truck -- but it didn't pass me.
Probably the cars will hit them. If you look at the disclosed accident list, that seems to be not entirely uncommon (even if it isn't the robot car's "fault").
That said, an algorithm that learns how to prevent people from cutting it off would be interesting indeed.
Not just other cars, but pedestrians too. It's common around here to see people trying to jam themselves into closing subway doors, often getting limbs or bags stuck in them. It's also not uncommon to see others try to help by attempting to hold or pry the doors open. One can imagine how some people will act when they start to believe that cars will stop whenever they run out in front of them.
I really wish all cars would drive 25 mph on residential streets. The big limiter on car travel time is timing of traffic signals and overall road capacity, not top speed.
It’s very common around me for cars to be driving on semi-arterial residential streets (with an official speed limit of 25) at 55+ mph at 2 AM, which is both noisy and potentially unsafe: there’s low visibility at night, and at that speed it would be hard to maneuver out of the way if something moved into the road.
If we had smaller, cheaper, lighter-weight cars that maxed out at 25 mph on residential streets, plus maybe some of those 15–20 mph electric bikes which are everywhere in China, plus fast, frequent, and reliable mass transit, there’d be little negative impact on routine chores and commuting, but huge improvements in pedestrian and cyclist safety.
I don't mean to troll, but I really hope that that is not the final look of the car. It looks like one of those red-and-yellow toy cars [0]. Such an amazing engineering achievement deserves a more serious, futuristic look.
This was something that Elon Musk really got right with Tesla: he made it a point for his cars to have sex appeal. His model names even spell S3X.
On the other hand I think this little car is probably going to be geared to the elderly or the disabled who would not be able to drive a regular car, so in that case sex appeal might not matter that much.
What are the speed limits of the roads the car will be driving on?
[0] https://s-media-cache-ak0.pinimg.com/736x/2c/7b/4b/2c7b4b50b...
Of which only a fraction was in self-driving mode.
They spent 1 million miles learning the model.
(The total miles driven in both manual and self-driving mode is closer to 2 million miles I believe.)