"Truck reverses into stationary object" is not really news.
I think there are too many possible risks in programming the vehicle to do anything except stop dead in an emergency situation: you rapidly get into very complex programming with all kinds of failure modes. For example, what if the problem was a faulty sensor? The vehicle might try to avoid a non-existent threat by crashing into something else. Stopping is the safe thing to do. If you want the vehicle to react, you rapidly have to make lots of moral decisions (the trolley problem).
Briefly sounding a horn in an unexpected emergency stop is probably a good idea, particularly if it's a white-noise type of sound rather than a siren, so that other road users can localise its source quickly.
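For what it's worth, that stop-and-honk policy is also easy to state precisely. Here's a minimal sketch in Python, with hypothetical sensor and actuator interfaces (nothing here is from a real vehicle stack):

```python
# Hypothetical emergency policy: stop dead, sound the horn, do nothing clever.
# All names and thresholds are illustrative, not any vendor's real API.
from dataclasses import dataclass

@dataclass
class Detection:
    distance_m: float        # estimated distance to the tracked object
    closing_speed_ms: float  # positive = object approaching us
    confidence: float        # 0..1, how much sensor fusion trusts this track

def emergency_policy(detections: list[Detection]) -> dict:
    """Brake and honk if any track looks like an imminent collision.

    Deliberately dumb: we never steer or accelerate, because a faulty
    sensor could make us "avoid" a threat that isn't there by driving
    into one that is.
    """
    threat = any(
        d.closing_speed_ms > 0
        and d.distance_m / max(d.closing_speed_ms, 0.1) < 3.0  # TTC under 3 s
        for d in detections
        if d.confidence > 0.3  # low bar on purpose: braking on a false positive is cheap
    )
    if threat:
        return {"brake": 1.0, "throttle": 0.0, "steer": 0.0, "horn": True}
    return {"brake": 0.0, "throttle": 0.0, "steer": 0.0, "horn": False}

# A truck 4 m away closing at 2 m/s (TTC = 2 s): stop and honk.
print(emergency_policy([Detection(4.0, 2.0, 0.9)]))
```

The interesting design choice is what's absent: with no swerve branch, a bad sensor reading can never steer the vehicle into a real obstacle.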
A horn needs to be a horn. Add a white-noise generator too if you like, but the recognised sound, the one people are "programmed" to react to, is a car horn.
Aside: emergency vehicles in the UK seem to have stopped using the broadband-noise sirens, AFAICT. I don't know why, but the reason may be pertinent to any attempt to use them here. They also often drive without the siren at all, which seems completely ridiculous to me: they switch it on at the junction, leaving no time to get out of the way, whereas if they used it continually you could hear them coming.
I'd love to show you the video, but it's been removed. It showed a Starship Technologies sidewalk delivery robot attempting to cross the road when a car making a right turn through the crosswalk hit it in a low-speed collision. The interesting thing is that the little robot attempted an evasive maneuver, stopping and quickly backing up to avoid the car. It ultimately failed, but at least it tried.
So Navya, the French company that has deployed several of these slow-moving autonomous shuttle buses, does not appear to have such sophisticated behavioral algorithms, in spite of entertaining ambitions of building proper robotaxis.
> I think there are too many possible risks in programming the vehicle to do anything except stop dead in an emergency situation: you rapidly get into very complex programming with all kinds of failure modes
Perhaps that limitation should disqualify AI from driving cars.
From "What Makes for a Street Legal Vehicle?":
Horn – It may not seem the most important piece of safety equipment, and many big cities even limit how it can be used, but to be street legal every vehicle must have a horn that is audible for at least 200 feet. The horn can generally be any note or sound (even ones that play musical tunes are usually permitted), so long as the minimum volume requirements are met.
Ref: https://www.hg.org/article.asp?id=31563
Sarcasm aside, who is deemed to be in control, and who do you sue for the damage if this vehicle crashes into you? Isn't it the manufacturer that causes the crash (not necessarily in this case; I'm talking generally)?
Ugh. This really worries me - not because I’m afraid of driverless cars but because this is the kind of “news” headline that will get anti-driverless car jerks all up in a righteous tizzy.
Title should be NO_TITLE because “doofus driving truck backs into something” isn’t news.
Counterpoint: as technical people, we can agree that there is probably a better response to an impending accident than sitting still like a deer in the headlights.
In fact, this article could start a discussion of whether the risks of active accident-avoidance measures (e.g. running a stop light at a clear intersection if you're about to be rear-ended) might cause more problems than they solve.
The title needs to be changed to "Human in truck backs up and hits vehicle (self-driving)", no?
Because that's what happened. A human caused a wreck. Doesn't seem as scare-mongery now, eh?
A persistent vision system, like 360-degree radar, would have been fully aware of a vehicle in any direction and would not have hit it. A human, on the other hand...
At the same time, there are issues that need to be addressed or forced. I'm still waiting for mean-spirited drivers to start bullying self-driving cars, for instance commuters cutting them off knowing the cars will stop to avoid an accident.
Which leads to another question: will driverless cars report bad drivers to the police, making tech companies police informants? Will laws be passed forcing this behavior?
These things will need to cope with the presence of real humans, or else it's never gonna work. See how far this attitude of "the computers would be fine if it weren't for the stupid pesky humans" takes you.
The anti-driverless-car “jerks” (as you have labeled them) will get into a righteous tizzy whether or not events like this happen. However, I'd characterize your response as emotional, and a data-driven response is what's appropriate. So far, autonomous vehicles have a better safety record than human-driven vehicles. That's the message here. Clouding it with emotional responses and demonizing opponents only distracts from that fact.
This feels like a really tricky question, actually -- what to do when a moving object is headed towards a stopped self-driving car?
It's easy to say the car should be smart enough to move -- but what if, as it moves in one direction, the object (like a truck trying to avoid the car) suddenly swerves in that direction too? Then does the car become responsible for the collision?
And of course, it feels like there could be a real-world version of the trolley problem [1]: what if there are 5 occupants in the vehicle who will be killed by an oncoming truck, but the only direction in which it can move out of the way would require running over a single pedestrian?
Glad I'm not the one having to make these kinds of programming decisions.
[1] https://en.wikipedia.org/wiki/Trolley_problem
I would guess that it's not a tricky question at all for the company, because the public reaction to a driverless car being hit while stationary is very different from the public reaction to a driverless car being involved in a collision while moving. People accept the former as a cut-and-dried situation but treat the latter as a he-said, she-said situation where they're free to project their own biases onto the situation. (Was the computer really not at fault? Like the globalist media would ever tell it straight when it comes to robots putting people out of work. Tech companies are never held responsible. Et cetera.)
Also, this kind of story is new and interesting right now, so people will pay attention to the details; that's how they'll make up their minds about later stories, which they won't bother to read because it will all be boring by then. It's amazing luck for the company to get a "hit while stationary" story right at the get-go, helping groove the narrative that driverless cars are safer than human drivers.
From a public perception standpoint it would be so helpful (dons tinfoil hat) that you could almost imagine them arranging some accidents of that kind. I think it would be a dumb chance to take, so I don't believe that's what happened, but I bet they're fist-pumping over their good luck.
There was someone on twitter who very effectively demolished all this "trolley problem" speculation by pointing out two things:
- the knowledge of the world held by cars is partial and probabilistic, as it is with real humans, only with very different and obscure failure modes (see, for example, "adversarial images").
- if you get into this situation, something has already failed. Either in your safety practice or someone else's. That reduces the predictability of the situation and also the predictability of what your actions will do in that case.
Put those two together and you realise that any kind of "deliberate" (in the sense that one can impart "intent" to a computer system?) running over of a pedestrian is indefensible, and that real situations don't admit of neat counter-factuals like this.
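To make the "partial and probabilistic" point concrete, here is a toy calculation with invented numbers; no real perception stack reports anything this clean:

```python
# Toy illustration: the "choice" the trolley framing imagines doesn't exist.
# A real stack has no "pedestrian on the left", only a noisy label
# distribution per track. All numbers below are invented.
track_left = {"pedestrian": 0.55, "trash_bag": 0.40, "unknown": 0.05}
track_right = {"pedestrian": 0.52, "trash_bag": 0.43, "unknown": 0.05}

HARM = {"pedestrian": 1.0, "trash_bag": 0.0, "unknown": 0.5}  # made-up weights

def expected_harm(track: dict) -> float:
    """Expected harm of driving through this track under the made-up weights."""
    return sum(p * HARM.get(label, 0.0) for label, p in track.items())

print(f"left: {expected_harm(track_left):.3f}")    # 0.575
print(f"right: {expected_harm(track_right):.3f}")  # 0.545
# The options differ by ~3%, well inside sensor noise, and the estimates can
# fail in obscure ways (adversarial images). A "deliberate" swerve built on
# this is a coin flip dressed up as ethics.
```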
Lay on the horn for starters. That's what I've done in a similar situation and it worked. Also, just back up a bit (also done that). It's not hard for a person or a self-driving car to ascertain that you're gonna get hit if you stay where you are but that if you back up a little bit while honking you buy yourself more room.
It can't know that the truck is going to kill the 5 passengers, so it can't take aggressive preventative action; it has to do the conservative thing, which is to move as far out of the truck's way as possible without hitting anything.
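A sketch of that conservative move, assuming (hypothetically) that the vehicle can measure the gap to the threat and the verified-empty space behind it:

```python
# Back up a little while honking: restore a buffer, but only ever move into
# space already verified empty. Distances and defaults are illustrative.

def backup_plan(gap_to_threat_m: float, free_space_behind_m: float,
                desired_buffer_m: float = 2.0, margin_m: float = 0.5) -> dict:
    """Reverse just enough to restore the buffer, capped by the free space
    behind us minus a safety margin, and never a negative distance."""
    needed = max(desired_buffer_m - gap_to_threat_m, 0.0)
    available = max(free_space_behind_m - margin_m, 0.0)
    return {"reverse_m": min(needed, available), "horn": needed > 0}

# Truck tailgate 0.5 m away, 4 m of clear pavement behind us:
print(backup_plan(gap_to_threat_m=0.5, free_space_behind_m=4.0))
# -> {'reverse_m': 1.5, 'horn': True}
```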
In this case, it was a low-speed incident and the trucker was at fault. If the car didn't have the option to move backwards or otherwise avoid the incident without damage or injury, staying put was the best option. I don't see a problem here. Unless of course the truck was out of control and would've crushed the car.
There should be an emergency exit option on cars like this though.
> what to do when a moving object is headed towards a stopped self-driving car?
Nothing. Whoever is controlling that moving object is responsible. So, provided the car is not stopped on a railroad track (in which case, what the heck is it doing stopping on a railroad track?!), it should just stay there and not add variables to the situation by jerking around.
The Machine acted in a way that was not expected by the human driver.
This gets to a fundamental issue we will have in the transition to the utopian driverless future. Learning to identify and react to this new class of driver.
That doesn't seem like a fair summary. A human-controlled truck slowly backed into it and hit it. The autonomous vehicle was unable to react to the situation, though all the humans in it had plenty of time to see this coming and react.
I'd say the shuttle's insurer should be partly responsible here. The Engadget article seems to show the shuttle stopping in the truck's blind spot. [0]
A reasonable truck driver would assume the driver of the other car would back up a little bit. But in this case, the car stopped where the truck couldn't see whether it was backing up or not, and wasn't programmed to understand the truck's movements at this angle.
So, software was responsible for:
1. Stopping too close.
2. Stopping in the blind spot of a truck (see the sketch below).
3. Having no horn.
4. Not backing up or understanding a fairly common truck maneuver.
[0] https://www.engadget.com/2017/11/09/las-vegas-self-driving-s...
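As a thought experiment on points 1 and 2, here is a toy geometric check a planner might run before committing to a stop point. The cone geometry and thresholds are invented for illustration, not taken from Navya's software:

```python
# Reject stop points that are too close to the truck or inside the
# reversing blind-spot cone directly behind it. All geometry is invented.
import math

def stop_point_ok(dx_m: float, dy_m: float,
                  min_buffer_m: float = 3.0,
                  blind_cone_half_angle_deg: float = 15.0) -> bool:
    """(dx, dy): our position relative to the truck's rear, with the truck
    pointing along +x. Returns False if we'd sit too close (point 1) or
    where the mirrors can't see us (point 2)."""
    if math.hypot(dx_m, dy_m) < min_buffer_m:
        return False  # point 1: stopped too close
    behind = dx_m < 0
    angle_off_axis = math.degrees(math.atan2(abs(dy_m), abs(dx_m)))
    return not (behind and angle_off_axis < blind_cone_half_angle_deg)

print(stop_point_ok(dx_m=-4.0, dy_m=0.5))  # False: squarely in the blind cone
print(stop_point_ok(dx_m=-4.0, dy_m=3.0))  # True: offset far enough to the side
```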
> Now, it must be said that technically the robo-car was not at fault. It was struck by a semi that was backing up, and really just grazed — none of the passengers was hurt.
I get that it wasn't the driverless car's fault, but this brings up an important use case that driverless cars currently don't seem to be able to handle.
In an all-human situation, the parked car's occupant would honk at the vehicle trying to back into it, or open the door and yell at the other driver, and the accident would be avoided.
The driverless car doesn't (or didn't) honk even when it detects something backing into it. Hence the accident.
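That missing behavior is cheap to express. A hedged sketch, using a hypothetical 1-D track interface rather than anything Navya actually exposes:

```python
# Honk when a tracked object is predicted to reach our bumper soon.
# The 1-D interface and thresholds are hypothetical.

def should_honk(rel_x_m: float, rel_vx_ms: float,
                half_length_m: float = 2.5, horizon_s: float = 4.0) -> bool:
    """rel_x_m: object's distance from our center along its approach axis.
    rel_vx_ms: its velocity on that axis (negative = toward us)."""
    gap = rel_x_m - half_length_m  # distance from the object to our bumper
    if gap <= 0:
        return True   # already at or inside our footprint: definitely honk
    if rel_vx_ms >= 0:
        return False  # moving away or stationary: stay quiet
    return gap / -rel_vx_ms <= horizon_s  # honk if it reaches us within horizon

# Truck tailgate 5 m away, reversing toward us at 1 m/s (2.5 s to contact):
print(should_honk(rel_x_m=5.0, rel_vx_ms=-1.0))  # True
```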
I fully expect every death caused by a self-driving car to make national headlines in the next few years. Hysteria, whether justified or not, will set in, and legislation will pass banning autonomous vehicles in the US.
Self-driving cars may have a faster reaction time, but they will never reach the level of human awareness of their surroundings while driving.
Let's see a self-driving car navigate through a construction zone, watch for instructions from a police officer who is directing traffic, or stop when kids are playing baseball in a yard and the ball rolls across the street. Answering, "well they'll have that capability someday" isn't a very compelling answer. Truly self-driving cars are dependent on technology that simply hasn't been invented yet.
How nice it will be sitting behind a fleet of self-driving cars dragging their asses down the highway at exactly the speed limit, or slamming on the brakes when a leaf flies in front of the sensors.
In addition to that, do you think masses of people will be silent while losing their jobs because these robot overlords are taking the wheel?
Source: I work for a self-driving car startup.
Surely something as basic as typical traffic patterns in a town has been simulated? A vehicle coming at you is about as basic and mundane an everyday traffic event as you will get.
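It's easy to imagine what such a simulated scenario could look like as a regression test. A toy 1-D version of exactly this incident class (a truck reversing toward a stopped shuttle), with invented harness and policies:

```python
# Toy scenario test: a truck reverses toward a shuttle stopped at x = 0.
# The policy sees the gap and returns the shuttle's velocity (negative =
# back away). Everything here is invented for illustration.

def simulate(shuttle_policy, truck_x0=10.0, truck_v=-1.0, dt=0.1, t_max=15.0):
    truck_x, shuttle_x, t = truck_x0, 0.0, 0.0
    while t < t_max:
        gap = truck_x - shuttle_x
        if gap <= 0.0:
            return False  # collision
        shuttle_x += shuttle_policy(gap) * dt
        truck_x += truck_v * dt
        t += dt
    return True  # survived the scenario

freeze = lambda gap: 0.0                          # what the shuttle reportedly did
retreat = lambda gap: -1.5 if gap < 3.0 else 0.0  # back away when crowded

assert not simulate(freeze)  # freezing in place eventually gets hit
assert simulate(retreat)     # yielding ground keeps the gap open
print("scenario suite passed")
```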
It's in self-driving proponents' own best interest to have stringent standards, because if the public loses faith it's going to be an uphill battle.
Simply demonizing human drivers and hand-waving away errors is too self-serving to work.
> A City of Las Vegas representative issued a statement that the shuttle “did what it was supposed to do, in that its sensors registered the truck and the shuttle stopped to avoid the accident.” It also claims, lamely, that “Had the truck had the same sensing equipment that the shuttle has the accident would have been avoided.”
Note a subtle shift, with government now shilling for the driverless vehicles. That's the first time I've seen it; I suspect it won't be the last.
These crashes seem to be caused by the "weird" way self-driving machines behave as compared to humans.
Perhaps we need some sort of placard that is legally mandated and easily visible (like the "student driver" placard) to let people around these vehicles be aware of them and expect different behaviors than "ordinary" drivers.
If anything this just highlights to me how much commercial vehicles need self driving tech (I guess I, lamely, agree with the mayor).
My office used to be in an industrial district. The trucks there scare me. They absolutely do not follow the rules of the road, and it's dangerous for everybody around them.
Stuff like this: just backing up and expecting everybody to move out of their way, or taking a turn too tight and expecting everybody at the light to back up, were almost daily occurrences.
Yeah, self-driving cars need to account for this, but the bigger problem IMHO is getting those trucks and their drivers either off the road or in compliance with driving laws.
Hard to judge this without seeing it. A human driver might have been able to anticipate sooner what the truck was eventually going to do, so as to find a better place to stop. A human driver understands the difference between a truck backing blind and other vehicles.
To name just two potential issues that the "bus should have moved" people are ignoring:
1) What if there were pedestrians around?
2) What if moving the bus created an incursion into a stream of fast-moving traffic without warning?
Both of those could result in consequences far worse than the (apparently minor) fender-bender that actually occurred.
I agree that sounding the horn would be a good idea.
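Those two checks are straightforward to phrase as guard conditions, assuming (hypothetically) the planner can query pedestrian tracks and gaps in adjacent traffic before committing to an evasive move:

```python
# Guard conditions for an evasive move, mirroring the two objections above.
# The queries and the 4-second gap threshold are hypothetical.

def safe_to_move(pedestrians_in_escape_path: int,
                 adjacent_lane_gap_s: float,
                 min_gap_s: float = 4.0) -> bool:
    """Evade only if the escape path is clear of pedestrians (issue 1) and
    the move won't cut into fast-moving traffic (issue 2)."""
    if pedestrians_in_escape_path > 0:
        return False
    return adjacent_lane_gap_s >= min_gap_s

# Clear sidewalk, but only a 1.5 s gap in passing traffic: stay put and honk.
print(safe_to_move(pedestrians_in_escape_path=0, adjacent_lane_gap_s=1.5))  # False
```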
Clearly this company's tech isn't up to par.
A driverless car should be on par with the _BEST_ human driver, not the average driver; human drivers collectively get into some 6 million auto accidents per year.
Apparently, simply having the ability to stop does not make a self-driving car better than a human. It should evade, or at least honk.