There are at least two other trends that are, IMO, more to blame: SUVs in general, and the proliferation of a particular mixed-use road antipattern.
SUVs are an easy one. When an SUV hits a bicyclist, motorcyclist, or pedestrian, the unfortunate person tends to go under the vehicle rather than over the hood, as they would with a sedan. Motorcycle accident stats bear this out: the overall accident rate has gone down as people have started riding more safely, but the fatality rate has gone up as SUVs have taken over market share.
The roads issue is a bit more complicated, but basically roads need to do one of three things: be slow enough to safely share space (25 mph or below), be separated so that only cars can use them (freeways), or kill an alarming number of pedestrians and bicyclists. Four lanes and a 35 mph speed limit with infrequent crossings is pretty much guaranteed to have a body count.
I'm not trying to excuse Uber here, just trying to convince urban planners to stop building roads that encourage people to try to cross four lanes of traffic moving at 35 mph.
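To put rough numbers on why that 25 vs. 35 mph line matters, here's a back-of-the-envelope sketch. The 1.5 s reaction time and 7 m/s² braking deceleration are assumed typical values, not anything from the incident:

```python
def stopping_distance_m(speed_mph, reaction_s=1.5, decel_mps2=7.0):
    """Reaction-time travel plus braking travel, in meters."""
    v = speed_mph * 0.44704  # mph -> m/s
    return v * reaction_s + v * v / (2 * decel_mps2)

for mph in (25, 35):
    print(f"{mph} mph: ~{stopping_distance_m(mph):.0f} m to stop")
```

Roughly 26 m at 25 mph versus 41 m at 35 mph: the extra 10 mph adds about 15 m of travel before the car can stop, which is most of the width of a four-lane road.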
Some of the commenters here think they can do a better job reconstructing the incident from newspaper cartoons than actual investigators who are working at the scene. You guys can just as well make up a story in which a murderous Uber robot chased a pedestrian off of the sidewalk and onto the road and then intentionally ran her over.
Why are we resorting to reconstructing a scene when Uber's Robot caught the whole thing on camera and LIDAR? The police statement seems quite premature, especially when the NTSB is running an investigation into the accident.
I think part of the problem is that almost all of the news articles have painted a picture of the fault lying with the biker. There seems to be no one defending her, and it's ridiculous that the news articles are already one-sided, considering that this is a brand-spanking-new type of thing to have happened on this planet.
That being said, I'm totally open to this being the fault of either Uber or the woman, but there are too many questions in my head to simply accept that it was the fault of the woman.
Are we saying that at this point we're going to let the initial police statement win because they are "experts" at driving? Are they also experts at the software in SDVs? I think at this point we shouldn't assume either story, but at the same time we can't let Uber, the police, or governments cover this story up for the sake of money and politics.
I think there are some pretty massive implications that should come from this. Even if a human driver may very well have killed this person too (which I don't personally believe), don't we need to take a minute and ask a few questions? Why did the vehicle reportedly not even slow down after the hit? Why is the dent on the RIGHT-hand side of the car (.2 seconds seems false)? Would a human driver have slowed much earlier? Was the pedestrian/biker expecting the car to slow, since she clearly had a hand signal up, after seeing what looked like a driver in the car?
There are too many questions and I hope that this specific story is treated with the utmost careful consideration. I for one can't seem to let it go in my head.
Someone died and it very well could have been because of our collective ego that we can accomplish this (SDVs) at this point in our history. We'll never hear from this woman again.
This is the first report I've seen that indicates the Uber AV was in the right lane. Remember that the victim was crossing from left to right, and that all of the visible damage on the AV is on the right-side bumper. It's very hard to imagine a scenario in which a 49-year-old woman walking her bike manages to cross 3 lanes of traffic so quickly that the Uber AV, moving at 40 mph, had no time to react, in a location with good street lighting and with clear weather. I would think most humans would be able to at least hit the brakes, if not completely avoid a collision.
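For a sense of the timescales involved, here's a sketch with assumed values (3.5 m lanes and a 1.4 m/s walking pace, neither of which is from the report):

```python
LANE_WIDTH_M = 3.5            # assumed typical lane width
WALK_SPEED_MPS = 1.4          # assumed pace while pushing a bike
CAR_SPEED_MPS = 40 * 0.44704  # 40 mph in m/s

time_exposed_s = 3 * LANE_WIDTH_M / WALK_SPEED_MPS  # time spent crossing 3 lanes
car_travel_m = CAR_SPEED_MPS * time_exposed_s       # car's travel in that time

print(f"crossing 3 lanes takes ~{time_exposed_s:.1f} s")
print(f"in that time a 40 mph car covers ~{car_travel_m:.0f} m")
```

In other words, if she entered the roadway while the car was anywhere within about 134 m, she was in view for several seconds, not a fraction of one.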
I haven't seen anyone post this yet, but I strongly suspect that the fact she was pushing a bicycle impacted the machine-learning-driven AI's ability to determine that she was a pedestrian. This reminds me of the kangaroos messing up the AV programs being tested in Australia. A human paying attention might have been wary of a person pushing a bike in the shadows, but for all we know the algorithm thought she was a bush or something because her profile was altered by the bike. There is a lot to speculate about, but machine learning isn't as smart as we humans tend to believe it is, and it is nowhere near the form of general intelligence required to respond appropriately to all of its inputs the way an attentive human could.
Weird scenarios that can't be predicted are important to consider. Waymo presentations like to talk about a situation they ran into where the car needed to stop because it encountered an old woman in a wheelchair chasing a turkey with a broom. As I understand it, most of these companies sensibly say "unknown weird thing in/near the road means stop the car."
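That policy can be sketched as a trivially conservative planner rule. The labels, the 0.8 confidence threshold, and the function shape here are all hypothetical, not anyone's actual stack:

```python
def planner_action(label, confidence, in_path):
    """If anything unclassified or uncertain is in the travel path,
    brake rather than guess about its future motion."""
    if in_path and (label == "unknown" or confidence < 0.8):
        return "brake"
    if in_path:
        return "yield"  # recognized actor in the path: slow down and give way
    return "proceed"

# A woman in a wheelchair chasing a turkey with a broom classifies badly:
assert planner_action("unknown", 0.31, in_path=True) == "brake"
```

The point of the conservatism is that braking for a misidentified bush is cheap, while proceeding past a misidentified pedestrian is not.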
Given that another comment says she "suddenly" crossed three lanes of traffic and was on the far side of the car when she was hit, my suspicion is that the car thought she was riding the bike. Of course all of us computers know that bikes always go over 5 mph so surely she'd make it across. Why brake?
I want safe self-driving cars, when they are safe. But Uber's clearly-established cavalier attitude toward human beings apparently can't be trusted with self-driving cars. I'm disappointed that politicians thought they could.
I don't understand this line of reasoning at all, but it's getting repeated a lot on HN. Is the AI expected to hit cyclists, but not pedestrians? Shouldn't it consider any moving object to be a hazard?
I don't see how this is relevant. The car should avoid hitting ANYTHING. It shouldn't have to recognise what the object is if it's in the path of the car.
When I was getting my learner's permit I was practicing on a curvy mountain road. I saw a person with a camera phone on the "drop off" side of the road, taking a picture of what, from my angle, would've been a boring cliff face. But I knew that a camera-phone photographer was probably actually photographing a human, who would likely be on the other side of the blind corner. So I slowed down, despite my instructor chastising me for doing so. When we came around and there was a person in the road, she asked, "How did you know?" If a self-driving vehicle can't make predictions about "irrational" pedestrian behaviour like this and slow down, it shouldn't be on the road.
Your own story disproves your conclusion. Your driving instructor, like many (many) other drivers, didn't make the inference that you did. People don't drive well. Cars are dangerous.
The criterion (really the only criterion) for whether or not automatic vehicles "should be on the road" is whether or not they are safer than the alternative. And that certainly doesn't include "being as good a driver as jdavis703 was in this particular anecdote".
The self driving accidents might be the easiest for humans to avoid, but if accident rates are lower for self-driving cars, it's still worth it. Let's look at overall numbers and severity, not how avoidable it would be for a human.
I wouldn't particularly blame your instructor; a lot of people would have reacted the same way, which to me proves that it's not just AI, it's the roads and the driving that are unsafe. As pedestrians, we learn what's dangerous and what we can't do, and as drivers, we infer pedestrians' behavior from our own knowledge. But both AI and humans are subject to occasional 'irrational behavior'. Humans are not safe in that situation either. Even if in some cases a human might have an advantage over AI, AI has many other advantages over humans. It would be a big mistake to reject AI just because there are situations where its performance can be inferior to some humans'.
I agree there's still a lot of work to be done, but my point is that the problem is the driving system itself, and a lot of people die because of it. AI can't directly change that, but if it can still do better than humans (in a few years), then it's worth it.
A proper (e.g. non-Uber) self driving car would never take a blind corner at a speed that doesn't allow it to come to a complete stop before it reaches its vision range.
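That rule is just stopping distance solved for speed. A sketch, assuming a 1.0 s machine reaction time and 7 m/s² braking (illustrative numbers only):

```python
import math

def max_safe_speed_mps(sight_dist_m, reaction_s=1.0, decel_mps2=7.0):
    """Largest v satisfying v*t + v^2/(2a) <= d, via the quadratic formula."""
    t, a, d = reaction_s, decel_mps2, sight_dist_m
    return a * (math.sqrt(t * t + 2.0 * d / a) - t)

# With only 30 m of clear sight line around a blind corner:
v = max_safe_speed_mps(30)
print(f"max safe speed: ~{v / 0.44704:.0f} mph")
```

With a 30 m sight line the cap works out to roughly 33 mph; halve the visibility and the safe speed drops much faster than you'd intuit, since braking distance grows with the square of speed.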
As a side note, I find - as always - this part disturbing:
>The driver, Rafael Vasquez, 44, served time in prison for armed robbery and other charges in the early 2000s, according to Arizona prison and Maricopa County Superior Court records.
He served time in prison for armed robbery more than 15 years ago; what relevance would this have?
Presumably he has a valid driving license issued by the State, and was not under the effect of alcohol or drugs.
Having committed armed robbery seemingly has no connection (it isn't as if he was convicted of killing someone while driving, or something like that), and even if that were the case, some state must have issued (or renewed) his driving license, meaning that he was legally authorized to drive the car.
I imagine they're trying to imply that someone who was convicted of a serious crime could still be an unreliable, irresponsible, or untrustworthy person.
It's another question whether it's fair to assume that. I don't think it is, though personally I think it's reasonable to take the information into account as long as you restrain yourself from jumping to any conclusions.
Call me crazy but isn't a big part of the promise of these systems supposed to be they see things humans wouldn't or couldn't? If they're just as surprised as the human behind the wheel, that feels like a problem.
Yes, the report you'd hope to be hearing from Uber is along the lines of "0.2 seconds after the pedestrian entered the travel lane the automated system identified her as a hazard and initiated an avoidance maneuver. Despite braking and beginning to swerve away, the car impacted the pedestrian 1.2 seconds later." That's the promise the automated-vehicle people are selling. But in this case it seems the car was clueless.
If the woman suddenly walked in front of the car at the last minute, I don't see how the situation could have been avoided, no matter how fast a computer is able to process the information from the sensors.
There was also a driver behind the wheel, so it would suggest that humans are just as clueless.
I'm really skeptical about swerving as an evasive maneuver. It's complicated to get that right. But yes, if it didn't at least slam the brakes then something went badly wrong.
Maybe you can't prevent an accident, but you can at least reduce momentum and demonstrate that your system was working correctly.
I'm not in the field, but I assume the AI/ML can't anticipate possible road hazards within the human peripheral-vision area.
Such as a kid chasing down a ball that is rolling towards the road (an object on a collision-course path), especially when the ball or the kid is suddenly hidden from view by a parked car (the real environment vs. the visible environment). Or a group playing basketball in a driveway. Both are cases where slowing down is always the safer bet.
A 4000 lb SUV, traveling nearly 40mph, at night, on a 4 lane divided highway, hits a pedestrian walking a bike.
The kinetic energy mismatch is the real problem, and at the very least, these companies should be testing at only 20-25mph, with _much_ lighter vehicles.
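Rough numbers for that mismatch (the 1,000 kg lighter-vehicle figure is hypothetical, not any real test platform):

```python
def kinetic_energy_j(mass_kg, speed_mph):
    """Kinetic energy in joules for a given mass and speed in mph."""
    v = speed_mph * 0.44704  # mph -> m/s
    return 0.5 * mass_kg * v * v

suv_j = kinetic_energy_j(1814, 40)    # ~4000 lb SUV at 40 mph
light_j = kinetic_energy_j(1000, 25)  # hypothetical light test vehicle at 25 mph
print(f"{suv_j / light_j:.1f}x the energy to dissipate on impact")
```

Mass scales the energy linearly but speed scales it quadratically, which is why the 20-25 mph cap would matter even more than the weight reduction.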
We'll have to wait for the NTSB to check in, but I'd be surprised if Uber isn't shut down (at least in Tempe) for a good long while.
Agreed. While I understand it's much easier to strap sensors/equipment onto an already-available car, it makes more sense to me from a safety point of view to test with some lightweight shell of a vehicle, to cause the least damage to other entities.
Oh it's her fault she was jay walking and the robots now take precedence. No need to ticket jay walkers! Uber's fleet will take care of these law breakers!
Ridiculous. This company, after all it's done, is still around, and now it's killing people!
> "The driver said it was like a flash, the person walked out in front of them," Moir said, referring to the back-up driver who was behind the wheel but not operating the vehicle. "His first alert to the collision was the sound of the collision."
I cannot imagine the detailed logging the engineers might have to do in such a system. When I code I wonder sometimes if I am logging unnecessary events at info level.
This leads to one more question: do driverless cars have (or will they have) a black box like that of an aeroplane?
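Aircraft recorders keep a rolling window of recent telemetry and freeze it when something happens. A toy sketch of the same idea (nothing here reflects any actual AV logging system):

```python
from collections import deque

class BlackBox:
    """Rolling recorder: keep only the most recent events, freeze on impact."""
    def __init__(self, max_events=10_000):
        self._buf = deque(maxlen=max_events)  # oldest entries fall off the back
        self._frozen = False

    def record(self, t, channel, value):
        if not self._frozen:
            self._buf.append((t, channel, value))

    def freeze(self):  # e.g. triggered by an impact sensor
        self._frozen = True
        return list(self._buf)
```

A real system would persist to crash-hardened storage, but the bounded deque captures the flight-recorder property: fixed space, always the last N events, nothing overwritten after the trigger.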
It's interesting how high-profile this post-crash analysis is: name another time you've read so much commentary about the details of a car crash.
It seems to me that this is exposing a few gaps in how we think about driverless cars currently:
- A framework for how cars should be making "moral" decisions (the trolley problem [0])
- A defined process for post car crash investigations - akin to the process in air crashes
It will be interesting to see if these emerge soon (or are already emerging and I have missed them).
Self-driving cars are still a fantasy. I don't want the AI to be comparable to an average driver (drivers collectively get into 6 million accidents). I want the AI to meet or exceed the skills of the best driver. There is no way anyone would trust an "average" driver to pick up their kids over trusting themselves.
> There is no way anyone would trust an "average" driver to pickup their kids, more than themselves.
I mean, most parents will—at some point—trust their teenage children to pick up their younger siblings. And teenagers are decidedly below-average drivers.
The average driver is pretty terrible though. Average is a pretty low bar to clear, and I think there's a decent argument to be made that Waymo's disengagement numbers are already better than an average driver's.
Did the car stop itself after the accident? Are autonomous cars programmed with a "we've just hit something, stop and pull over" mode? Which sensors on the car even know if it has hit something?
ooo, there's video of the turkey wheelchair broom chase: https://www.theguardian.com/technology/video/2017/mar/16/goo...
Please remove the software mindset when looking into this.
Someone is dead and there may well be one or more trials as a result. Facts will take as long as they take to surface.
[0] https://qz.com/1204395/self-driving-cars-trolley-problem-phi...
1. She was walking from left to right.
2. The dent was on the right side of the car
If she had to speed up like that, she probably knew she was about to be hit, AND... it was probably more than 0.2 seconds of total visibility to the SDV.
On average...
Unless they're picking up their kids from Lake Wobegon Elementary or something...