> Anthony Levandowski, the Otto founder who became head of Uber’s self-driving efforts last summer, also shares Kalanick’s brash disregard for regulatory authority. Bloomberg reported earlier this month that a running joke at Otto was “safety third.”
Uh oh.
The California DMV won't yet allow testing of autonomous vehicles heavier than 10,000 pounds on California public roads. Otto is thus testing in Arizona. Looks like that was a good decision by DMV.
I wrote this on HN yesterday:
An important question is whether their system is smart enough to take evasive action.
It's becoming clear that there are two ways to approach self-driving. The first stems from the DARPA Grand Challenge, which was about off-road driving. For that, the vehicles had to profile the terrain, plotting a path around obstacles, potholes, and cliff edges. The GPS route was just general guidance on where to go. That's the approach Google took, as can be seen from their SXSW videos. Google also identifies moving objects and tries to classify them. With all that capability, it's possible to take evasive action if some other road user is a threat. The control system has situational awareness and knows where there's clear space for escape.
The other approach is to start with lane following and automatic cruise control, and try to build them up into self-driving. This can be done entirely with vision systems. That's the Cruise Automation and Tesla approach. This puts the car on a track defined by lines on the pavement, with lane changes and intersections handled as a special case. There usually isn't a full terrain profile; that requires LIDAR. So there isn't enough info to plan an emergency maneuver for collision avoidance.
This distinction is not well understood, and it should be.
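To make the distinction concrete, here's a minimal sketch (the grid representation and all names are invented for illustration): a planner that maintains an occupancy profile of the road ahead can search laterally for free space to escape into, while a pure lane-follower has no such map to search.

```python
# Illustrative sketch: why a terrain/occupancy profile enables evasive planning.
# A planner with a map of which lanes are blocked can search for clear space;
# a lane-follower that only tracks painted lines has nothing to search.

def find_escape_offset(occupancy, current_lane):
    """occupancy: list of lanes, True if blocked ahead, False if clear.
    Returns the nearest clear lane offset from current_lane, or None."""
    if not occupancy[current_lane]:
        return 0  # current lane is clear; no maneuver needed
    # Search outward: nearest adjacent clear lane first.
    for distance in range(1, len(occupancy)):
        for offset in (-distance, distance):
            lane = current_lane + offset
            if 0 <= lane < len(occupancy) and not occupancy[lane]:
                return offset
    return None  # boxed in: no clear space to escape into

# Three lanes, obstacle ahead in the middle lane: swerve one lane left.
print(find_escape_offset([False, True, False], current_lane=1))  # -1
```

A vision-only lane-follower, in this picture, never builds the `occupancy` list at all, so the search is simply unavailable to it.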
Maybe I'm misunderstanding, but your last paragraph seems to imply that the "Tesla approach" is unable to "plan an emergency maneuver for collision avoidance".
If true, that would seem to mean that Tesla's self-driving car program is dead on arrival. Clearly emergency collision avoidance is not optional for a self driving car with mass adoption. That's a pretty serious claim to make.
>Bloomberg reported earlier this month that a running joke at Otto was “safety third.”
"Safety third" is a Burning Man-ism. It's kind of complex to even explain the context for this, but it definitely doesn't mean what it sounds like.
An example of this would be: my friends and I build fire effects. Some of the guys on my team are professional propulsion engineers at a company you have heard of, and they build rockets. So when we're building little flame throwers and they say "safety third", they're saying it as a joke, yeah, but their version of "safety third" is "This is only 20x overbuilt, not 50x".
To somebody not familiar with how that term is used, it definitely does sound bad, though.
> Recode also obtained documents showing that Uber’s self-driving cars currently need to be handled by their human safety drivers roughly once every 0.8 miles.
Uber is that bad? That's terrible. They shouldn't be off the test track yet. They should be testing at GoMentum Station, which is the former Concord Naval Weapons Station north of Oakland. That place was built to store battleship ammo; if anything bad happens there, there's nobody close enough to get hurt.
Google/Waymo's California DMV disengagement report for 2016 reports 0.20 disengagements per 1000 miles, or one disengagement per 5000 miles.[1] Google's cars drove 635,868 miles autonomously in 2016, so there's enough driving behind this for that number to be meaningful. This is 4x better than 2015, incidentally. CA DMV accident and disengagement reports are worth reading, since they're one of the few objective data sources available on automatic driving. So far, nobody has been willing to turn a self-driving car over to Top Gear or Road and Track.
Google/Waymo is thus over three orders of magnitude better than Uber, and about two orders of magnitude better than anybody else who tests in California.
[1] https://www.dmv.ca.gov/portal/wcm/connect/946b3502-c959-4e3b-...
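As a sanity check on those figures (taking the 0.8-mile Uber number from the Recode report as given):

```python
# Waymo's 2016 CA DMV report: 0.20 disengagements per 1,000 autonomous miles.
waymo_miles_per_disengagement = 1000 / 0.20   # 5000.0 miles between disengagements

# Recode's reported Uber figure: one intervention roughly every 0.8 miles.
uber_miles_per_disengagement = 0.8

ratio = waymo_miles_per_disengagement / uber_miles_per_disengagement
print(ratio)  # 6250.0 -> between three and four orders of magnitude
```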
Is anyone interested in making a public bet with me about autonomous vehicles? I'm thinking something like, we publicly state that within 5 years a self-driving car will be able to take me from my apartment in Boston to my work in Cambridge, regardless of weather, time of day, etc. If it can, I pay you $1000, if it can't, you pay me $1000.
Basically I'm getting super annoyed at people claiming self driving cars are right around the corner, which I've been reading for a decade, but it's always "just 5 years away". I would love to counter arguments with, "If you are so convinced why not bet me $1000?"
> will be able to take me from my apartment in Boston to my work in Cambridge, regardless of weather, time of day, etc.
Having driven from my apartment (at the time) in Boston to my job in Lexington going through Cambridge every day for almost two years... I will NOT take that bet and I'm pretty optimistic about self driving cars :P
But, they don't need to replace all driving overnight. They just need to replace a portion of driving to have a big impact.
Although, I often wonder whether driving, like a muscle, can atrophy, and whether we end up with a situation like: "the car will drive me in fair weather but I need to drive myself when it snows"... whether my body will forget how to drive by the time winter comes.
That's difficult. People who talk about "self-driving cars 5 years away" aren't usually talking about mythical Level 5 self-driving that is actually 40 years away. They are talking about Level 3, which basically means "end-to-end travel, in 90% of situations".
Just like how 1/10 times you might not be able to flag down a taxi cab, today, 1/10 times you won't be able to get the self driving taxi to go EXACTLY where you need to go.
But it will work for most situations, most of the time, and that's good enough. (I.e., imagine if 90% of taxi drivers or truck drivers were made obsolete. That is industry-destroying.)
I mean there's a rational financial reason why someone optimistic about self-driving cars would be hesitant to take your bet -- they're magnifying their risk wrt self-driving cars, so if the technology doesn't come to fruition they're doubly screwed. If anything they'd rather take the opposite side of the bet so they can hedge.
If I thought autonomous cars would be ready in 5 years, I'd be extremely hesitant to take that bet, specifically with regard to weather.
Personally, just around the corner means 10 years in the top 50% of weather. Operating in a blizzard may be very far away, and I'm fine with that. Would I take this bet? Probably not, but I'm passively optimistic.
Easier to pull the cars from the market and have less attention paid to the Waymo suit. Good, bad, or meh, reports will mention the lawsuit every time, and that's a larger PR nightmare than this right now.
Makes sense to me. Every bit of "Uber self-driving car" press is going to mention that nasty Waymo lawsuit. At this point, Uber has got to be looking to stop the bleeding.
It's weird that there is no way for regulators to know how good a self-driving car really is; there might be many hours of driving, but it's not like an aeroplane, where you can get coverage of most conditions. New conditions can come up all the time.
Add to that the fact that Tesla has the richest dataset, and they have a huge advantage in this area, one that will reap rewards and crash less than the competition. I think Uber and Volvo will be far behind.
What are the challenges behind creating a licensing test similar to the ones we take as humans (it could be virtual)? This would come before allowing the software on the road.
Reading the comments below, this looks like a rush to judgement, as I've seen other articles suggesting that the Uber car was hit by another driver who failed to yield. Should driverless cars be under scrutiny for every counterpart human error?
I imagine this is a response to the awful press around the company the past several months. The Uber of 2015 would not have taken these cars off the road.
I think the odds that Uber would make money developing self-driving cars were approximately zero. Look where Tesla Motors is after fourteen years in terms of producing and selling automobiles, and without internal competition from another primary line of business.
Self driving vehicles will have the same profile as the existing automotive industry: high capital costs and commodity margins. The existing automotive mega-corporations already have the capital investment and distribution networks like ships and rail cars and lots and auto-carrier semi-trailers. Tesla is building some of that and the jury is still out on whether Tesla will make a meaningful dent.
Even the idea of Uber rolling out a self-driving fleet means a massive infusion of capital that is at odds with its current cost structure. And the existing automotive mega-corps can step into the on-demand business with a cost advantage in regard to rolling out vehicle fleets.
As much as I dislike Uber, I hate the uniformly negative tone the media has switched to now that it's the popular thing, after previously singing Uber's praises. The same goes for the possibilities of autonomous driving. Like Theranos, it's not as if many of these issues weren't there for some time, but we've magically transitioned from them being visionary and daring to evil.
I think at this point it is clear that the main problem with self-driving cars is not the technology but the social, psychological, political, and legal challenges. People often forget that having good tech is just one step, and it does not guarantee any commercial success.
People don't change their habits just because new tech is available. People are not going to start buying or using self-driving cars just because the tech exists. There is a huge psychological and social environment around car driving - e.g. think about all the legal paperwork you have to go through to start driving, learning how to drive, buying insurance, all the traffic regulations. All these cultural/human factors are based on a specific definition of what a car is, and they all assume there is a human operating the car. If self-driving cars are really going to succeed, all of this huge social framework would have to change. I don't think this will happen. I think there are simply too many strong human emotions and purely financial interests around cars.
> Recode also obtained documents showing that Uber’s self-driving cars currently need to be handled by their human safety drivers roughly once every 0.8 miles.
That's really far from autonomous.
EDIT: Here's the Recode article:
https://www.recode.net/2017/3/24/14737438/uber-self-driving-...
and things just get worse. The cars only make it an average of 2 miles between incorrect sudden movements, down from 4 in January. What a joke.
Seems odd that they'd make this move unless it was to avoid more negative press for continuing the program.
https://www.youtube.com/watch?v=RY93kr8PaC4
But that trademark Volvo understeer doesn't like to powerslide.