From a strategy standpoint, Uber is in a desperate position. After the Google lawsuit, their self-driving story is not working. Their existing business is losing money in most places with no end in sight. Top-level executives are leaving, which is a sign that they can't imagine a positive course in the medium term.
The company is under extreme pressure and has proven over and over again that it is willing to cut corners when it comes to legal compliance. It should be frightening to anyone on the streets that Uber managers are making decisions to put self-driving cars on the road. They have neither the culture, the patience, nor the expertise to make decisions affecting life and death.
I'm surprised we don't hear more about Mercedes. I don't know how much of a story they have when it comes to self-driving cars, but they already build cars, and they already have a well-established rent-by-the-minute business: car2go.com.
If and when they figure out self-driving cars, all they have to do is slowly replace the human-driven car2go fleet with self-driving cars and they'll become a leader in that market without any fuss!
>It should be frightening to anyone on the streets that Uber managers are making decisions to put self-driving cars on the road.
Something that probably should have been in the article's title is that this accident occurred because the human driver in the other car failed to yield. While Uber is certainly a very flawed company, neither its technology nor its people were at fault here.
The worst part of Uber's situation is that they have priced themselves out of any reasonable path to further fundraising. Even a successful IPO would be a huge down round.
I've never understood how valuations computed from white powder are strategically in the best interest of... anyone. Maybe someone more clueful about the nuances of VC can enlighten me. Even if Uber can make the Google suit and the harassment culture go away, I still don't see how they can ever justify that valuation. Every car company is about to ship (or already has) some form of auto-drive, and there are no strong barriers to competing with Uber on ride sharing. Transportation is a commodity.
Abusing legal gray areas is a fundamental part of their business model. They were able to position this in a positive 'move fast and break things' light. Now people are noticing things are broken and are fixing them.
For example, they were abusing an exemption that lets businesses grossing under $30K a year avoid paying taxes in Canada. That law is now being reviewed.
As far as I can tell, Uber's corporate culture is at once self-interested, self-righteous, and entirely inconsiderate of the people and markets it affects. You can only piss in the pool for so long before it catches up with you, and they don't have any other runway left.
We have some of the greatest minds in the world collaborating, and this is one of their primary ambassadors.
If Uber were a publicly traded company I'd be buying as much stock as possible right now. All I see is total hysteria over completely surmountable setbacks. People want your statements to feel true because they have beef with Uber's culture, business practices, etc., so they're bending their perception of every event to fit their model. This is a discussion thread about a self-driving car that was, according to the evidence so far, hit by another driver who failed to yield. And basically every comment is talking about how reckless and irresponsible the company is, and intimating that it was probably Uber's fault. Everyone is just having the conversation they want to have, regardless of how it comports with reality. "Their existing business is losing money in most places with no end in sight." According to whom, you? Why should I be even slightly credulous that this company, with massive market penetration and a product people want, has no wherewithal to turn a profit?
This whole thread is pure platitude. "Kalanick and his bros are sociopaths". Sure. Keep operating in those terms and see where it gets you. I'm going to stay here on planet Earth, thank you.
"However, police said there was a passenger in the self-driving car. The person was behind the wheel but it's unclear whether they were controlling the SUV or not."
Here's the real question: do you trust Uber to do this (or really anything) the right way, or do you think they'll take whatever shortcuts they can? Suppose someone finds a huge bug in the code and says, "Well, we need to completely rewrite this part or it's never going to work. The rewrite is going to take about 10 months." And someone else says, "We can hack in a patch that'll work most of the time in a week."
Which road is Uber going to take in this case? I know what their past performance has shown: they'll take the shortcut. Have they ever done things the hard way?
It seems to me the best way to build a case for autonomous driving would be for Uber, Lyft, etc. to give incentives to drivers whose cars have safety features like automatic emergency braking, then quietly build up a database of accident-rate statistics. These services could be a great way to collect this sort of data and see which approaches improve safety.
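As a toy sketch of the kind of comparison such a database would enable, here is a minimal aggregation over invented ride records (the schema and every number below are made up for illustration; a real analysis would need far more data and control for confounders):

```python
# Hypothetical sketch: compare accident rates for rides in vehicles
# with vs. without automatic emergency braking (AEB).
# All records below are invented for illustration.
from collections import defaultdict

rides = [
    # (vehicle_has_aeb, miles_driven, accidents)
    (True, 12000, 1),
    (True, 9500, 0),
    (False, 11000, 2),
    (False, 8000, 1),
]

totals = defaultdict(lambda: [0, 0])  # has_aeb -> [miles, accidents]
for has_aeb, miles, accidents in rides:
    totals[has_aeb][0] += miles
    totals[has_aeb][1] += accidents

for has_aeb, (miles, accidents) in sorted(totals.items()):
    rate = accidents / miles * 1_000_000  # accidents per million miles
    label = "with AEB" if has_aeb else "without AEB"
    print(f"{label}: {rate:.1f} accidents per million miles")
```

Note that normalizing by miles driven rather than by ride count matters: a fleet whose safer cars also drive longer routes would otherwise look worse than it is.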
I am skeptical of unmanned taxis for reasons having nothing to do with autonomous driving. What do you do if a passenger gets sick, passes out, starts acting violently, or has a heart attack? These are all things regular taxis encounter. Basically, you need a person to monitor the vehicle, the passenger, and the contents even if the car is doing the driving much of the time. Plus, pleasant conversation if desired.
If the wrong way is safer than the current way, I'll take the wrong way any day of the week, and I'll celebrate when someone comes along and competes with them by doing it the right way.
If you didn't click through to the article, you wouldn't have learned that Uber hasn't finished its own investigation, and considers the accident serious enough to suspend its own trials in both Arizona and Pittsburgh.
More details would be helpful. If this goes to court, the video from the self-driving vehicle would be interesting.
An important question is whether their system is smart enough to take evasive action.
It's becoming clear that there are two ways to approach self-driving. The first stems from the DARPA Grand Challenge, which was about off-road driving. For that, the vehicles had to profile the terrain, plotting a path around obstacles, potholes, and cliff edges. The GPS route was just general guidance on where to go. That's the approach Google took, as can be seen from their SXSW videos. Google also identifies moving objects and tries to classify them. With all that capability, it's possible to take evasive action if some other road user is a threat. The control system has situational awareness and knows where there's clear space for escape.
The other approach is to start with lane following and adaptive cruise control, and try to build them up into self-driving. This can be done entirely with vision systems. That's the Cruise Automation and Tesla approach. This puts the car on a track defined by lines on the pavement, with lane changes and intersections handled as a special case. There usually isn't a full terrain profile; that requires LIDAR. So there isn't enough information to plan an emergency maneuver for collision avoidance.
This distinction is not well understood, and it should be.
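The distinction above can be illustrated with a toy sketch: a system that maintains an obstacle map can query it for clear escape space, while a pure lane-follower has no such representation to query. (The grid, coordinates, and function below are invented for illustration; real systems use dense LIDAR-derived maps and far more sophisticated kinodynamic planners.)

```python
# Toy illustration of "knowing where there's clear space for escape":
# a 2D occupancy grid (True = obstacle) queried for a free adjacent cell.
# Everything here is invented for illustration.

def find_escape(grid, pos):
    """Return a free neighboring cell, or None if the vehicle is boxed in."""
    rows, cols = len(grid), len(grid[0])
    r, c = pos
    for dr, dc in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
        nr, nc = r + dr, c + dc
        if 0 <= nr < rows and 0 <= nc < cols and not grid[nr][nc]:
            return (nr, nc)
    return None

grid = [
    [False, False, False],
    [True,  False, True ],  # vehicle in the middle, obstacles alongside
    [False, True,  False],
]
print(find_escape(grid, (1, 1)))  # -> (0, 1): clear space ahead-left
```

A lane-follower has no equivalent of `grid` to query, which is the point of the distinction: without a terrain profile there is nothing to search for an escape path.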
In other news, statistics from 2014[1] suggest that since the autonomous car crash was reported yesterday (11:29pm EST, Mar 24), roughly 9 people have been killed by human drivers under the influence of an intoxicant of some kind.
Autonomous vehicles will have challenges that human drivers do not -- like software vulnerabilities -- but those are problems worth overcoming if it means that poor decision making by humans can be reduced.
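The rough arithmetic behind that figure, assuming about 10,000 US alcohol-impaired driving deaths per year (an approximation of the 2014 data) and roughly 8 hours elapsed since the crash was reported:

```python
# Back-of-envelope check on the "roughly 9 people killed" figure above.
# The annual total and elapsed hours are assumptions, not exact CDC numbers.
deaths_per_year = 10_000
deaths_per_hour = deaths_per_year / (365 * 24)   # about 1.14 per hour
hours_elapsed = 8                                # overnight since 11:29pm
print(f"{deaths_per_hour * hours_elapsed:.1f}")  # prints 9.1
```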
>Autonomous vehicles will have challenges that human drivers do not
Wait a minute: I don't think the argument is whether machines can drive better than humans. It's whether these companies that are desperate to make money (Uber, Tesla) are pushing potentially dangerous technology onto society before it's ready.
The much stronger defense in this case is to just cut to the chase and point out that, at least according to the police, the self-driving car was not the one at fault here - it was the driver of the other car blowing a yield sign.
Wait, drunk driving is a willfully criminal act; how is that comparable or relevant here?
We already have things like taxis, Uber, and Lyft, and people still drive under the influence of alcohol and kill people.
Why would ride services with autonomous vehicles change that?
If someone is going to get drunk and drive a car rather than calling an Uber with a human driver, they are no more likely to call an Uber just because Uber has driverless vehicles.
The target market for self-driving cars is ride services, not individuals.
The real issue here - yet to be resolved at this time - is whether Uber is going about its development of autonomous cars responsibly, as Waymo, for example, appears to be doing. Generalities offer no answer to that question.
People say Uber needs self-driving cars in order to survive. But how realistic is it that we have self-driving cars without a backup human operator anytime soon? If someone needs to be behind the wheel anyway, I don't see how it's cutting any costs.
Who knows. But it seems damn sure that the first successful use case isn't going to be a self-driving taxi, if such taxis ever do exist. It's arguably the most difficult of the common use cases. Uber's strategy seems literally insane.
>But how realistic is it that we have self-driving cars without a backup human operator anytime soon?
The whole, "in case of emergency, grab the wheel" is just a legal formality. If you are not actively driving you won't be in a state of mind to be able to do that.
I think eventually there will be enough data to show that a human behind the wheel does absolutely nothing to prevent a crash.
I think safety- and brand-conscious Volvo is going to deeply regret this partnership with Uber. (Where Uber provides the self-driving logic and Volvo provides the vehicle itself.)
I can only imagine the internal struggles in Volvo right now. At the very least, if I were running Volvo Cars I'd ask Uber to replace the Volvo logos on those cars with Uber logos...
(Volvo has their own, quite advanced self-driving program, but they seem to be doing it the correct/cautious way.)
I can't find any details of the accident. Every headline is the same: Uber self-driving vehicle involved in Arizona crash. You read the article, and there are no details whatsoever. So why bother reporting it? The news isn't that the accident happened, but why it happened.
That's "news" for the majority of publishers in 2017: a clickbait headline designed to generate traffic to sell ads. Actual journalism and content become secondary.
Let's put ourselves in the shoes of someone who reads this headline and gets a quick summary. The headline makes it sound like Uber isn't confident in its technology, thinks its technology might be the cause, and is therefore suspending the program. In this case it seems pretty clear, as confirmed by the police officer, that Uber's car wasn't at fault.
More importantly, those tweets claim that the other car (driven by a human) failed to yield at the intersection and hit Uber's car. If this proves true, the only question is whether a human driver in the Uber car's position would have noticed and braked to avoid entering the intersection.
There's mention of a third car ... we'll see what the traffic cameras say!
So the thing I don't get is: in this autonomous-car future, is everyone time-shifting to share these cars? Does this work in a factory town where everyone has to be in at shift change?
Or if I lease my car out when I'm not using it, what happens when I can't get my car back at the moment I really need it, or want to go home sick?
I wonder if that is actually going to happen. Ever since school I've been aware that autonomous systems that "take over responsibility" pose an unsolved ethical problem. Delegating that responsibility to the manufacturer, or even to the third party who delivered the code, doesn't make it better. Adding many layers of responsibility usually sucks. It's also part of why I don't like big corporations: not only do you have less control over your work, you don't even know who pushes the buttons. As it turns out, in many cases it's nobody concrete; some kind of weird group dynamic is often in control. If autonomous cars become common, analysing this kind of complexity will be an awesome kind of busywork.
Slinging psychiatric labels to score points in an internet argument breaks the HN guidelines against name-calling and reliably leads to (and is itself) low-quality discussion, so please don't do that here.
Edit: since we asked you not to post like this before and you not only ignored us but have been doing almost nothing else, I've banned this account.
Why is this downvoted? A sociopath is someone who has a complete and utter disregard for everyone but themselves. Does that not describe any and all of Uber's actions? Or is this getting downvoted because you're one, or want to be his bro?
How's Lyft doing?
From http://www.abc15.com/news/region-southeast-valley/tempe/temp...
Perhaps this should be considered further evidence that they didn't write their own system ...
If Uber is statistically safer without human drivers, then pushing it out 10 months early with bugs could save lives.
Humans aren't predisposed to trust science over their gut though.
The current title was "clickbait-y" enough for me to click, whereas the actual title of the article could have saved the click.
/u/dang?
----
[1] https://www.cdc.gov/mmwr/preview/mmwrhtml/mm6430a2.htm
Seriously, I get the point you're trying to make, but at least try to find some data that actually makes sense for a comparison.
It makes you appreciate actual journalism.
Human error accounts for something like 95% of all crashes.
https://crashstats.nhtsa.dot.gov/Api/Public/ViewPublication/...
http://cyberlaw.stanford.edu/blog/2013/12/human-error-cause-...
On the other hand, car hacking is a credible threat. We already have a Michael Hastings incident.
If you're not risk averse, you'd be dead meat in 10 seconds in Bangalore, quite literally if you're riding a motorcycle.
(Watch out for the yellow number plates and the Tata Indicas if you're ever here!)
We detached this subthread from https://news.ycombinator.com/item?id=13955355 and marked it off-topic.
Thanks, Uber.