item 28266855

Waymo has lost its CEO and is still getting stymied by traffic cones

142 points | tim_sw | 4 years ago | bloomberg.com

216 comments

[+] dang|4 years ago|reply
All: please don't react just to the title, and especially not to the most sensational bit of a title. That's a reflexive response, not a reflective one; we want the latter here: https://hn.algolia.com/?dateRange=all&page=0&prefix=true&sor....

Also, often we'll change such titles (as I just did above), and then the shallow-title-objections become a sort of uncollected garbage, referencing something that no longer exists.

I'm going to make a stub reply and bundle those comments underneath it so as to collapse them. There are more interesting things to discuss here.

[+] yalogin|4 years ago|reply
I used to believe in the self-driving hype. Then I learned ML and started working in related areas. Now I know full self-driving is not happening, not with the technology/algorithms we have. So whatever anyone does is just an approximation, and we will never be able to trust the car to handle any situation thrown at it. So it's always going to be a souped-up cruise control, the likes of which Tesla and others are selling. We need some kind of breakthrough to get fully autonomous self-driving where we can sleep in the car while it takes us to the destination.

I would, of course, love to hear if folks here think I am wrong.

[+] tinco|4 years ago|reply
So, genuinely curious: how can it happen that a system that's been developed over 12 years cannot deal with a common road obstruction? If something truly rare/dangerous happens, I can imagine it not being able to cope. But this sort of situation is not only easy to deal with, it's actually easy to predict.

Maybe not something you could predict in a product with just a couple of years and a couple of employees, but we're talking over a decade, and thousands of employees. Oh, and in Phoenix, Arizona, one of those ultra-uniform grid cities the US is famous for.

What's stopping them from having their AI operate in simulations in literally every possible traffic situation? Every location a pot hole can be in, every type of traffic cone arrangement, every kind of jaywalking, every kind of vehicle approaching any kind of street from any kind of location, possible speed, reaction time, everything? How could a government approve any autonomous agent that is not tested in such a way?

My world view gets rocked a little every time I see a headline like this. I want to believe these companies are run by the smartest engineers on the planet, but then I read about a Tesla running into an emergency services vehicle, or a Waymo being confused by traffic cones, and I can't help but wonder what the heck they are doing.

[+] qeternity|4 years ago|reply
> What's stopping them from having their AI operate in simulations in literally every possible traffic situation? Every location a pot hole can be in, every type of traffic cone arrangement, every kind of jaywalking, every kind of vehicle approaching any kind of street from any kind of location, possible speed, reaction time, everything? How could a government approve any autonomous agent that is not tested in such a way?

Because this approach has huge limitations, principally that you’re now optimizing against a cost function that isn’t the real world. RL has a place for sure, but it has limitations.

It’s like saying that you’d expect someone who’s played 10k hours of Call of Duty to be equivalent to a Navy SEAL. They’re just really good at playing the simulation, not the real thing.
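To put rough numbers on why "every possible traffic situation" is intractable, here is a back-of-the-envelope sketch; every parameter count below is invented purely for illustration:

```python
# Illustrative only: made-up, coarse discretizations of a handful of
# scenario variables. Their product explodes into the tens of billions
# before weather, lighting, or sensor noise are even considered.
from math import prod

scenario_axes = {
    "pothole positions": 50,
    "cone arrangements": 200,
    "jaywalker behaviors": 30,
    "vehicle types": 40,
    "approach angles": 24,
    "speeds": 20,
    "reaction times": 10,
}

total = prod(scenario_axes.values())
print(f"{total:,} combinations")  # 57,600,000,000
```

Which is presumably why testing regimes sample scenarios rather than enumerate them, and why optimizing hard against any finite sample risks overfitting to the simulator.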

[+] dham|4 years ago|reply
It's because they're trying to code the problem instead of solving it end to end. As you can imagine, if you're trying to hand-code driving scenarios you're never going to finish. Only Tesla and Comma.ai are on the right track. Think AlphaZero vs. Stockfish.

This problem has to be solved end to end. It can be done with vision alone, since humans can drive with vision alone. It's literally just a software, machine learning, and data problem. It just takes time.

[+] colesantiago|4 years ago|reply
If you are a 'self driving' car company claiming to be 99% done with self-driving and you are still having this issue:

> ...still getting stymied by traffic cones

Maybe you're nowhere near that 99%; perhaps around 10% done, and considering this doesn't work worldwide either, it seems more like 5% to me.

[+] digdigdag|4 years ago|reply
Throwing random percentages around is silly. In reality, these vehicles have only been tested within a very tight scope, in near-controlled conditions in Arizona, California, etc., with very specific conditions for their operation.

I would consider it 99% complete when I can pull up a map, point to any stretch of drivable road in the United States and ask it to autonomously operate there.

Specifically, there's an especially tricky road on I-35 N in Texas where the lane markings come and go, yellow shoulder markings _merge_ into barriers, and the road condition is so bad that the steering rack will _turn_ by itself(!). We're not even close to calling this thing autonomous. If and when it gets to that point, I would consider it "99%" complete.

In reality, though, I only see this tech being viable under continued controlled conditions. Maybe there will come a point when certain lanes become dedicated "autonomous lanes", completely isolated from the rest of traffic, with special markings and sensor suites to help driverless cars operate efficiently.

As long as the chaotic human element is ever present, no matter how well they model these systems, they will fail to cope with the massive amount of discontinuous information humans inject into a situation (swerving into a lane, road rage, cargo suddenly coming loose from a truck, etc.).

Oh, and have we tested these things during winter? :p

[+] throwaway09223|4 years ago|reply
I have elderly relatives who won't drive at night, or on highways. I have friends who won't drive in various degrees of inclement weather - and I know a LOT of Californians who can't drive in the snow.

Human skill is extremely variable. A car that can drive on any road under any condition would be vastly superior to the average driver.

Reducing such a complex system to a single percentage may be silly, but the idea that a car might have partial coverage under certain conditions is quite reasonable and is reflective of how real human drivers work as well.

[+] lisper|4 years ago|reply
What could "99% complete" even possibly mean? That it goes 99 miles/days/hours/minutes before hitting a tree?
[+] bhawks|4 years ago|reply
> Specifically, there's an especially tricky road in I-35N in Texas

Why are we ok with streets in such slapdash condition in the first place?

Yes, I think self-driving cars need to handle it, but I also think we aren't handling it.

Self-driving cars have the potential to greatly reduce deaths from accidents and improve accessibility for often-ignored segments of our population. It is sad to see the technology so close, yet held back by reasons like this.

[+] kenhwang|4 years ago|reply
Google Maps navigation can't even reliably call out which lane to use for turning in major metro areas, and that's kind of the basic best-case scenario: clear images/footage, unlimited compute time, static road markers, a common use case.

I'm not sure how they can do better with far tighter resources and far stricter requirements.

[+] skybrian|4 years ago|reply
Yes, it’s kind of a silly headline. A percentage compares two numbers, a numerator and a denominator. If you don’t know what’s being counted or measured, it’s not a real statistic.

However, people do use percentages metaphorically, and we should try not to be so distracted by silly headlines.

[+] jandrewrogers|4 years ago|reply
My current work involves ground-truthing sensor and classifier hits from various automotive OEMs. Take, for example, reading old-fashioned speed limit signs. In principle, doing this correctly and reliably in context is a relatively simple machine learning problem, particularly when restricted to a single country.

I have yet to work with an automotive OEM that doesn't consistently mis-classify some speed limit signs in some common cases that are completely unambiguous. It leads one to wonder about the quality of classifier design and testing such that these end up in production vehicles. The cases that cause a classifier to fail are different across automotive OEMs too, it isn't intrinsic; other automotive OEMs will handle that specific case just fine.

On the other hand, when it works it often works surprisingly well. The good/bad news is that most of the problems I see are ultimately caused by poor model design and testing. These are all surmountable with better quality AI engineering.

There is more AI to self-driving cars than the classifiers that pull features out of their environmental sensors. However, the driving AIs are making decisions based on the output of those classifiers -- garbage in, garbage out.

These are the kinds of situations that will produce relentless incremental improvements over time, and there is still a lot of low-hanging fruit to improve on.
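As a toy sketch of what this kind of ground-truthing looks like in practice — every tag, reading, and number below is invented for illustration, not any OEM's real data or API:

```python
# Hypothetical ground-truthing harness: compare classifier readings of
# speed-limit signs against hand-labeled ground truth, grouped by
# scenario tag, so systematic failure cases stand out instead of being
# averaged away in one aggregate accuracy figure.
from collections import defaultdict

# (scenario tag, hand-labeled true limit, classifier output)
readings = [
    ("clear_daylight", 55, 55),
    ("clear_daylight", 65, 65),
    ("sign_partially_shaded", 55, 85),  # unambiguous to a human, missed anyway
    ("sign_partially_shaded", 55, 55),
    ("trailer_mounted_sign", 25, 25),
]

errors = defaultdict(lambda: [0, 0])  # tag -> [misclassified, total]
for tag, truth, predicted in readings:
    errors[tag][1] += 1
    if predicted != truth:
        errors[tag][0] += 1

for tag, (missed, total) in errors.items():
    print(f"{tag}: {missed}/{total} misclassified")
```

Per-case bucketing like this is what reveals that a given failure isn't intrinsic to the problem: a case one classifier misses consistently is often handled fine by another OEM's stack.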

[+] sgtnoodle|4 years ago|reply
That stretch of road you describe as difficult is actually not difficult at all for Waymo's current technology. Their navigation does not depend on vision.
[+] bastardoperator|4 years ago|reply
You should come drive some of the mountain roads in CA. I wouldn't consider roads in TX any trickier than roads elsewhere.
[+] cmroanirgo|4 years ago|reply
This has always had me wondering about the viability of self driving: what's it going to be like in different countries?

Not only are there different signs, different line markings & colours, & different road widths. Cars are different: not all models are sold equally throughout the world. People are different in their clothing, height, and build. But the really big difference is that not everyone drives on the RHS...

The article lists left turns as a problem, whereas on a LHS road system, right hand turns will be the problem.

In NZ (a LHS system), I've heard that turning right across traffic has priority over oncoming traffic.

In Melbourne, VIC, Australia (also LHS), right turns are made from the left-most lane because of tram lines.

I'm sure there are loads of other issues specific to other geographies.

To me, this never ending self driving experiment screams out that the wrong solution is being pursued. (That doesn't imply that I know what they should be using as a solution)

[+] kurthr|4 years ago|reply
I think responding to the human element is huge for safe driving. You would not drive the same way in LA as in Boston or Seattle. Other drivers make very different assumptions about your response to their (and others') actions, and that can create danger irrespective of navigation.
[+] dathinab|4 years ago|reply
I think, for example, that many self-driving car systems are good enough to drive on German highways (and Kraftfahrstraßen ~ slower, smaller highways).

But only if there is no external accident and no construction site.

If both were handled, truckers could sleep/nap while driving on the highways, which could have interesting effects, tbh.

Handling construction sites could be made feasible.

But I have no idea how to handle, in a good way, accidents happening around/in front of the truck.

Similarly, I would be worried about the improvised marking of fresh accident sites before emergency responders arrive...

[+] bryanlarsen|4 years ago|reply
That awful stretch of I-35 can be special-cased. It's a great example of something a computer can be better at than humans, who each have to figure it out individually.

An unprotected left turn is probably a harder problem. They'll be expected to do it aggressively, like humans, and also expected to do it safely. Those are incompatible directives.

[+] samstave|4 years ago|reply
I almost feel that relatively short-range (meaning they can make the hops between charging stations) self-flying passenger drones/planes would be better than cars. Let air-traffic control systems, with decades of anti-collision experience, interact with and track all flights and inform all flying systems of the telemetry data for all flights, so they have spatial awareness that way. Set flight paths that are low but avoid densely populated areas/neighborhoods/infrastructure, etc.

Create small emergency landing areas all over...

The only ground-based vehicles I care about being 99% autonomous would be long-haul trucking, with a station to pick up a local last-mile human driver for the actual drop-off, etc.

[+] mdasen|4 years ago|reply
> In reality, skilled disassembly is required. Engineers must take apart the cars and put them back together by hand. One misplaced wire can leave engineers puzzling for days over where the problem is, according to a person familiar with the operations who describes the system as cumbersome and prone to quality problems.

I think this is one of the more interesting pieces in the article for me. We all kind of know by now that there are issues with AI handling all driving situations, so it's interesting to hear that Waymo is having difficulty with the manufacturing side as well.

It does make a certain amount of sense. Waymo is a company that is mainly trying to solve the AI problem around self-driving cars, but other things can stymie those efforts like assembly. This isn't meant as a dig at Waymo, just more an observation that just because a company is excellent in one area doesn't mean that they will be excellent in all areas and that even excellent companies like Alphabet/Google/Waymo can stumble in new areas.

[+] MathMonkeyMan|4 years ago|reply
I don't know anything about AI, but I've heard that asking "how do humans do it?" is not the right question.

Still, though, how _do_ humans drive? One answer that comes to mind is "humans are smart." True. But I suspect a severely "dumb" person can drive safely in a variety of novel scenarios.

It makes me think of child development, all that time making sense of depth perception, distinguishing objects, fine motor controls, expectations of motion, etc. It takes years. Could a non-human animal trapped in a "car suit" drive effectively, if it could somehow overcome the panic? Maybe it could maneuver safely, but what about a four-way stop?

I have no idea what it takes to drive a car in general, and specifically I don't know how humans do it. Where do you even start when trying to model the task in a computer?

[+] yawaworht1978|4 years ago|reply
Yes, full self-driving is not happening any time soon unless the roads and cars are purpose-built for it. Everyone here has worked at some sort of tech company where some procedures are just wrong, some training material doesn't work, or the content is full of rot and unmaintainable; sometimes the code of the product (or the back office) has a couple of bugs, never to be fixed. Or better: think how often human mistakes and generally imperfect things are swept under the carpet. Code testing? Does it happen on all new features, and to what degree? Think of all these things and then reflect on the possibility of an FSD implementation within, say, 10 years. Not going to happen.
[+] superkuh|4 years ago|reply
It's pretty much the other way around (1% done, 99% to go), except for contrived locations with perfect climates. It's like wifi devices claiming they can work over miles, when that's only true if you have line of sight.

A long-term snow covering of lane markings, typical of the winter months across much of North America, and the emergent driving lanes people flock to, cannot be handled by any existing driving AI. The places people drive are wrong according to absolute road positions, and the relative markers are obscured to both human and machine. Unless car AIs can do the wrong thing, like all the humans will in those situations, it won't work. And that's a hard problem.

[+] Arnt|4 years ago|reply
Aren't you then saying that people from California or Arizona who come to Norway know 1% of what they need in order to drive?

I don't know anyone from Arizona, but the Californians I've known adapted quickly.

[+] NonContro|4 years ago|reply
Right now, our roads are only designed to be human-readable.

But what if machines become the dominant drivers? We would need to make the roads machine-readable: road signs redesigned 'QR code'-style, maybe even some kind of wireless broadcast system to communicate traffic-light changes.

[+] acdha|4 years ago|reply
That’s really expensive on top of the inherent inefficiency of cars: you’re paying a ton of money for something which uses a lot of energy and produces a lot of pollution (yes, even BEVs) to carry slightly over one person on average. Rebuilding the road system won’t change that or make climate change go away, especially since you’d need a lengthy transition period.

What might make sense is limited deployment in areas where the problem can be constrained: dedicated bus routes, truck convoy lanes on an interstate, etc.

For the rest of it we should be focusing on how to get people out of cars since even BEVs pollute far more than buses, rail, bicycles, or walking.

[+] arijun|4 years ago|reply
I work in the area. I would say the static stuff that’s hard is stuff that’s also hard for human drivers, like poor lane markings or ambiguous signage. So I’m not sure that QR codes will help -- just fix the signage.
[+] DasIch|4 years ago|reply
Machine readable roads already exist. They are called train tracks and are already used by autonomous trains. They also allow for far more energy and space efficient transport with better throughput than cars.
[+] ysavir|4 years ago|reply
What if we put some metal railings on the ground and had the cars follow them?
[+] rcxdude|4 years ago|reply
Reading signs is the really, really easy part. If your vision system can't read and recognise the signs and signals reliably, it's not even beginning to solve the other problems. Trying to adapt the roads to make this part easier is a terrible return on investment. 99% of what you could do to make roads easier to use for self-driving cars is basic maintenance: making sure the signs, markings, and signals remain clearly visible, which also helps improve the performance of human drivers.
[+] ulnarkressty|4 years ago|reply
I remember watching a documentary on Discovery in the early 90s that showed a motorcade of 10 Audis closely following each other using radar and magnetic markers embedded in the road surface. They could maneuver within inches of one another, stop on a dime, etc. I wonder how much embedding such passive markers into new roads would have cost compared to the amount of money pumped into autonomous driving companies so far.
[+] treis|4 years ago|reply
From anywhere to anywhere is also an unnecessarily lofty goal. Bringing me to the front of a store in a strip mall is nice, but dropping me off at the street is good enough. Even covering enough streets in a city to pick up/drop off within a 1/4 mile would be revolutionary.
[+] aetherson|4 years ago|reply
I don't think that reading the signs and the traffic lights is the significant challenge here.
[+] KaoruAoiShiho|4 years ago|reply
We can... but it's not needed at the rate things are going. AI will be able to understand almost everything humans can, and it's way cheaper than rebuilding things for machines.
[+] dathinab|4 years ago|reply
If the last 1% is as hard as the previous 99%, then they are not at 99% but at 50% ;=)

It's just that a part of the way much larger than 1% is perceived to be just 1%. I.e., there is an insane amount of hidden complexity that is generally/often overlooked.
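The arithmetic behind the joke, as a toy effort-weighted model:

```python
# Toy model: measure progress by effort rather than by feature count.
# Premise of the joke: the "last 1%" of the features costs as much
# effort as the first 99% did.
effort_first_99_percent = 1.0  # effort already spent
effort_last_1_percent = 1.0    # effort still remaining (equal, by premise)

total_effort = effort_first_99_percent + effort_last_1_percent
progress_by_effort = effort_first_99_percent / total_effort
print(progress_by_effort)  # 0.5 -- "at 50%", not 99%
```

The same model says that if the remaining tail is even harder than assumed, the effort-weighted progress drops below 50%, which is why "99% done" headlines say so little on their own.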

[+] trhway|4 years ago|reply
I think one of the main principal errors on Waymo's (and other large players') part is not working with the military. Given that for the military the task is simpler while the safety requirements are significantly laxer, I'd have expected to see self-driving tanks/transports/etc. well before cars.
[+] qeternity|4 years ago|reply
We have this. They’re called drones.

One of the reasons tanks exist is to protect the operators. If you’re not going to have boots on the ground in the first place (autonomous tank), you can rethink the vehicle completely.

[+] fridif|4 years ago|reply
Scrap the whole thing and start over.

Deliveries should be done by autonomous air drops.

Transportation by autonomous aircraft.

[+] majkinetor|4 years ago|reply
Yeah, use the air and all the nasty problems go away. I am shocked that the tech isn't mainstream already, given its multitude of benefits compared to terrestrial transport.
[+] sgtnoodle|4 years ago|reply
I'm working on the air drops at least.
[+] hartator|4 years ago|reply
Well, if I code something that just moves the car forward when nothing is in front of it, I am probably 80% there. Not sure I trust their 99%. It seems like an excuse for academics not to release an actual technology that works.
[+] bko|4 years ago|reply
Waymo has raised over $5.5 billion and has thousands of employees without so much as a product.

If this were any other startup, it would be considered insane and would never have gotten to this point. I don't know why people keep dumping money into this project thinking they're just around the corner.

Even Tesla had sold some cars by the time it had raised a fraction of that amount, and $70 million of it came from Musk:

> By January 2009, Tesla had raised US$187 million and delivered 147 cars. [0]

[0] https://en.wikipedia.org/wiki/History_of_Tesla,_Inc.

[+] ilaksh|4 years ago|reply
They have had a working product for many months.

And although it does occasionally have issues, it is remarkably reliable.

[+] tialaramex|4 years ago|reply
So, wait, you believe Waymo One is free? That seems like a pretty significant subsidy for everybody who lives in the region served by it.
[+] ctvo|4 years ago|reply
Because when they succeed it will return more than the investment? By many times?

The product is being tested. Many things, including life changing drugs, are developed this way.

We both agree Waymo definitely doesn’t sell its half baked product calling it Full Self Driving before it’s ready.