I've seen this story making the rounds, but this isn't news, is it?
All self-driving companies maintain teams that make a decision when the cars get confused or stuck, and they report the number of such handoffs to NHTSA.
Is it just that there are teams in the Philippines specifically?
Yes, it's just because of the Philippines. The mention of the Philippines is triggering the additional scrutiny. When Waymo themselves announced this back in 2024[1], they made no mention of the country where these humans are located. Now people are raising questions about data sovereignty and local qualifications, such as holding a U.S. driving license. Or, if you're a cynic, you can say it's xenophobia.
Partially, but also as an easy attack on "bad big tech" and AI - it wins votes right as primary season starts gearing up [1] during what is being treated by both the DNC and the GOP as a highly competitive election [0].
Ed Markey is going to face a severely harsh primary this election cycle (as are other incumbents in both parties this season).
Yes, I think this is counting on ignorance: the expectation that people will believe there are "drone operators" at the console, halfway across the world, driving our cars [A.I. stands for "Actually Indians"?]
The way I understood the liability conversation, several years ago, was that each "autonomous vehicle" would have a corresponding operator of record, a licensed driver, who would be the responsible person for the vehicle's behavior. That there would be a designated person to carry insurance and licensing and be personally responsible and personally answer to criminal or civil charges if "their" vehicle got in a fix.
Honestly, this model doesn't make sense for how Waymo has set things up: the only driver is the Waymo Driver making the decisions, because the Waymo Driver is the only party privy to 100% of the real-time data.
The remote CSRs, whether they're in the Philippines or stateside engineers on an escalation, are explicitly not driving the car but giving it suggestions. If they need someone to "drive the car", they literally dispatch a human who gets behind the wheel.
It's the question of whether these teams are composed of people who could pass a driving test in Waymo's areas of operation. I'd doubt that they couldn't, but there appears to be no way for external verification of any kind.
I think it's a much bigger deal than you're making out, because we don't have figures on how often the cars need assistance.
We assume it’s just occasionally but we don’t actually know that. They could be requesting assistance constantly and Waymo would have an incentive to keep that hush-hush. Certainly would not be the first time a big SV company has faked it until they technically worked.
At worst, it's the outsourcing of cab drivers to remote roles in cheaper countries. No problem for the investors who are banking on another market disruption, since they're leaving the local society to hold the bag.
Just been reading about the crash they’re talking about in the article - it seems like a kid walked out from between two parked cars.
Rather than being a bad thing, this is probably Waymo saving his life.
It says the car reduced speed from 17 mph to 6 mph before contact. This is the kind of reaction-speed safety advantage an AI car should have over a human driver, but instead it's just 'waymo hit a kid'.
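Worth noting how much that slowdown matters physically: impact severity scales roughly with kinetic energy, i.e. with the square of speed. A quick back-of-envelope calculation from the reported figures:

```python
# Impact energy scales with the square of speed, so braking from
# 17 mph to 6 mph before contact removes most of the energy involved.

v_before_mph = 17.0  # reported speed when the child appeared
v_after_mph = 6.0    # reported speed at contact

energy_ratio = (v_after_mph / v_before_mph) ** 2
print(f"Impact energy at {v_after_mph:.0f} mph is about "
      f"{energy_ratio:.0%} of the energy at {v_before_mph:.0f} mph")
# -> about 12%
```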
Alternatively, maybe the car should have been driving more slowly. There is not enough information to know whether a human driver would have had a better outcome.
The way this is being reported gives the impression that there are workers in the Philippines remotely driving these cars (if they are, maybe Google found a good use case for all that Stadia tech).
What it actually is: if the car gets stuck, someone can manually override it, which I imagine is normal? If the car gets stuck you can call someone and they can do "something" that can probably nudge the car into action. I doubt the latency is good enough for someone to remotely drive the car.
What I found interesting about this is that on several occasions I've seen Waymos get confused and block intersections (and once Muni tracks). Each time they've sat there for at least 10-15 mins until a police officer showed up and tapped on the window. Then it was another 10-15 mins before the vehicle started to move again. What are these agents doing?
From what I understand, Waymo has never hidden the fact that they have remote operators, and also they have clarified that the remote operators are not actually "driving" the car (in the sense that they are not using a remote steering wheel and pedals).
I find this fact to be an interesting litmus test: for example, jwz (who hates self-driving cars, AI, and big tech) interprets the news to mean the opposite of what I said (it's a bunch of remote workers individually turning steering wheels, etc.), while folks who are happier with the product, or who understand tech and latency, know that remote driving from 5000+ miles away is not technically feasible.
Waymo seems to be unnecessarily secretive about this. Why not let reporters visit the control centers? Zoox had the New York Times visit one a few years ago. It came out that there are about 1.5 support people per car. Nobody has a steering wheel. They hint to the cars by dropping "breadcrumbs" on screen.
The ratio of workers to cars matters more, imo, than whether the workers drive the cars. The fundamental sell of self-driving is that it saves labor. If it effectively doesn't, self-driving is essentially going to be a luxury rather than a replacement for the existing models.
I'm guessing it is for situations like should the Waymo stay in a particular lane or switch lanes, try to overtake another car, etc. That's probably the type of "guidance", which seems a lot like optimization.
It's hard to say what this means. They didn't give a single example of the kinds of situations the remote workers help in. I can think of an array of possibilities, ranging from "should I continue with this route or turn back", which would be a yes/no dialog box prompted by the car, all the way to real-time pedal inputs or emergency stops by people constantly watching the displays.
~~Generally when a company is vague about these things, you should assume there is some very intensive aspect to it undercutting their claims of autonomy, or some aspect where people think it's dangerous.~~
I mean, Waymo gives a lot of examples of the situations, in their blog post about Fleet Response where they detail this, released May 21, 2024. They're very explicit that the Waymo Driver autonomous system is in control the entire time.
The Amazon Go situation was also wildly misrepresented in media, to be clear. It's fairly obvious that they did actually have some vaguely accurate video-processing tech; it's just that the reliability never hit a level where the cost of fixing up errors was low enough to actually save money vs the alternative.
(The same consideration also applies to Waymo: even if they are not controlling the car like a RC car, does the cost of running their interventions turn the unit economics of their business upside-down? And if not, would this still be true if they were paying US wages for it?)
How does one turn "Tesla has computer-connected steering wheels at their office" into "Tesla robotaxis are controlled by remote workers with steering wheels on their desks"?
This reminds me of people saying that ChatGPT was actually just quick typists from India, back in 2022.
The resurgence of this seems to be another addition to the culture war going on right now around AI vs. human labour. I suspect this sort of thing will continue to make hay in the press over the coming year.
So... isn't it illegal to do that? If someone in the Philippines does not have a CA/AZ/whatever driver's license, then Waymo is breaking the law. It's probably worse than that.
It also proves that Waymo's capabilities are overstated. I keep getting pushback when I complain about specific situations in this forum about how Waymo thinks about complex situations - and this entire time, it may have been humans navigating them.
Foreigners can drive in California without a California license. But it's important to note that in this situation they're not driving at all (the latency would make that unfeasible). They're there to disambiguate complicated situations and point the car in the right direction.
Try to not let clickbait headlines shape your view of a situation.
I think this is a key question. In the May 2024 blog post about "fleet response" it sounds like Waymo has a lawyerly set of rules they follow to distinguish between remote operation and providing guidance to the self-driving system.
> Much like phone-a-friend, when the Waymo vehicle encounters a particular situation on the road, the autonomous driver can reach out to a human fleet response agent for additional information to contextualize its environment. The Waymo Driver does not rely solely on the inputs it receives from the fleet response agent and it is in control of the vehicle at all times.
>
> [...]
>
> Fleet response can influence the Waymo Driver's path, whether indirectly through indicating lane closures, explicitly requesting the AV use a particular lane, or, in the most complex scenarios, explicitly proposing a path for the vehicle to consider. The Waymo Driver evaluates the input from fleet response and independently remains in control of driving.
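The "proposing a path for the vehicle to consider" part can be pictured as the proposal entering the planner as just one more candidate, still subject to the car's own safety checks. A minimal sketch under that assumption (all names and data are hypothetical, not Waymo's actual interfaces):

```python
# Hypothetical sketch: a remote path proposal is just another candidate
# that the onboard planner vets against its own real-time perception.

def path_is_safe(path, obstacle_map):
    """Stub safety check against the car's own obstacle map."""
    return all(waypoint not in obstacle_map for waypoint in path)

def choose_path(own_candidates, remote_proposal, obstacle_map):
    candidates = list(own_candidates)
    if remote_proposal is not None and path_is_safe(remote_proposal, obstacle_map):
        candidates.append(remote_proposal)  # proposal admitted only if it passes
    safe = [p for p in candidates if path_is_safe(p, obstacle_map)]
    return min(safe, key=len) if safe else None  # car ranks everything itself

own = [[(0, 0), (0, 1), (1, 1), (2, 1)]]      # the car's own detour
proposal = [(0, 0), (1, 0), (2, 1)]           # agent's shorter "breadcrumb" path
blocked = {(1, 0)}                            # but the car sees an obstacle there
print(choose_path(own, proposal, blocked))    # rejects the hint, drives its own path
```

The point of the sketch is that the proposal never bypasses the local safety check: an unsafe hint is simply discarded.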
If a person from the Philippines comes to the USA, they are allowed to drive on our roads as long as they have a valid license in the Philippines (no international permit required).
I would assume that would apply here too.
But also, they aren't actually driving the car. They are giving hints to the autonomous driver.
I watched a recording of the hearing. It sounds a lot like the Amazon Fresh thing, at a glance, but it's not.
Amazon admitted that they had a bunch of people in India looking at the camera feeds and validating orders post-facto. The media took this as "the Indian workers are processing your Amazon Fresh purchase, not the computers", which is disingenuous at best. And yeah, it sounds like Waymos are usually, nearly always, fully autonomous.
The huge, gargantuan, enormous difference is that, in Waymo's case, the overseas folks are taking control of a fucking car. That's not post-facto like the Amazon thing. And, more importantly, the ramifications of even the tiniest mistake are massive by comparison.
Indian Amazon guy screws up? Shoot, I paid for two heads of lettuce when I only got one. Filipino Waymo guy screws up? Car accident.
By the way: Imagine driving a real, actual car with trans-oceanic ping.
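For a sense of scale, some rough arithmetic on that ping (the RTT figure is an assumed ballpark for Manila to the US West Coast, not a measurement):

```python
# Why real-time trans-oceanic driving is implausible: at city speeds,
# the car covers several meters during one command round trip.

rtt_s = 0.25                                 # assumed round-trip time, seconds
speed_mph = 30
speed_m_per_s = speed_mph * 1609.344 / 3600  # ~13.4 m/s

blind_m = speed_m_per_s * rtt_s
print(f"At {speed_mph} mph with a {rtt_s * 1000:.0f} ms round trip, the car "
      f"covers {blind_m:.1f} m before a remote command could take effect")
# -> about 3.4 m
```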
So everyone saying "oh but they told us this" is completely missing the point; it's like those weird logic problems where everyone on the island has a dot on their head or whatever.
There's a massive difference between "widely known" and "widely known that it's widely known."
people who are looking for work. you will too one day when you're finally kicked out of the pool, and discover your assumptions about why you had work in the past were wrong.
the problem with ignorance is that those who are ignorant aren't able to appreciate the bliss until after it's gone.
I don't have any problem with Waymos having a human in the loop for assistance, but sending all of our jobs to other countries is destroying the United States.
Shouldn't these workers in the Philippines be required to be licensed to drive cars in the USA in order to operate those vehicles (even remotely)? I understand that they're not really driving those cars. But they have control over these cars and they do operate them, when required, on public roads.
International drivers are allowed to drive on US roads as long as they have a valid license in their own country. In particular, Filipino drivers are allowed to drive on US roads without any extra paperwork.
But also, even within the USA we have 51+ different licensing schemes. We already accept that if you have a license in one place, it's good in all the others.
Fun fact, if they are using foreign workers at all, however briefly, they are likely in violation of state law in multiple states.
HOWEVER:
It is entirely possible that some back room deals were made, and possibly laws put on the books in the states they've rolled out in.
I suspect more will come from this eventually, especially if Waymo is involved in accidents that involve insurance claims, injuries, or deaths in one of those states.
IIRC from when Waymo discussed this previously, the remote people don't drive the car, they issue instructions to the autonomous driver. If that's the case they shouldn't need a driving licence.
Wouldn't it be risky to do that? This is a multi-billion dollar gamble being executed in front of the public, egregiously breaking the law or making back-room deals both risk extreme negative public reaction if exposed.
We know that eventually a self-driving car will hit somebody and kill them. Waymo and other companies are prepared for that.
Lazy folks are framing this as "see, it's still humans!", like this awful article by TechSpot headlined "Waymo admits that its autopilot is often just guys from the Philippines": https://www.techspot.com/news/111233-waymo-admits-autopilot-...
1) "Often" is a gross mischaracterization. It's so infrequent you wouldn't believe. Nearly all rides are performed fully autonomously without human intervention. But "often" sure sounds spicy!
2) "its autopilot is just guys from the Philippines": no, it's not. A human is in the loop to hint to the Waymo Driver AI platform what action to take if its confidence level is too low, or if it's facing a particularly odd edge case where it needs to be nudged onto an alternate route. This framing makes it sound like some dude in Manila is remote-controlling the car. They're not. They're issuing hints to, and confirming choices made by, the Waymo Driver, which remains in full control of the vehicle at all times.
Because lay people, even technically unsophisticated ones, naturally start wondering: "well, isn't there some delay between a person in the Philippines and the car in the US? How could that be safe? What if the internet dips out or the connection drops?" Which are good and valid points! And why this framing is so obnoxious and lazy. The car is always driving itself.
They finally issued a correction in the linked article that makes it clear they're not remote controlling the cars, but the headline is still really slanted and a frustrating framing. When you ride in these things, you can see just how incredible this technology is and how far we've come.
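The loop being described amounts to confidence-gated advice rather than remote control. A minimal sketch, assuming hypothetical names and an illustrative threshold (none of this is Waymo's actual interface):

```python
# Hypothetical sketch of confidence-gated escalation: the onboard
# planner always produces its own plan; a remote hint is requested
# only when confidence is low, and even then it is only advisory.

CONFIDENCE_THRESHOLD = 0.85  # illustrative value


def onboard_planner(scene):
    """Stub: the car always computes its own plan plus a confidence score."""
    return scene.get("default_plan", "proceed"), scene.get("confidence", 1.0)


def request_remote_hint(scene):
    """Stub: ask a fleet-response agent for advisory input (may be None)."""
    return scene.get("remote_hint")


def validate_hint(hint):
    """Stub: the onboard system vets any hint before adopting it."""
    return hint in {"use left lane", "take alternate route"}


def plan_next_action(scene):
    plan, confidence = onboard_planner(scene)  # always runs locally
    if confidence < CONFIDENCE_THRESHOLD:
        hint = request_remote_hint(scene)      # advisory only
        if hint is not None and validate_hint(hint):
            plan = hint                        # adopted, but still the car's decision
    return plan                                # the car executes its own plan either way


# A confident car never even asks:
print(plan_next_action({"confidence": 0.99, "remote_hint": "use left lane"}))  # -> proceed
# A confused car asks, vets the answer, then drives itself:
print(plan_next_action({"confidence": 0.4, "remote_hint": "use left lane"}))   # -> use left lane
```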
[1]: https://waymo.com/blog/2024/05/fleet-response
[0] - https://www.axios.com/2026/02/06/gop-senate-midterms-2026
[1] - https://m.youtube.com/watch?v=OOwRbOK93ag
Same old same old. Some of them actually know stuff. Others are examples of 20th century "Artificial Intelligence." (Got briefed by their staff.)
I always wondered why "Taxi Cab Simulator 7" looked so realistic.
https://www.msn.com/en-us/money/other/waymo-exec-reveals-com...
Text-only, HTTPS optional:
http://assets.msn.com/content/view/v2/Detail/en-in/AA1VL9B3/
https://assets.msn.com/content/view/v2/Detail/en-in/AA1VL9B3...
Simple HTML:
https://youtube.com/watch?v=T0WtBFEfAyo
https://youtube.com/watch?v=elpQPbJXpfY
Notice how the system itself reasons about the scene and asks for help with possible options.
This whole story is a nothingburger. The only "news" here is that the operators are in the Philippines.
This isn't something new.
https://waymo.com/blog/2024/05/fleet-response
This reminds me of Amazon Go "Just Walk Out" technology, which turned out to be pretty low tech: remote workers in India watching you through cameras.
Did you think we just don't allow foreigners to drive ever?
They aren't, though; they're clicking waypoints on a map.
> The Waymo Driver evaluates the input from fleet response and independently remains in control of driving.
https://waymo.com/blog/2024/05/fleet-response
I mean, they should be doing the same thing for truck drivers, train operators, and airlines.