Waymo robotaxi hits a child near an elementary school in Santa Monica

482 points | voxadam | 1 month ago | techcrunch.com

787 comments

[+] BugsJustFindMe|1 month ago|reply
From the Waymo blog...

> the pedestrian suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle's path. Our technology immediately detected the individual as soon as they began to emerge from behind the stopped vehicle. The Waymo Driver braked hard, reducing speed from approximately 17 mph to under 6 mph before contact was made.

> Following contact, the pedestrian stood up immediately, walked to the sidewalk, and we called 911. The vehicle remained stopped, moved to the side of the road, and stayed there until law enforcement cleared the vehicle to leave the scene.

> Following the event, we voluntarily contacted the National Highway Traffic Safety Administration (NHTSA) that same day.

I honestly cannot imagine a better outcome or handling of the situation.

[+] jobs_throwaway|1 month ago|reply
Yup. And to add

> Waymo said in its blog post that its “peer-reviewed model” shows a “fully attentive human driver in this same situation would have made contact with the pedestrian at approximately 14 mph.”

It's likely that a fully-attentive human driver would have done worse. With a distracted driver (a huge portion of human drivers) it could've been catastrophic.
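
The reaction-time argument can be made concrete with basic kinematics. A rough sketch (the reaction times, deceleration, and gap below are illustrative guesses, not Waymo's actual model):

```python
# Back-of-envelope check of the reaction-time claim. All numbers here
# (reaction times, deceleration, gap) are illustrative guesses, not Waymo's model.
import math

MPH_TO_MS = 0.44704  # miles per hour -> meters per second

def impact_speed_mph(v0_mph, gap_m, reaction_s, decel_ms2):
    """Speed at impact for a driver who begins constant braking after
    `reaction_s` seconds, with `gap_m` meters to the pedestrian."""
    v0 = v0_mph * MPH_TO_MS
    braking_gap = gap_m - v0 * reaction_s  # distance left once braking starts
    if braking_gap <= 0:
        return v0_mph  # pedestrian is reached before braking even begins
    v_sq = v0 ** 2 - 2 * decel_ms2 * braking_gap
    return math.sqrt(max(v_sq, 0.0)) / MPH_TO_MS

# With these toy numbers, a near-instant braker stops fully, while a driver
# with a ~0.9 s perception-reaction delay still hits at roughly 14 mph.
robot = impact_speed_mph(17, gap_m=8.0, reaction_s=0.1, decel_ms2=8.0)
human = impact_speed_mph(17, gap_m=8.0, reaction_s=0.9, decel_ms2=8.0)
```

The point of the sketch is just that at short gaps, nearly all of the outcome difference comes from the reaction delay, since both drivers brake at the same rate once they start.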

[+] scarmig|1 month ago|reply
It depends on the situation, and we need more data/video. But if there are a bunch of children milling about an elementary school in a chaotic situation with lots of double parking, 17 mph is too fast, and the Waymo should have been driving more conservatively.
[+] mlsu|1 month ago|reply
An honest account of this situation would place at least some blame on there being a tall SUV blocking visibility.

These giant SUVs really are the worst when it comes to child safety

[+] calchris42|1 month ago|reply
AV’s with enough sensing are generally quite good at stopping quickly. It is usually the behavior prior to the critical encounter that has room for improvement.

The question will be whether 17 mph was a reasonably cautious speed for this specific scenario. Many school zones have 15 mph limits, and when there are kids about people may go even slower. At the same time, the general rule in CA for school zones is 25 mph. Clearly the car had some level of caution, which is good.

[+] dcanelhas|1 month ago|reply
It does sound like a good outcome for automation. Though I suppose an investigation into the matter would arguably have to look at whether a competent human driver would be driving at 17mph (27km/h) under those circumstances to begin with, rather than just comparing the relative reaction speeds, taking the hazardous situation for granted.

What I would like to see is a full-scale vehicle simulator where humans are tested against virtual scenarios that faithfully recreate autonomous driving accidents, to see how "most people" would have acted in the minutes leading up to the event as well as during the accident itself.

[+] barbazoo|1 month ago|reply
For me it would be interesting to know if 17 mi/h was a reasonable speed to be driving in this environment under these conditions to begin with. In my school zones that's already close to the maximum speed allowed. What was the weather, were there cars parked which would make a defensive driver slow down even more?
[+] mholt|1 month ago|reply
The autonomous vehicle should know what it can't know, like children coming out from behind obstructions. Humans have this intuitive sense. Apparently autonomous systems do not, and do not drive carefully, or slower, or give more space, in those situations. Does it know that it's in a school zone? (Hopefully.) Does it know that school is starting or getting out? (Probably not.) Should it? (Absolutely yes.)

This is the fault of the software and company implementing it.
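
The "know what it can't know" idea has a simple formalization: cap speed so the vehicle can always stop within its clear sight distance. A minimal sketch of that rule (the deceleration and latency values are assumptions, not anything Waymo has published):

```python
import math

def occlusion_capped_speed_ms(sight_distance_m, decel_ms2=6.0, latency_s=0.2):
    """Max speed (m/s) such that the vehicle can fully stop within the
    unoccluded sight distance: latency travel + braking distance <= sight.
    Solves v*t + v^2/(2a) = d for v."""
    a, t, d = decel_ms2, latency_s, sight_distance_m
    # quadratic in v: v^2 + 2*a*t*v - 2*a*d = 0
    return a * (-t + math.sqrt(t * t + 2 * d / a))

# A tall SUV cutting sight distance to ~5 m drops the cap to ~15 mph:
for d in (30.0, 10.0, 5.0):
    v = occlusion_capped_speed_ms(d)
    print(f"sight {d:4.0f} m -> max {v / 0.44704:4.1f} mph")
```

Under these assumed numbers, a ~5 m sight line caps safe speed near 15 mph, which is roughly where the debate about "was 17 mph too fast" lands.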

[+] random_duck|1 month ago|reply
They are being very transparent about it, at least.
[+] boh|1 month ago|reply
So the TechCrunch headline should be "Waymo hits child better than a human driver would"? Not sure if the details reflect how the general public actually interprets this story (see the actual TC headline for exhibit A).
[+] dyauspitr|1 month ago|reply
It’s great handling of the situation. They should release a video as well.
[+] chmod775|1 month ago|reply
> I honestly cannot imagine a better outcome or handling of the situation.

It's the "best outcome" if you're trying to go as fast as possible without breaking any laws or ending up liable for any damage.

German perspective, but if I told people I've been going 30km/h next to a school with poor visibility as children are dropped off around me, I would be met with contempt for that kind of behavior. I'd also at least face some partial civil liability if I hit anyone.

There's certainly better handling of the situation possible, it's just that US traffic laws and attitudes around driving do not encourage it.

I suspect many human drivers would've driven slower, law or no law.

[+] gerdesj|1 month ago|reply
"from behind a tall SUV, "

I look for shadows underneath stationary vehicles. I might also notice pedestrians "vanishing". I have a rather larger "context" than any robot effort.

However, I am just one example of a human. My experience of never managing to run someone over is just an anecdote ... so far. The population of humans as a whole manages to run each other over rather regularly.

A pretty cheap instant human sensor might be Bluetooth/BLE noting phones/devices in near range. Pop a sensor in each wing mirror and on the top and bottom. The thing would need some processing power but probably nothing that the built in Android dash screen couldn't handle.

There are lots more sensors that car manufacturers are trying to avoid for cost reasons, that would make a car way better at understanding the context of the world around it.

I gather that Tesla insist on optical (cameras) only and won't do LIDAR. My EV has four cameras and I find it quite hard to see what is going on when it is pissing down with rain, in the same way I do if I don't clean my specs.

[+] croes|1 month ago|reply
We should take their reporting with a grain of salt and wait for official results.
[+] ajdude|1 month ago|reply
> reducing speed from approximately 17 mph

Isn't the speed limit normally 15 mph or less in a school zone? Was the robotaxi speeding?

[+] aucisson_masque|1 month ago|reply
Can’t trust a private company.

Where is the video recording ?

[+] dfxm12|1 month ago|reply
Waymo driver? The vehicles are autonomous. I otherwise applaud Waymo's response, and I hope they are as cooperative as they say they will be. However, referring to the autonomous vehicle as having a driver is a dangerous way to phrase it. It's not passive voice, per se, but it has the same effect of obscuring responsibility. Waymo should say we, Waymo LLC, subsidiary of Alphabet, Inc., braked hard...

Importantly, Waymo takes full ownership for something they write positively: Our technology immediately detected the individual.... But Waymo weasels out of taking responsibility for something they write about negatively.

[+] oliwarner|1 month ago|reply
> From the Waymo blog...

I'll just remind anyone reading: they're under no obligation to tell the unvarnished truth on their blog.

Even if the NHTSA eventually points out significant failures, getting this report out now has painted a picture of Waymo only having an accident a human would have handled worse.

It would be wise to wait and see if the NHTSA agree. Would a driver have driven at 17mph in this sort of traffic or would they have viewed it as a situation where hidden infant pedestrians are likely to step out?

[+] rdudek|1 month ago|reply
I honestly think that Waymo's reaction was spot on. I drop off and pick up my kid from school every day. The parking lots can be a bit of a messy wild west. My biggest concern is the size of cars especially those huge SUV or pickup trucks that have big covers on the back. You can't see anything incoming unless you stick your head out.
[+] padjo|1 month ago|reply
It's hardly surprising that the version of events from the PR department makes Waymo sound completely blameless.
[+] alphazard|1 month ago|reply
I'm picturing a 10 second clip showing a child with a green box drawn around them, and position of gas and brake, updating with superhuman reactions. That would be the best possible marketing that any of these self driving companies could hope for, and Waymo probably now has such a video sitting somewhere.
[+] WalterBright|1 month ago|reply
When I was a boy, I ran into the street from between two parked cars. I did not notice the car coming, but he noticed me popping out from nowhere, and screeched to a stop.

I was very very lucky.

[+] ChrisMarshallNY|1 month ago|reply
I suspect the robotaxi may have done better than a human.

Human reaction times are terrible, and lots of kids get seriously injured, or killed, when they run out from between cars.

[+] jeffybefffy519|1 month ago|reply
I wonder if another waymo ahead could have seen that child earlier and told the main waymo. This would be pretty neat and have a large safety impact.
[+] AndrewKemendo|1 month ago|reply
In fact I would call that “superhuman” behavior across the board.

The vast vast vast majority of human drivers would not have been able to accomplish that braking procedure that quickly, and then would not have been able to manage the follow up so quickly.

I have watched other parent drivers in the car pick-up line at public schools for the last 16 years, and people are absolutely trash at navigating that whole process; parents drive so poorly it's absurd. At least half the parents I see are on their phones while literally feet away from hitting some kid.

[+] zx8080|1 month ago|reply
> remained stopped, moved to the side of the road

Stopped or moved? Is it allowed in CA to move the car at all after a serious accident happens?

[+] butlike|1 month ago|reply
Take that particular Waymo car off the road. Seems absurd, but they still hit someone.
[+] belter|1 month ago|reply
>> I honestly cannot imagine a better outcome or handling of the situation.

One better than: We investigated our own system and found ourselves to be at no fault?

[+] aanet|1 month ago|reply
This is the classic Suddenly Revealed Pedestrian test case, which afaik, most NCAP (like EuroNCAP, Japan NCAP) have as part of their standard testing protocols.

Having performed this exact test on 3 dozen vehicles (L2/L3/L4) for several AV companies in the Bay Area [1], I would say that Waymo's response, per their blog post [2] has been textbook compliance. (I'm not defending their performance... just their response to the collision). This test / protocol is hard for any driver (including human driven vehicles), let alone ADAS/L3/L4 vehicles, for various reasons, including: pedestrian occlusion, late ped detection, late braking, slick roads, not enough braking, etc. etc.

Having said all that, full collision avoidance would have been the best outcome, which, in this case, it wasn't. Wherever the legal fault may lie -- and there will be big debate here -- Waymo will still have to accept some responsibility, given how aggressively they are rolling out their commercial services.

This only puts more onus on their team to demonstrate a far higher standard of driving than human drivers. Sorry, that's just the way societal acceptance is. We expect more from our robots than from our fellow humans.

[1] Yes, I'm an AV safety expert

[2] https://waymo.com/blog/2026/01/a-commitment-to-transparency-...

(edit: verbiage)

[+] maerF0x0|1 month ago|reply
Meanwhile the news does not report the other ~7,000 children per year injured as pedestrians in traffic crashes in the US.

I think the overall picture is a pretty fantastic outcome -- even a single event is a newsworthy moment _because it's so rare_ .

> The NHTSA’s Office of Defects Investigation is investigating “whether the Waymo AV exercised appropriate caution given, among other things, its proximity to the elementary school during drop off hours, and the presence of young pedestrians and other potential vulnerable road users.”

Meanwhile in my area of the world, parents are busy, stressed, on their phones, and pressing the accelerator hard because they're time-pressured and feel like that will make up for being 5 minutes late on a 15-minute drive... The truth is this technology is, as far as I can tell, superior to humans in a high number of situations, if only for its lack of emotionality (and inability to text and drive / drink and drive)... but for some reason the world wants to keep nitpicking it.

A story: my grandpa drove for longer than he should have. Yes, him losing his license would have been the optimal case. But pragmatically that didn't happen... him being in and using a Waymo (or Cruise, RIP) car would have been a marginal improvement on the situation.

[+] naet|1 month ago|reply
We should all think twice before taking a company PR statement completely at face value and praising them for slowing down faster than their own internal "model" says a human driver would. Companies are heavily interested in protecting their bottom line and in a situation like this probably had 5-10 people carefully craft every single word of the statement for maximum damage control.

Surprised at how many comments here seem eager to praise Waymo based off their PR statement. Sure it sounds great if you read that the Waymo slowed down faster than a human. But would a human truly have hit the child here? Two blocks from a school with tons of kids, crossing guards, double parked cars, etc? The same Waymo that is under investigation for passing school busses illegally? It may have been entirely avoidable for the average human in this situation, but the robotaxi had a blind spot that it couldn't reason around and drove negligently.

Maybe the robotaxi did prevent some harm by braking with superhuman speed. But I am personally unconvinced it was a completely unavoidable freak accident type of situation without seeing more evidence than a blog post by a company with a heavily vested interest in the situation. I have anecdotally seen Waymo in my area drive poorly in various situations, and I'm sure I'm not the only one.

There's the classic "humans are bad drivers" but I don't think that is an excuse to not look critically into robotaxi accidents. A human driver who hit a child next to a school would have a personal responsibility and might face real jail time or at the least be put on trial and investigated. Who at Waymo will face similar consequences or risk for the same outcome?

[+] bhewes|1 month ago|reply
The "a human would do it better" people are hilarious, given how many times I have been hit by human drivers on my bike and watched others get creamed by cars. One time in Boulder, at a flashing crosswalk, a person ran right through it and the biker they creamed got stuck in the roof rack.
[+] dlg|1 month ago|reply
I was just dropping my kids off at their elementary school in Santa Monica, but not at Grant Elementary where this happened.

While it's third-hand, word on the local parent chat is that the parent dropped their kid off on the opposite side of the street from Grant. Even though there was a crossing guard, the kid went behind a car and ran right out into the street.

If those rumors are correct, I'd say it's the kid's/family's fault. That said, I think autonomous vehicles should probably go extra slowly near schools, especially during pickup and dropoff.

[+] rsch|1 month ago|reply
A human driver travelling at the same speed would have hit that child at exactly 17 mph, before their brain even registered that the child was there. If that driver had also been driving a large SUV, that child would have been pushed to the ground and run over, so probably a fatality. And functionally nobody would have given a shit, apart from some lame finger-pointing at (probably) the kid's parents.

And it is not the child’s or their parents’ fault either:

Once you accept elementary school aged children exist, you have to accept they will sometimes run out like this. Children just don’t have the same impulse control as adults. And honestly even for adults stepping out a bit from behind an obstacle in the path of a car is an easy mistake to make. Don’t forget that for children an SUV is well above head height so it isn’t even possible for them to totally avoid stepping out a bit before looking. (And I don’t think stepping out vs. running out changes the outcome a lot)

This is why low speed limits around schools exist.

So I would say the Waymo did pretty well here, it travelled at a speed where it was still able to avoid not only a fatality but also major injury.

[+] Zigurd|1 month ago|reply
Vehicle design also plays a role: passenger cars have to meet pedestrian collision standards. Trucks don't. The silly butch grilles on SUVs and pickups are deadly. This is more of an argument for not seeing transportation as a fashion or lifestyle statement. Those truck designs are about vanity and gender affirming care. It's easier to make rational choices when it's a business that's worried about liability making those choices.
[+] Zopieux|1 month ago|reply
Cheers to cities pedestrianizing school streets even in busy capitals (e.g. Paris). Cars have no place near school entrances. Fix your urbanism and public transportation.

Yes, kids in developed countries have the autonomy to go to school by themselves from a very young age, provided the correct mindset and a safe environment. That's a combination of:

* high-trust society: commuting alone or in a small group is the norm, soccer moms a rare exception,

* safe, separated lanes for biking/walking when that's an option.

[+] Bukhmanizer|1 month ago|reply
Personally in LA I had a Waymo try to take a right as I was driving straight down the street. It almost T-boned me and then honked at me. I don’t know if there has been a change to the algorithm lately to make them more aggressive but it was pretty jarring to see it mess up that badly
[+] simojo|1 month ago|reply
I'm curious as to what kind of control stack Waymo uses for their vehicles. Obviously their perception stack has to be based off of trained models, but I'm curious if their controllers have any formal guarantees under certain conditions, and if the child walking out was within that formal set of parameters (e.g. velocity, distance to obstacle) or if it violated that, making their control stack switch to some other "panic" controller.

This will continue to be the debate—whether human performance would have exceeded that of the autonomous system.
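
Nobody outside Waymo knows their actual stack, but the "envelope check + panic controller" pattern the comment describes is a common safety-architecture idea: a learned nominal planner is monitored by a simple, verifiable supervisor that takes over when the state leaves a provable envelope. A toy sketch of that pattern (all names and thresholds here are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class Obstacle:
    distance_m: float   # along-path distance to the obstacle
    closing_ms: float   # closing speed, m/s

def nominal_accel(obstacle: Obstacle) -> float:
    """Placeholder for a learned/optimized planner's commanded acceleration."""
    return 0.5  # e.g. gentle cruise

def supervisor_accel(obstacle: Obstacle, max_brake_ms2: float = 8.0) -> float:
    """Simple, easily verified fallback: maximum braking."""
    return -max_brake_ms2

def in_safe_envelope(obstacle: Obstacle, max_brake_ms2: float = 8.0,
                     margin_m: float = 2.0) -> bool:
    """Formal-style check: can we still stop `margin_m` short of the obstacle
    under maximum braking? Stopping distance = v^2 / (2a)."""
    stopping = obstacle.closing_ms ** 2 / (2 * max_brake_ms2)
    return stopping + margin_m <= obstacle.distance_m

def commanded_accel(obstacle: Obstacle) -> float:
    # Run the nominal controller only while the state stays inside the
    # envelope; otherwise hand control to the panic/fallback controller.
    if in_safe_envelope(obstacle):
        return nominal_accel(obstacle)
    return supervisor_accel(obstacle)
```

The appeal of this structure is that only the small supervisor needs formal guarantees; the nominal planner can be an arbitrary learned model.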

[+] NoGravitas|1 month ago|reply
That sucks, and I love to hate on "self driving" cars. But it wasn't speeding to start with (assuming speed limit in the school zone was 20 or 25), braked as much as possible, and the company took over all the things a human driver would have been expected to do in the same situation. Could have been a lot worse, probably wouldn't have been any better with a human driver (just going to ignore as no-signal Waymo's models that say an attentive human driver would have been worse). It's "fine". In this situation, cars period are the problem, not "self driving" cars.
[+] Dlanv|1 month ago|reply
Basically, Waymo just prevented a kid's potential death.

Had any other car been there, probably including a Tesla, the poor kid would have been hit with 4-10x more force.
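
A "4-10x" range is roughly what you get from kinetic energy scaling with the square of speed, comparing the reported sub-6 mph impact against the 14-17 mph a human is modeled to hit at (6 mph as the baseline is an assumption from the blog's "under 6 mph" figure):

```python
def energy_ratio(v_impact_mph, v_baseline_mph=6.0):
    """Kinetic energy scales with v^2, so the ratio of impact energies
    depends only on the ratio of impact speeds."""
    return (v_impact_mph / v_baseline_mph) ** 2

# 14 mph (modeled attentive human) and 17 mph (no braking at all) vs. 6 mph
ratios = {v: round(energy_ratio(v), 1) for v in (14, 17)}
```

That works out to roughly 5.4x at 14 mph and 8x at 17 mph, in the same ballpark as the comment's range.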

[+] Veserv|1 month ago|reply
Absent more precise information, this is a statistical negative mark for Waymo, putting their child pedestrian injury rate at ~2-4x higher than the US human average.

US human drivers average ~3.3 trillion miles per year [1]. US human drivers cause ~7,000 child pedestrian injuries per year [2]. That amounts to an average of 1 child pedestrian injury per ~470 million miles. Waymo has done ~100-200 million fully autonomous miles [3][4]. That means they average 1 child pedestrian injury per ~100-200 million miles. That is an injury rate ~2-4x higher than the human average.

However, the child pedestrian injury rate is only an official estimate (possible undercounting relative to highly scrutinized Waymo miles) and is a whole-US average (operational domain might not be comparable, though this could easily swing either way). But absent more precise and better information, we should default to the calculated 2-4x higher injury rate; it is up to Waymo to robustly demonstrate otherwise.
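
The rate comparison works out as follows, using only the figures quoted in this comment (the 1-injury numerator is just this incident):

```python
# Back-of-envelope rates from the figures quoted above.
human_miles_per_year = 3.3e12          # ~3.3 trillion US vehicle miles/year
child_ped_injuries_per_year = 7_000    # ~7,000 child pedestrian injuries/year
human_miles_per_injury = human_miles_per_year / child_ped_injuries_per_year
# ~4.7e8 miles per child pedestrian injury for human drivers

waymo_miles_low, waymo_miles_high = 1.0e8, 2.0e8  # ~100-200M autonomous miles
waymo_injuries = 1                                # this single incident

ratio_low = human_miles_per_injury / waymo_miles_high   # ~2.4x
ratio_high = human_miles_per_injury / waymo_miles_low   # ~4.7x
```

With a numerator of one event, the confidence interval on the Waymo rate is of course enormous; the calculation mostly shows why a single incident moves the estimate so much at their mileage.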

Furthermore, Waymo has published reasonably robust claims arguing they achieve ~90% crash reduction [5] in total. The most likely new hypotheses in light of this crash are:

A. Their systems are not actually robustly 10x better than human drivers; Waymo's claims are incorrect or non-comparable.

B. There are child-specific risk factors that humans account for that Waymo does not that cause a 20-40x differential risk around children relative to normal Waymo driving.

C. This is a fluke child pedestrian injury. Time will tell. Given their relatively robustly claimed 90% crash reduction, it is likely prudent to allow further operation in general, though possibly not in certain contexts.

[1] https://afdc.energy.gov/data/10315

[2] https://crashstats.nhtsa.dot.gov/Api/Public/Publication/8137...

[3] https://www.therobotreport.com/waymo-reaches-100m-fully-auto...

[4] https://waymo.com/blog/2025/12/demonstrably-safe-ai-for-auto...

[5] https://waymo.com/safety/impact/

[+] moktonar|1 month ago|reply
The Waymo Driver tech is impressive. That said, an experienced driver might have recognized the pattern where a stopped big vehicle occludes part of the road, leading to exactly this situation, and might have slowed down almost to a halt before passing. The Waymo Driver reacts faster but is not able to predict such scenarios by filling in the gaps, simulating the world to inform its decisions. Chapeau to Waymo anyway.
[+] WarmWash|1 month ago|reply
Oddly I cannot decide if this is cause for damnation or celebration

Waymo hits a kid? Ban the tech immediately, obviously it needs more work.

Waymo hits a kid? Well if it was a human driver the kid might well have been dead rather than bruised.

[+] fortran77|1 month ago|reply
I'm a big fan of Waymo and have enjoyed my Waymo rides. And I don't think Waymo did anything "bad" here.

> The young pedestrian “suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle’s path,” the company said in its blog post. Waymo said its vehicle “immediately detected the individual as soon as they began to emerge from behind the stopped vehicle.”

BUT! As a human driver, I avoid driving near the schools when school's letting out. There's a high school on my way home and kids saunter and jaywalk across the street, and they're all 'too cool' to press the button that turns on the blinking crosswalk. So I go a block out of my way to bypass the whole school area when I'm heading home that way.

Waymos should use the same rationale. If you can avoid going past a school zone when kids are likely to be there, do it!

[+] aucisson_masque|1 month ago|reply
> The young pedestrian “suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle’s path,” the company said in its blog post.

The issue is that I don't trust a private company's word. You can't even trust the president of the US government nowadays... release the video footage or get lost.