top | item 36433246

San Francisco fire chief fed up with robotaxis that mess with her firetrucks

138 points | tafda | 2 years ago | latimes.com

143 comments


rurp|2 years ago

The specific examples listed out in the article are egregious, causing real harm. The fire chief is right to be angry, if anything her response is too measured.

From the article:

- Running through yellow emergency tape and ignoring warning signs to enter a street strewn with storm-damaged electrical wires, then driving past emergency vehicles with some of those wires snarled around rooftop lidar sensors.

- Twice blocking firehouse driveways, requiring another firehouse to dispatch an ambulance to a medical emergency.

- Sitting motionless on a one-way street and forcing a firetruck to back up and take another route to a blazing building.

- Pulling up behind a firetruck that was flashing its emergency lights and parking there, interfering with firefighters unloading ladders.

- Entering an active fire scene, then parking with one of its tires on top of a fire hose.

Projectiboga|2 years ago

Time for zero tolerance: if one of these does it, seize it, and levy huge fines on the robotaxi company and whoever built it. Not "oh, that will just be another expense" fines; make them pierce the corporate veil and reach the executives and the millionaire shareholders. And now that I think of it, if there are any passengers, arrest them. That way they can sue the robotaxi company too.

whartung|2 years ago

There's that scene from a TV show where the firefighters are at a fire and a car is parked in front of the fire hydrant. One of the characters calls out "Car!" and they proceed to smash the windows, route the fire hose through the car, and connect it to the hydrant. Over the course of the incident, the hydrant leaks and fills the car's interior with water.

If only there were a similarly effective, and cathartic, response to these things.

bunga-bunga|2 years ago

That list is almost funny, as if they’re already trying to kill us.

c00lio|2 years ago

If this were a human driver, at least half of those would lose you your driver's license. Why are there no such consequences for the companies?

choppaface|2 years ago

Uber has been and continues to be just as egregious in SF. I’ve been hit by two Ubers, both while I was in the crosswalk, and one incident caused me a $500 loss. There are Uber Eats double-parkers who stand in queues all night long making key roads unsafe or impassable. I’ve been in an Uber where the driver was literally trying to brake-check and cause rear-enders, and he wouldn’t let me get out until I opened the door while the car was in motion. I’ve messaged Uber support dozens of times and they of course do nothing.

Uber is clearly a different problem, but a much bigger one in the macro.

jeffbee|2 years ago

These articles always conflate all industry participants but when you dig into it, it's always Cruise that is causing the problem. SFMTA's complaint to the state about Waymo cites 13 incidents in Appendix B involving Cruise cars. The very best thing Waymo could do is lobby the state government to establish strict rules so their own reputation doesn't get diluted by Cruise.

There are a quarter million vehicle crashes in the DataSF Fire Department Calls For Service. Self-driving cars will prevent these. It's the systematically better way to go.

rurp|2 years ago

> There are a quarter million vehicle crashes in the DataSF Fire Department Calls For Service. Self-driving cars will prevent these.

That's a bold claim. Do you have evidence to support it? I'm not talking about results from controlled or ideal driving conditions, I mean evidence showing that self driving will definitely be better at the conditions those quarter million accidents occur in.

Perhaps there's some research I'm unaware of, but as far as I know the best we can say is that self-driving might end up being safer for general use-case driving, and there are wide disagreements about how likely that is.

hamolton|2 years ago

After riding in both Cruise and Waymo cars in SF, I think the Waymo cars are much more road-ready. While my Waymo rides all seemed pretty smooth, albeit with a timid driver, my Cruise rides featured extremely skittish behavior around other cars, missing several turns to avoid being around others, and stopping in odd places, especially for dropoff and pickup.

foobarbazetc|2 years ago

Yup. Waymo >>>>>>>>>> Cruise.

It’s not even close.

Still wouldn’t get into a Waymo for another 3-5 years, but the quality gap is huge.

kakuri|2 years ago

I have seen several Waymo cars stopped for no good reason or blocking the street while very slowly trying to figure out what to do.

dopylitty|2 years ago

It’s absolutely insane that automated driving is allowed on public streets.

It should be completely banned until such time as there exists a comprehensive testing regime for validating that the automated system functions in all the scenarios it might be expected to face including emergency vehicles, construction, pedestrians, bad weather, and damaged sensors.

Sharlin|2 years ago

To be fair, human drivers are not validated either to function correctly in those scenarios. Though it would certainly be nice if they were.

phpisthebest|2 years ago

It is clear to me that most, if not all, human progress will cease to advance, as we have gone to the extreme with safety culture.

If we were still using horse-drawn carriages for travel and someone invented the first horseless buggy today, it would be banned and never allowed to advance at all. The amount of death and injury from the inception of the automobile would never be tolerated in a new industry today. We have rationalized and assimilated the automobile into our everyday lives, but refuse to accept any risk for something new.

mikewarot|2 years ago

That the AI training data doesn't include public-safety scenarios is malpractice at minimum. This is nuts.

There is sufficient evidence that these are not ready for prime time.

tomohawk|2 years ago

A family member was a NY fire fighter many years ago. The Soviets had a lot of people at the UN, and they had diplomatic plates. They would park wherever they wanted, often blocking things. There was nothing the police could do.

The fire chief decided to have fire drills. Whenever a cop saw a car with diplomatic plates parked in front of a hydrant, they would call the fire department, who would come out and perform a drill. Upon seeing the vehicle blocking the hydrant, they would practice breaching through the vehicle to attain access to the hydrant. The vehicle would not survive in a drivable state.

The Soviets complained, but stopped blocking fire accesses.

EDIT: the fire chief could solve this by putting truck- or police-cruiser-style bullbars on their vehicles and "carefully nudging" the miscreant vehicles out of the way.

i_am_jl|2 years ago

>EDIT: the fire chief could solve this by putting truck- or police-cruiser-style bullbars on their vehicles and "carefully nudging" the miscreant vehicles out of the way.

I know this was tongue in cheek, but it underscores a problem I have with this article title.

The problems, despite the article's title, usually don't involve robotaxis vs trucks, but robotaxis vs active firefighting scenes. The article I read earlier this year referenced cars running over hoses and inching towards firefighters after being blocked by an active firefighting scene.

https://www.forbes.com/sites/bradtempleton/2023/01/31/san-fr...

FireBeyond|2 years ago

One of the challenges with this is that often city/urban fire engines have front bumper discharge lines so that hose can be hooked up to the front of the engine, not just the sides (this is done to improve access options - 1 3/4" hose is fairly flexible, but when you get to 2 1/2" lines or 3"+ supply lines, they need space, and are not overly flexible.)

See: https://firematic.com/trucksnew/levittown19/2.jpg - 1 3/4" attack line on the right with plumbing, and 5" LDH (large diameter hose) supply line on the left.

kindatrue|2 years ago

Damn that's rough. I saw a car parked in front a fire hydrant at a fire in Queens once. FDNY simply smashed the windows to open the doors, and then ran a hose through it.

bell-cot|2 years ago

Heheheheh. Bare-knuckled, but I like that approach.

Beyond instructing emergency responders "if it doesn't stay away, just smash its windows" (mentioned in the article, at n=1 scale)... I wonder if robotaxis have some convenient central depot, which might find its driveways blocked by emergency street repairs, or some such.

yardie|2 years ago

It appears these robotaxis need an emergency button similar to the ones on industrial robots, except that instead of completely stopping, a remote operator immediately takes over and bugs out. Meanwhile, the AI can take a backseat and observe how to respond in a situation like this, because they obviously don't have enough training for dealing with emergency situations.

pbhjpbhj|2 years ago

The closest I've come to this scenario is playing games (e.g. CSGO) where you can take over a bot. So often I take over a bot (without spectating its play first) and immediately die, because I have no context for its situation and it's about to be shot at; in many of those situations I feel the bot would have done better than I did.

This makes me feel a human operator probably would be no good at taking control... However, a computer that has access to more data (e.g. data from other vehicles), or that has more control (of other vehicles, of traffic lights, of pedestrian crossings, etc.), or more capabilities, might be far more successful than a person?!

nradov|2 years ago

That is not an adequate solution. During major public safety incidents, cellular data networks frequently go down or at least suffer from serious lag.

rcme|2 years ago

The article mentions that employees can take remote control over the cars but sometimes that’s not enough to fix the issue.

wiihack|2 years ago

I think in case of an emergency the doors of the car should unlock and someone nearby should take over.

jb12|2 years ago

This past weekend in SF, I saw a driver slow and stop to reverse parallel park on Bush Street at Fillmore. An automated Cruise car stopped right behind him, blocking the driver from reversing into the parking spot. The guy got out to yell at the driver to back up, but then saw that it was driverless.

He drove off, and someone else got the spot.

jeffbee|2 years ago

Did they signal? This has happened to me many times with humans driving right up on my bumper even while I signal and am in reverse gear.

jayrot|2 years ago

It's not a bug it's a feature?

HankB99|2 years ago

I'm surprised no one has mentioned the US legal system as part of a solution. If a robo-taxi causes material harm, sue the operator for some multiple of the cost of the harm.

I suppose the drawback to this strategy is that real harm has to happen first and that could easily involve loss of life or limb, but perhaps the threat of that would be enough to motivate the robo-taxi providers to fix the problem.

anigbrowl|2 years ago

If the US legal system is so effective, why are there so many personal injury cases? Growing up abroad, US litigiousness was a joke when I was in elementary school (>40 years ago). Other developed countries seem to have significantly smaller problems with avoidable mortality; maybe the American approach is just not that great.

JumpCrisscross|2 years ago

> If a robo-taxi causes material harm, sue the operator for some multiple of the cost of the harm.

There haven’t been many cases of actual harm. This article pretty much lists out situations which could cause harm but didn’t, due to factors out of the operator’s or fire department’s control.

Sharlin|2 years ago

Yes, but it seems here that the regulators are bending over backwards to dole out permits for these vehicles without bothering to think about the externalities. Still, one would think that 1) the operators could be charged with obstruction of emergency services and/or 2) they could be sued for damages in the civil court.

jokoon|2 years ago

It's weird: I'm a developer, and I really want AI to work, but it just doesn't, so I'm generally against it. And it seems far, far from working properly.

It's difficult to see domains where AI can really improve productivity without having major drawbacks.

I'm not against research on AI, but as long as science cannot define what intelligence really is, I guess AI will not make major advances.

swayvil|2 years ago

It's almost as if navigating reality requires something better than logic and maps. Like humans have magic eyeballs or something.

mrobins|2 years ago

Bare minimum these cars should have the equivalent of a fire service key like elevators.

cirrus3|2 years ago

That isn't a solution. Any time spent dealing with the car is a failure.

> The fire chief said each robotaxi company offers training to help deal with “bricked” vehicles.

> “We have 160,000 calls a year. We don’t have the time to personally take care of a car that’s in the way when we’re on the way to an emergency,” she said.

junon|2 years ago

You can't steal an elevator, but you can steal a car. This system would almost definitely be abused.

nycdatasci|2 years ago

The incidents in the article are inexcusable. But to be fair, I manually reviewed a dozen accident reports on the CA state website and 100% were due to human error in other vehicles or due to issues with an operator of an AI vehicle in manual mode: https://www.dmv.ca.gov/portal/vehicle-industry-services/auto...

vostrocity|2 years ago

Dang, was it common knowledge that Apple is testing its autonomous driving tech on public roads? I had no idea!

JoeAltmaier|2 years ago

Some simple infrastructure could help with this: a no-fly zone for robo-stuff that travels with fire engines, for instance. Require all robo companies to respect these zones. Radio or something.
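A moving no-fly zone could be as simple as a broadcast center point plus radius that the AV must stay out of. A minimal sketch of the receiving side, assuming a hypothetical zone message format (all names here are made up for illustration):

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def must_yield(av_lat, av_lon, zone):
    """True if the AV is inside a broadcast exclusion zone and should pull over."""
    return haversine_m(av_lat, av_lon, zone["lat"], zone["lon"]) <= zone["radius_m"]

# Hypothetical zone broadcast by a fire engine en route
zone = {"lat": 37.7749, "lon": -122.4194, "radius_m": 150}
print(must_yield(37.7750, -122.4194, zone))  # AV ~11 m from the engine -> True
```

The zone's center would update continuously as the engine moves; the hard parts (a tamper-resistant radio channel, what "pull over" means mid-intersection) are exactly what regulation would have to pin down.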

nordsieck|2 years ago

I do think that, as long as no one has yet invented the sufficiently smart self-driving car, robotaxis will be required to have some sort of override mode, either cell- or satellite-based: something where a remote human operator can take over, using the sensor data to drive the car until it gets back to situations it can actually handle.

I understand that lag can be an issue, but if speeds are limited to 5-10 mph, that should be less of a problem.

mint2|2 years ago

How do these companies not have a simulator built to help train cars for simulated non-standard scenarios? Have dogs, kids, fire trucks, caution tape etc and run them with 100k variations.

And then do real world staged validation.

Actually why are extensive real world mock scenarios not running 24/7? If a car does something bad, do mock scenarios around it.

Please tell me they are doing these things because it’s crazy if they aren’t.
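Randomizing a base scene into many variants is cheap to express; a toy sketch of what "100k variations" of an emergency scene might look like (the scenario fields here are hypothetical, not any vendor's actual schema):

```python
import random

def make_scenarios(base, n, seed=0):
    """Generate n randomized variants of a base emergency scene for simulation."""
    rng = random.Random(seed)  # fixed seed -> reproducible regression suite
    scenarios = []
    for i in range(n):
        s = dict(base)
        s["id"] = i
        s["hose_across_lane"] = rng.random() < 0.5
        s["caution_tape"] = rng.random() < 0.5
        s["n_firetrucks"] = rng.randint(1, 3)
        s["pedestrians"] = rng.randint(0, 20)
        s["time_of_day_h"] = rng.uniform(0.0, 24.0)
        scenarios.append(s)
    return scenarios

variants = make_scenarios({"scene": "structure_fire"}, 100_000)
print(len(variants))  # 100000
```

Seeding the generator means a failure in the field can be turned into a fixed, replayable scenario family rather than a one-off.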

Reubachi|2 years ago

Of course they are... I'm sure you've seen the dozens of marketing videos of Tesla detection cameras/sensors stopping short for a kid chasing a ball into the street.

Now, like anything else, does test/staging translate to production? Not even remotely. Autonomous vehicle manufacturers claim to be in a monitoring/production phase to compare against their internal testing/staging. And over time we'll normalize autonomous driving and the associated risks.

We used to drive all the nails in the world with a hammer aimed precariously at our own hands.

minwcnt5|2 years ago

They are. At least Waymo has published details about their closed course testing. Dunno about Cruise.

maerF0x0|2 years ago

Seems to me a simple regulation like a fire department access in buildings could solve this.

eg1: What about a direct line to a 24/7 operations center at Cruise?

eg2: Or how about a proximal access console that allows them to take control of the vehicle (with the occupants' consent if occupied).

This needn't be a total ban on AVs.

7952|2 years ago

That seems like a solution for an emergency, but these kinds of interactions must happen all the time. And it is part of a wider problem, where the baseline data about the world changes and is not communicated in a standardised way. It could be a fire truck, a road closure for a cycling event, a tornado ripping through a town, a terrorist attack, a secret service motorcade, oil on the road, anything. Dealing with those situations is much easier with that data. Making that data explicit could help everyone, not just self-driving cars.

It would be useful if vehicles with sirens/emergency lights/hazard lights could broadcast position and identity in a similar way to ADS-B. Make data open and allow third parties to forward, store and re-broadcast that data. Make the consumer of the data (such as a self driving car) responsible for how it uses that data.
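One way to picture such a broadcast is a small, self-describing position report, loosely modeled on an ADS-B squitter. A sketch under that assumption; the field names and JSON-over-radio wire format are invented here, not any real standard:

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class EmergencyBeacon:
    """One position/identity report from a vehicle running lights or sirens."""
    vehicle_id: str      # e.g. a fleet callsign
    vehicle_type: str    # "fire_engine", "ambulance", ...
    lat: float
    lon: float
    heading_deg: float
    lights_active: bool
    timestamp: float     # seconds since epoch

    def to_wire(self) -> str:
        """Serialize for broadcast; any consumer can parse it."""
        return json.dumps(asdict(self))

    @staticmethod
    def from_wire(raw: str) -> "EmergencyBeacon":
        return EmergencyBeacon(**json.loads(raw))

msg = EmergencyBeacon("SFFD-E01", "fire_engine", 37.7749, -122.4194,
                      90.0, True, time.time())
assert EmergencyBeacon.from_wire(msg.to_wire()) == msg  # lossless round trip
```

As the comment suggests, the key design choice is putting responsibility on the consumer: the beacon just states position and identity, and each self-driving stack decides how to react.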

srj|2 years ago

The time it would take to pull out a phone and call a Cruise support line is already too long.

tpl|2 years ago

Why are the people operating these vehicles not getting ticketed for all this, then? These programs should be accruing points on their license just like a real driver. Cruise would probably be off the road if they did this, though.

BurningFrog|2 years ago

Human driven cars have a "tell the driver what to do" function that SDCs are still missing.

This will have to be added somehow.

anigbrowl|2 years ago

Welcome to Johnny cab

crmd|2 years ago

>Under the agency’s own rules, issues such as traffic flow and interference with emergency workers can’t be used to deny expansion permits. The resolutions list four “goals” to be considered:

- inclusion of people with disabilities

- improved transportation options for the disadvantaged

- reduction of greenhouse gases; and

- passenger safety

pixel3234|2 years ago

Most countries have penalty points on driving licenses. If a driver violates traffic rules too often, they lose their driving license! Self-driving cars should not get an exception!

dylan604|2 years ago

Are you suggesting this on a per-car basis, or all cars working off of the same point system? Accruing enough points to unlock the "suspended license" achievement would seem appropriate to apply to the entire fleet, since it's the same "AI".