The specific examples listed in the article are egregious and cause real harm. The fire chief is right to be angry; if anything, her response is too measured.
From the article:
- Running through yellow emergency tape and ignoring warning signs to enter a street strewn with storm-damaged electrical wires, then driving past emergency vehicles with some of those wires snarled around rooftop lidar sensors.
- Twice blocking firehouse driveways, requiring another firehouse to dispatch an ambulance to a medical emergency.
- Sitting motionless on a one-way street and forcing a firetruck to back up and take another route to a blazing building.
- Pulling up behind a firetruck that was flashing its emergency lights and parking there, interfering with firefighters unloading ladders.
- Entering an active fire scene, then parking with one of its tires on top of a fire hose.
Time for zero tolerance: if one of these does it, seize it, and hit the robotaxi company and whoever built it with huge fines. Not fines that become just another expense; make them pierce the corporate veil and reach the executives and the millionaire shareholders. And now that I think of it, if there are any passengers, arrest them. That way they can sue the robotaxi company too.
There's a scene from a TV show where firefighters arrive at a fire and find a car parked in front of the hydrant. One of the characters calls out "Car!", and they proceed to smash the windows, route the fire hose through the car, and connect it to the hydrant. Over the course of the incident, the hydrant leaks and fills the car's interior with water.
If only there were a similarly effective, and cathartic, response to these things.
Uber has been and continues to be just as egregious in SF. I've been hit by two Ubers, both while I was in the crosswalk, and one incident cost me $500. Uber Eats double-parkers queue all night long, making key roads unsafe or impassable. I've been in an Uber where the driver was literally trying to brake-check other cars and cause rear-enders, and he wouldn't let me out until I opened the door while the car was in motion. I've messaged Uber support dozens of times and, of course, they do nothing.
Uber is clearly a different problem, but a much bigger one in the macro.
These articles always conflate all industry participants but when you dig into it, it's always Cruise that is causing the problem. SFMTA's complaint to the state about Waymo cites 13 incidents in Appendix B involving Cruise cars. The very best thing Waymo could do is lobby the state government to establish strict rules so their own reputation doesn't get diluted by Cruise.
There are a quarter million vehicle crashes in the DataSF Fire Department Calls For Service. Self-driving cars will prevent these. It's the systematically better way to go.
> There are a quarter million vehicle crashes in the DataSF Fire Department Calls For Service. Self-driving cars will prevent these.
That's a bold claim. Do you have evidence to support it? I'm not talking about results from controlled or ideal driving conditions; I mean evidence showing that self-driving will definitely be better in the conditions those quarter million accidents occurred in.
Perhaps there's some research I'm unaware of, but as far as I know the best we can say is that self-driving might end up being safer for general use-case driving, and there is wide disagreement about how likely that is.
After riding in both Cruise and Waymo cars in SF, I think Waymo's cars are far more road-ready. My Waymo rides all seemed pretty smooth, albeit with a timid driver, while my Cruise rides featured extremely skittish behavior around other cars, missed turns taken to avoid being near others, and stops in odd places, especially for dropoff and pickup.
It's not even close. I still wouldn't get into a Waymo for another 3-5 years, but the quality gap is huge.
It’s absolutely insane that automated driving is allowed on public streets.
It should be completely banned until such time as there exists a comprehensive testing regime for validating that the automated system functions in all the scenarios it might be expected to face including emergency vehicles, construction, pedestrians, bad weather, and damaged sensors.
It is clear to me that most if not all human progress will cease to advance, now that we have taken safety culture to such an extreme.
If we were still using horse-drawn carriages for travel and someone invented the first horseless buggy today, it would be banned and never allowed to advance at all. The amount of death and injury that accompanied the inception of the automobile would never be tolerated in a new industry today. We have rationalized and assimilated the everyday automobile into our lives, but refuse to accept any risk for something new.
There is sufficient evidence that these are not ready for prime time.
A family member was a NY fire fighter many years ago. The Soviets had a lot of people at the UN, and they had diplomatic plates. They would park wherever they wanted, often blocking things. There was nothing the police could do.
The fire chief decided to have fire drills. Whenever a cop saw a car with diplomatic plates parked in front of a hydrant, they would call the fire department, who would come out and perform a drill. Upon seeing the vehicle blocking the hydrant, they would practice breaching through the vehicle to attain access to the hydrant. The vehicle would not survive in a drivable state.
The Soviets complained, but stopped blocking fire accesses.
EDIT: the fire chief could solve this by putting truck- or police-cruiser-style bull bars on their vehicles and "carefully nudging" the miscreant vehicles out of the way.
>EDIT: the fire chief could solve this by putting truck- or police-cruiser-style bull bars on their vehicles and "carefully nudging" the miscreant vehicles out of the way.
I know this was tongue-in-cheek, but it underscores a problem I have with the article's title.
The problems, despite that title, usually don't involve robotaxis vs. trucks but robotaxis vs. active firefighting scenes. The article I read earlier this year referenced cars running over hoses and inching toward firefighters after being blocked at an active firefighting scene:
https://www.forbes.com/sites/bradtempleton/2023/01/31/san-fr...
One of the challenges with this is that city/urban fire engines often have front-bumper discharge lines so that hose can be hooked up to the front of the engine, not just the sides. (This is done to improve access options: 1 3/4" hose is fairly flexible, but 2 1/2" lines and 3"+ supply lines need space and are not overly flexible.)
See https://firematic.com/trucksnew/levittown19/2.jpg - a 1 3/4" attack line on the right with plumbing, and a 5" LDH (large diameter hose) supply line on the left.
Damn, that's rough. I saw a car parked in front of a fire hydrant at a fire in Queens once. FDNY simply smashed the windows to open the doors, and then ran a hose through it.
Heheheheh. Bare-knuckled, but I like that approach.
Beyond instructing emergency responders "if it doesn't stay away, just smash its windows" (mentioned in the article, at n=1 scale)... I wonder if robotaxis have some convenient central depot, which might find its driveways blocked by emergency street repairs, or some such.
It appears these robotaxis need an emergency button similar to the ones on industrial robots. But instead of completely stopping, a remote operator would immediately take over and bug out. Meanwhile, the AI can take a back seat and observe how to respond in a situation like this, because these systems obviously haven't had enough training on emergency situations.
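The handover-instead-of-halt idea could be sketched roughly as a small state machine. This is a toy model, not any vendor's actual design; all class and method names are invented for illustration:

```python
from enum import Enum, auto

class Mode(Enum):
    AUTONOMOUS = auto()
    REMOTE_OPERATOR = auto()
    FULL_STOP = auto()

class Vehicle:
    """Toy model of the proposed emergency-override flow: an external
    e-stop signal hands control to a remote operator instead of simply
    freezing the car, while the autonomy stack keeps logging."""

    def __init__(self):
        self.mode = Mode.AUTONOMOUS
        self.observation_log = []

    def emergency_signal(self, operator_available: bool):
        # Unlike an industrial e-stop, prefer handover over a dead stop.
        if operator_available:
            self.mode = Mode.REMOTE_OPERATOR
        else:
            self.mode = Mode.FULL_STOP  # fail safe if no operator answers

    def tick(self, sensor_frame):
        # The AI "takes a back seat": it records what the human does so
        # the planner can learn from the resolved incident later.
        if self.mode is Mode.REMOTE_OPERATOR:
            self.observation_log.append(sensor_frame)

car = Vehicle()
car.emergency_signal(operator_available=True)
car.tick({"scene": "fire_hose_ahead"})
```

The key design choice is the fallback: a full stop only happens when no operator is reachable, rather than being the default response.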
The closest I've come to this scenario is playing games (e.g. CS:GO) where you can take over a bot. Often I take over a bot (without spectating its play first) and immediately die, because I have no context for its situation and it's about to be shot at; in many of those cases I feel the bot would have done better than I did.
This makes me feel a human operator would probably be no good at taking control... however a computer that has access to more data (e.g. data from other vehicles), or more control (of other vehicles, traffic lights, pedestrian crossings, etc.), or more capabilities, might be far more successful than a person.
This past weekend in SF, I saw a driver slow and stop to reverse parallel park on Bush Street at Fillmore. An automated Cruise car stopped right behind him, blocking him from reversing into the parking spot. The guy got out to yell at the driver to back up, but then saw that it was driverless. He drove off, and someone else got the spot.
I'm surprised no one has mentioned the US legal system as part of a solution. If a robo-taxi causes material harm, sue the operator for some multiple of the cost of the harm.
I suppose the drawback to this strategy is that real harm has to happen first and that could easily involve loss of life or limb, but perhaps the threat of that would be enough to motivate the robo-taxi providers to fix the problem.
If the US legal system is so effective, why are there so many personal injury cases? Growing up abroad, US litigiousness was a joke when I was in elementary school (>40 years ago). Other developed countries seem to have significantly fewer problems with avoidable mortality; maybe the American approach is just not that great.
> If a robo-taxi causes material harm, sue the operator for some multiple of the cost of the harm.
There haven’t been many cases of actual harm. The article mostly lists situations that could have caused harm but didn’t, due to factors outside the operator’s or fire department’s control.
Yes, but it seems the regulators are bending over backwards to dole out permits for these vehicles without bothering to think about the externalities. Still, one would think that 1) the operators could be charged with obstruction of emergency services, and/or 2) they could be sued for damages in civil court.
It's weird: I'm a developer and I really want AI to work, but it just doesn't, so I'm generally against it. It seems far, far from working properly.
It's difficult to see domains where AI can really improve productivity without major drawbacks.
I'm not against AI research, but as long as science cannot define what intelligence really is, I doubt AI will make major advances.
That isn't a solution. Any time spent dealing with the car is a failure.
> The fire chief said each robotaxi company offers training to help deal with “bricked” vehicles.
> “We have 160,000 calls a year. We don’t have the time to personally take care of a car that’s in the way when we’re on the way to an emergency,” she said.
The incidents in the article are inexcusable. But to be fair, I manually reviewed a dozen accident reports on the CA state website, and 100% were due either to human error by other vehicles or to issues with the operator of an AV in manual mode:
https://www.dmv.ca.gov/portal/vehicle-industry-services/auto...
Some simple infrastructure could help with this: a no-fly zone for robo-vehicles that travels with fire engines, for instance. Require all robo companies to respect these zones, via radio or something similar.
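The "travelling no-fly zone" could be as simple as a broadcast centre point plus radius that every robotaxi must honour. A minimal sketch, with coordinates and radius invented for illustration:

```python
import math

def in_no_go_zone(car_pos, zone_centre, radius_m):
    """True if the car is inside the broadcast exclusion zone.
    Uses a flat-earth approximation: metres per degree of latitude,
    scaled by cos(latitude) for longitude."""
    lat_m = 111_320.0
    lon_m = 111_320.0 * math.cos(math.radians(zone_centre[0]))
    dy = (car_pos[0] - zone_centre[0]) * lat_m
    dx = (car_pos[1] - zone_centre[1]) * lon_m
    return math.hypot(dx, dy) <= radius_m

# Hypothetical zone broadcast by a fire engine at a scene in SF.
fire_engine = (37.7749, -122.4194)
nearby_car = (37.7750, -122.4194)   # about 11 m north: inside a 200 m zone
distant_car = (37.7849, -122.4194)  # about 1.1 km north: outside it
```

A real deployment would need authenticated broadcasts so only genuine emergency vehicles can declare zones, but the geometry itself is trivial.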
I do think that, until someone invents a sufficiently smart self-driving car, robotaxis should be required to have some sort of override mode, either cell- or satellite-based: something where a remote human operator can take over, using the sensor data to drive the car until it gets back to situations it can actually handle.
I understand that lag can be an issue, but if speeds are limited to 5-10 mph, that should be less of a problem.
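The arithmetic behind the low-speed cap: the distance travelled during the control-loop round trip scales linearly with speed, so capping speed bounds how far a laggy remote operator's car can travel before an input takes effect. A quick sketch, assuming a hypothetical 500 ms round trip:

```python
def reaction_distance_m(speed_mph, round_trip_latency_s):
    """Distance the car covers while a remote command is in flight."""
    speed_mps = speed_mph * 0.44704  # mph to m/s
    return speed_mps * round_trip_latency_s

# With an assumed 500 ms round trip:
# at 10 mph the car drifts ~2.2 m before a command lands,
# at 30 mph it drifts ~6.7 m -- hence the 5-10 mph cap.
low = reaction_distance_m(10, 0.5)
high = reaction_distance_m(30, 0.5)
```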
How do these companies not have a simulator built to train cars on non-standard scenarios? Have dogs, kids, fire trucks, caution tape, etc., and run them with 100k variations.
And then do staged real-world validation.
Actually, why aren't extensive real-world mock scenarios running 24/7? If a car does something bad, build mock scenarios around it.
Please tell me they are doing these things, because it's crazy if they aren't.
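The "100k variations" idea is essentially parameter fuzzing over a base incident: take something a car got wrong (say, parking on a fire hose) and sweep the surrounding conditions. A toy sketch; the hazard and weather lists are invented, not any company's actual scenario schema:

```python
import random

HAZARDS = ["fire_truck", "caution_tape", "charged_hose", "child", "dog"]
WEATHER = ["clear", "rain", "fog", "night"]

def generate_scenarios(n, seed=0):
    """Yield n randomized variations of an emergency-scene scenario
    for a driving simulator. Seeded so runs are reproducible."""
    rng = random.Random(seed)
    for i in range(n):
        yield {
            "id": i,
            "hazard": rng.choice(HAZARDS),
            "weather": rng.choice(WEATHER),
            "hazard_distance_m": round(rng.uniform(2.0, 50.0), 1),
            "ego_speed_mps": round(rng.uniform(0.0, 15.0), 1),
        }

scenarios = list(generate_scenarios(100_000))
```

Each failing real-world incident would seed a new base scenario, and the fuzzer turns one bad event into a whole regression suite.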
Of course they are... I'm sure you've seen the dozens of marketing videos of Tesla detection cameras/sensors stopping short for a kid chasing a ball into the street.
Now, as with anything else, does test/staging translate to production? Not even remotely. Autonomous-vehicle manufacturers claim to be in a monitoring/production phase to compare against their internal testing/staging, and over time we'll normalize autonomous driving and the associated risks.
We used to drive all the nails in the world with a hammer, precariously aimed at our own hands.
That seems like a solution in an emergency, but these kinds of interactions must happen all the time. It is part of a wider problem: baseline data about the world changes and is not communicated in a standardised way. It could be a fire truck, a road closure for a cycling event, a tornado ripping through a town, a terrorist attack, a secret-service motorcade, oil on the road, anything. Dealing with those situations is much easier with that data, and making it explicit could help everyone, not just self-driving cars.
It would be useful if vehicles with sirens/emergency lights/hazard lights could broadcast position and identity, similar to ADS-B. Make the data open and allow third parties to forward, store, and re-broadcast it. Make the consumer of the data (such as a self-driving car) responsible for how it uses it.
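An ADS-B-style beacon could be as small as a broadcast JSON blob. This sketch invents a message schema purely for illustration; a real standard would also need signing so consumers can trust (or discount) each source:

```python
import json
import time

def make_beacon(vehicle_id, kind, lat, lon, status):
    """Serialize an emergency-vehicle beacon, ADS-B style: stable
    identity, position, and an operational status, plus a timestamp
    so receivers can drop stale re-broadcasts."""
    return json.dumps({
        "id": vehicle_id,   # stable identity, like an ADS-B ICAO address
        "kind": kind,       # "fire_engine", "ambulance", ...
        "lat": lat,
        "lon": lon,
        "status": status,   # "responding", "on_scene", ...
        "ts": time.time(),
    })

def should_avoid(beacon_json, max_age_s=30):
    # Per the comment above: the consumer, not the broadcaster,
    # decides what to do with the data.
    b = json.loads(beacon_json)
    fresh = time.time() - b["ts"] < max_age_s
    return fresh and b["status"] in ("responding", "on_scene")

msg = make_beacon("SFFD-E13", "fire_engine", 37.7749, -122.4194, "on_scene")
```

Keeping the avoidance decision on the consumer side matches the open-data framing: third parties can relay beacons freely because relaying carries no liability for how a car reacts.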
Why are the companies operating these vehicles not getting ticketed for all this, then? These various programs should be accruing points on their license just like a real driver. Cruise would probably be off the road if they did this, though.
>Under the agency’s own rules, issues such as traffic flow and interference with emergency workers can’t be used to deny expansion permits. The resolutions list four “goals” to be considered:
- inclusion of people with disabilities
- improved transportation options for the disadvantaged
- reduction of greenhouse gases; and
- passenger safety
Most countries put penalty points on driving licenses. If a driver violates traffic rules too often, they lose their license! Self-driving cars should not be an exception!
Are you suggesting this on a per-car basis, or with all cars working off the same point system? Accruing enough points to unlock the "suspended license" achievement would seem appropriate to apply to the entire fleet, since it's the same "AI".
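The fleet-wide reading could literally be one licence object per operator: since every car runs the same driving stack, a violation by any unit counts against that single licence. A toy sketch with invented point values and threshold:

```python
SUSPENSION_THRESHOLD = 12  # invented, by analogy with human licences

class FleetLicence:
    """One licence per operator: every car in the fleet shares it,
    because every car shares the same 'driver' (the AI stack)."""

    def __init__(self, operator):
        self.operator = operator
        self.points = 0
        self.citations = []

    def cite(self, points, incident):
        """Record a violation; return True if the fleet is now suspended."""
        self.points += points
        self.citations.append(incident)
        return self.points >= SUSPENSION_THRESHOLD

licence = FleetLicence("ExampleRoboCo")
first = licence.cite(4, "blocked firehouse driveway")
second = licence.cite(4, "parked on fire hose")
third = licence.cite(4, "drove through emergency tape")  # 12 points
```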
For example: what about a direct line to a 24/7 operations center at Cruise? Or a proximal-access console that lets responders take control of the vehicle (with the occupants' consent, if occupied)?
This needn't be a total ban on AVs.