top | item 28633640

New video shows Tesla on autopilot almost hitting state trooper

40 points | camjohnson26 | 4 years ago | clickorlando.com

38 comments

[+] g_p|4 years ago|reply
I wonder if Tesla FSD is properly considering a windowed sample of images at a high enough sample rate to see modern high intensity strobe lights used on emergency vehicles.

Clearly the dash cam used is sampling at a high enough rate to see them, but perhaps the Tesla isn't considering a view of the road ahead across a wide enough period of time to see the hazard lights and understand that the vehicle is potentially stationary and alerting drivers to a hazard.

A very naive FSD algorithm working only from the current frame (without looking at past frames) could certainly mess this up, but at what point do we accept this simply isn't fit for purpose - it can't avoid hitting a stationary vehicle, so what hope does it have with slowly moving and other edge cases?
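To make the "windowed sample" idea concrete, here's a toy sketch of the kind of temporal check I mean: track the brightness of a candidate light region across a sliding window of frames and flag it as flashing if it toggles on and off enough times. All thresholds and frame counts here are made up for illustration; I have no idea how Tesla's stack actually does this.

```python
from collections import deque

class StrobeDetector:
    """Toy sketch: flag a flashing light by counting off->on
    transitions across a sliding window of frames. All thresholds
    are illustrative, not anything Tesla actually uses."""

    def __init__(self, window_frames=30, on_threshold=0.8, min_flashes=3):
        self.history = deque(maxlen=window_frames)  # recent on/off states
        self.on_threshold = on_threshold            # normalized brightness cutoff
        self.min_flashes = min_flashes              # transitions needed to call it a strobe

    def update(self, region_brightness):
        """Feed the normalized brightness (0..1) of a candidate light region each frame."""
        self.history.append(region_brightness >= self.on_threshold)

    def is_flashing(self):
        # Count off->on transitions in the window; a steady light has none,
        # which is exactly why a single-frame view can't tell them apart.
        states = list(self.history)
        transitions = sum(
            1 for prev, cur in zip(states, states[1:]) if not prev and cur
        )
        return transitions >= self.min_flashes
```

The point being: a detector working only from the current frame sees "bright light" or "no light" and can't distinguish a strobe from a reflection, which is why the window matters.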

[+] dont__panic|4 years ago|reply
I'm curious how Tesla FSD handles obstacles that aren't cars or people on roads. Where I drive, I often have to slam on my brakes to avoid hitting deer, turkeys, and small rodents, or running over squashed skunks. I assume Tesla FSD would notice some of these obstacles, but it wouldn't be able to notice a deer on the edge of the road and identify it as slowdown-worthy... would it?

Curious if any rural Tesla owners can chime in on this -- it's not exactly a documented feature.

[+] Narkov|4 years ago|reply
Forget flashing lights. Why would the car not stop or even slow down when a large immovable object is within impact distance? Auto-brake has been around for a decade now.
[+] cmsj|4 years ago|reply
You make a very good point. I've seen videos from Tesla's AI Day presentations where they show debug bounding boxes around identified objects, and the boxes always seem to disappear when objects are occluded. Perhaps that's just a visualisation choice, but I'm taking the precautionary approach of assuming it means the car has no object permanence because it isn't considering a wide enough time window.
[+] rpmisms|4 years ago|reply
They just released an Emergency Vehicle Detection feature on basic autopilot, and so far it works.

This is a big part of the rewrite of Autopilot: the ML they use can now persist data through time, so it can reasonably predict whether occluded objects will remain in place.
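For anyone wondering what "persist data through time" buys you, here's a toy sketch of the simplest possible version of object permanence: keep an object's last known position alive for a few frames after the detector loses it (e.g. while it's occluded), instead of forgetting it instantly. The IDs, structure, and frame counts are all made up; real trackers are far more sophisticated.

```python
class TrackMemory:
    """Toy sketch of object permanence: remember objects for a few
    frames after they stop being detected. Everything here is
    illustrative, not Tesla's actual approach."""

    def __init__(self, max_missed_frames=15):
        self.max_missed = max_missed_frames
        self.tracks = {}  # object_id -> {"pos": (x, y), "missed": int}

    def update(self, detections):
        """detections: {object_id: (x, y)} seen in the current frame."""
        for obj_id, pos in detections.items():
            self.tracks[obj_id] = {"pos": pos, "missed": 0}
        # Age tracks that weren't seen this frame; drop only stale ones.
        for obj_id in list(self.tracks):
            if obj_id not in detections:
                self.tracks[obj_id]["missed"] += 1
                if self.tracks[obj_id]["missed"] > self.max_missed:
                    del self.tracks[obj_id]

    def active(self):
        """Objects the planner should still treat as present."""
        return {i: t["pos"] for i, t in self.tracks.items()}
```

With something like this, a briefly occluded police car stays in the world model instead of vanishing the moment the bounding box does.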

[+] t0mas88|4 years ago|reply
Slowly moving is much easier for a lot of systems than stationary. An easy way to distinguish a car from guard rails or other fixed objects is to look for movement; most radar systems do this.
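The logic above is simple enough to sketch. A radar measures speed relative to the ego vehicle, so adding the ego's own speed gives the target's over-ground speed; anything near zero gets lumped in with guard rails and bridges. The tolerance value here is made up, but it shows why a stopped car is the hard case: it's indistinguishable from clutter by this test alone.

```python
def classify_target(ego_speed_mps, relative_speed_mps, stationary_tolerance=0.5):
    """Toy sketch of motion-based target filtering. A radar reports
    speed relative to the ego vehicle (negative = closing), so
    over-ground speed is ego speed plus relative speed. Tolerance
    is illustrative."""
    absolute_speed = ego_speed_mps + relative_speed_mps
    if abs(absolute_speed) <= stationary_tolerance:
        # Same bucket as guard rails, signs, overpasses... and stopped cars.
        return "stationary"
    return "moving"
```

A stopped police car at ego speed 30 m/s shows up as relative speed -30 m/s, i.e. over-ground speed zero, which is exactly why many radar-based systems filter it out as roadside clutter.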
[+] Doxin|4 years ago|reply
I mean, sure, Autopilot bad. No argument from me there. But how in the world are you in the driver's seat of a car and fail to see a car stopped in the middle of the road with flashing lights?

The whole "you need to pay full attention when using autopilot" argument is nonsense, but surely people aren't turning on self-driving and then mentally checking out entirely? Do people really trust this tech enough to not even keep an eye on the road?

[+] themodelplumber|4 years ago|reply
I didn't realize from the link title that it also collided with the stopped vehicle. Wow.

It made me wonder, are the driving models trained at all to recognize accidents or other road anomalies? A plane making an emergency landing in front of them? Just curious.

Edit: Would also be interested to know if there's a way police could work with Tesla or other autopilot carmakers to broadcast special codes like don't-hit-us-here-is-our location, or just send a signal blocking autopilot, or whatever. They already send texts and radio alerts, and there are already local AM alert stations in the US anyway.

If my kids were going to be driving, I'd rather buy from that kind of manufacturer.

[+] user_named|4 years ago|reply
I think that by even asking whether the system has been trained to recognize accidents, you're giving it too much credit.

It simply does not recognize objects that are not moving. This is not a smart system but something very dumb. As for accidents, it will never be able to understand visually what constitutes an accident site.

[+] g_p|4 years ago|reply
> Would also be interested to know if there's a way police could work with Tesla or other autopilot carmakers to broadcast special codes like don't-hit-us-here-is-our location, or just send a signal blocking autopilot, or whatever. They already send texts and radio alerts, and there are already local AM alert stations in the US anyway.

I don't think this would be a good solution, since these incidents inherently happen when there's an "exception". If every stationary emergency vehicle had to send a notification each time its lights were activated, this might work, until you're in an area without coverage.

To me, we need to raise the bar for what's acceptable - this vehicle would presumably also have ploughed into an injured motorist's car before the police had arrived... The root problem isn't the lack of notice of the stopped emergency vehicle; rather, it's the apparent inability to avoid a stationary vehicle on the roadway. That should be the very first (and simplest) thing to detect, since the self-driving vehicle is closing in on it and can see that its relative position isn't changing. If it can't do that, IMO it shouldn't be on a public road, as it isn't capable of basic speed management in traffic without being able to gauge closing speed against the vehicle ahead.
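Gauging closing speed really is that simple in principle: two successive range measurements to the vehicle ahead give a closing speed, and range divided by closing speed gives a rough time-to-collision. The sketch below is the textbook constant-velocity version, not anything from Tesla's stack:

```python
def time_to_collision(range_t0_m, range_t1_m, dt_s):
    """Toy sketch of the closing-speed check described above: two
    successive range measurements yield a closing speed, and the
    remaining range over that speed gives a rough time-to-collision.
    Returns None if the gap isn't shrinking."""
    closing_speed = (range_t0_m - range_t1_m) / dt_s  # m/s, > 0 means closing
    if closing_speed <= 0:
        return None  # holding distance or pulling away
    return range_t1_m / closing_speed
```

A stationary vehicle ahead is the easiest possible case for this check: the closing speed equals your own speed, so the TTC countdown is completely unambiguous.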

[+] iamevn|4 years ago|reply
> Edit: Would also be interested to know if there's a way police could work with Tesla or other autopilot carmakers to broadcast special codes like don't-hit-us-here-is-our location, or just send a signal blocking autopilot, or whatever. They already send texts and radio alerts, and there are already local AM alert stations in the US anyway.

I think that's solving the wrong problem. As the rest of your comment implies, it's just as important not to hit other stationary objects like, say, another Tesla that has decided to stop in the middle of the road[1].

[1]: https://twitter.com/repkord/status/1440908464625106946

[+] melling|4 years ago|reply
How about generic anti-collision with large solid objects? Radar? LiDAR? Nice safety feature.
[+] obilgic|4 years ago|reply
Just got my Plaid 2 days ago. I already feel uncomfortable about how it changes lanes and merges into lanes. Def making other drivers notice something is up with the way it steers, especially in traffic. Not even talking about city streets.

Also noticing that other drivers instantly leave a lot more space and let me pass as soon as the turn signal is on; not sure if they're worried about it being on Autopilot.

[+] thebruce87m|4 years ago|reply
100% Tesla's fault here.

BUT:

Emergency vehicles should be hooked into a crowd-sourced traffic system by now. The first thing responders should do is hit a big button on their dashboard that says “Road Closed”, “Lane Blocked”, etc., which would then update Google Maps and the like, and also disable things like Autopilot for vehicles approaching the incident.

This goes for things like roadworks too.

A bit like Waze but the information comes from “official” sources and is required by law to be logged.
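To be clear about what I'm imagining, the broadcast itself could be tiny. Here's a sketch of what such a message might look like; the schema is entirely made up, since no such protocol exists today:

```python
import json
import time

def incident_message(lat, lon, incident_type, radius_m=500):
    """Hypothetical 'Road Closed' broadcast for the scheme proposed
    above. Every field name here is invented for illustration."""
    return json.dumps({
        "source": "official",          # distinguishes this from crowd-sourced reports
        "type": incident_type,         # "road_closed", "lane_blocked", "roadworks", ...
        "lat": lat,
        "lon": lon,
        "radius_m": radius_m,          # zone where driver-assist should disengage
        "timestamp": int(time.time()), # for the legally required log
    })
```

A receiving car would only need to check whether its route intersects the zone and hand control back to the driver well before reaching it.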

[+] thomaskcr|4 years ago|reply
I feel like 90% of modern cars would have stopped for the object blocking the lane. Even if the police weren't there, it would have just smashed into the car stopped on the highway. The fact that it didn't recognize the police car isn't the issue; it's that Teslas are basically blind to anything not moving.
[+] dragontamer|4 years ago|reply
We can't even get police departments to agree to keep their body cameras on. Any requirement you'd impose on officers like that would get wrecked by the police unions / F.O.P.

Our police aren't organized at a national level in the USA: it's a collection of 10,000+ departments, some of which are incredibly small (say, one sheriff elected by the locals and five deputies).

------

To practically start something like that, you find police departments that are likely to agree with each other (ex: NYPD + Boston PD), and then maybe get an agreement between their Commissioners. You grow the alliance one step at a time. You're gonna have to figure out "why would the police want a system like this in their procedures?", such as arguing for a mitigation of accidents or whatever. Or maybe giving the police some kind of power (ex: official announcement to close a road and redirect traffic)

[+] cmsj|4 years ago|reply
That's not a bad-sounding idea, but it makes a lot of assumptions about the infrastructure available in the region where the self-driving car is deployed.

If the goal is really to make cars that can drive as well as humans, they need to be able to do that based only on on-board sensors.

[+] rpmisms|4 years ago|reply
> 100% Teslas fault here.

100% the driver's fault.

[+] rvz|4 years ago|reply
This needs to be investigated, given that a lot of these drivers are being too complacent with this beta Autopilot software.

It has proven to be very unsafe to use on the roads and is putting the lives of other drivers at risk.

[+] rpmisms|4 years ago|reply
Teslas crash less often than other cars, and even less often when Autopilot is enabled. You just hear about it more because it's actually news when it happens, as opposed to some idiot texting while driving a Camry.
[+] wil421|4 years ago|reply
How hard would it be, given that Tesla uses cameras, to just disable Autopilot when it senses flashing lights? Or at least give the driver a few seconds to react before disabling it.
[+] toomuchtodo|4 years ago|reply
Released two days ago:

New in #Tesla 2021.24.12 Owners Manual for #Model3 #ModelY "If Model 3/Model Y detects lights from an emergency vehicle when using Autosteer at night on a high speed road, the driving speed is automatically reduced and the touchscreen displays a message informing you of the slowdown. You will also hear a chime and see a reminder to keep your hands on the steering wheel. When the light detections pass by or cease to appear, Autopilot resumes your cruising speed. Alternatively, you may tap the accelerator to resume your cruising speed. Never depend on Autopilot features to determine the presence of emergency vehicles. Model 3/Model Y may not detect lights from emergency vehicles in all situations. Keep your eyes on your driving path and always be prepared to take immediate action."

https://threadreaderapp.com/thread/1440537502272536581.html

Vehicle firmware revision uptake stats for vehicles using Teslascope [1] and TeslaFi [2].

[1] https://teslascope.com/teslapedia/software

[2] https://www.teslafi.com/firmware.php

[+] BugsJustFindMe|4 years ago|reply
Smashing into obstacles is bad whether they have flashing lights or not. Sensing flashing lights is not the solution to smashing into obstacles.