top | item 34347778

New video of Tesla crash demonstrates the problem of semi-automated driving

297 points | frxx | 3 years ago | theautopian.com | reply

664 comments

[+] maury91|3 years ago|reply
I'm trying to work out why the Tesla stopped. In the second video on The Intercept's website, I can see a left exit before the point where the Tesla stopped. Something I've noticed with Google Maps (I don't have a Tesla, and maybe Tesla's navigation system is similar) is that it sometimes thinks tunnels can access the road above them (in a tunnel near my home, Google Maps always tells me "turn right" in the middle of the tunnel, without understanding that I need to get out of the tunnel first and take the ramp). Given that, I think the Tesla's navigation system could have been similarly confused and tried to turn left onto a ghost road, only to discover there's no road, and so decided to stop.

This is just one possible explanation, and I'm definitely not suggesting that this is what actually happened.

This is Google Maps saying "go right" in the middle of a tunnel: https://goo.gl/maps/G89cyQT2APUQuu6g6 (the two roundabouts are connected by a tunnel)

PS: If you use Street View you will see how it looked 3 years ago, before the tunnel was built.
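The failure mode described above can be illustrated with a toy map-matcher (entirely hypothetical Python, nothing to do with Tesla's or Google's real code): if candidate road segments are matched purely by 2D distance, a tunnel and the ramp on the surface above it are indistinguishable.

```python
import math

# Toy road segments: (name, lat, lon, elevation_m). A tunnel and the
# exit ramp above it share nearly the same 2D coordinates.
ROADS = [
    ("tunnel", 46.0000, 7.0000, -10.0),
    ("surface_exit_ramp", 46.0001, 7.0001, 5.0),
]

def match_road_2d(lat, lon):
    """Naive map-matching that ignores elevation (the confused behavior)."""
    return min(ROADS, key=lambda r: math.hypot(r[1] - lat, r[2] - lon))[0]

def match_road_3d(lat, lon, elev, elev_weight=0.001):
    """Matching that also penalizes elevation mismatch."""
    return min(
        ROADS,
        key=lambda r: math.hypot(r[1] - lat, r[2] - lon)
        + elev_weight * abs(r[3] - elev),
    )[0]

# A car inside the tunnel (elevation -10 m) that happens to be slightly
# closer in 2D to the ramp's coordinates:
print(match_road_2d(46.0001, 7.0001))         # matches the ramp above
print(match_road_3d(46.0001, 7.0001, -10.0))  # correctly stays in the tunnel
```

The 2D matcher happily routes you onto a road you physically cannot reach, which is exactly the "turn right in the middle of the tunnel" behavior described above.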

[+] somethoughts|3 years ago|reply
I think what's interesting here is it's likely the first instance where Tesla FSD has been involved in an accident which affected other drivers. [Edit] In the video the Tesla is making a lane change and stopping simultaneously, which means this could be a case of the Tesla FSD/driver making an unsafe lane change.[1]

Most of the time FSD just wrecks the Tesla itself or injures its driver (e.g. running into trees/dividers, or into much heavier freight trucks).

It will be interesting to see whether Tesla comes in to provide monetary support for proving the legal case that Tesla FSD is not at fault, or whether the Tesla driver (and his insurance) will be left to fend for themselves.

In the short term I could see Tesla not supporting the driver and absolving itself via fine print/TOS, etc.

But the long-term effect of not legally supporting drivers in Tesla FSD accidents will be that new customers won't trust this $10,000 upsell that's highly profitable for Tesla.

I could also see 3rd party (non-Tesla) insurance companies refusing to sell coverage to Tesla FSD drivers.

It could also make Tesla's 1st-party insurance untrustworthy to customers, and could become a huge liability for Tesla.

It seems like it will be a great litmus test to see if Tesla has the guts to step up for its own product.

[1] The first video shows a potentially unsafe lane change: https://theintercept.com/2023/01/10/tesla-crash-footage-auto...

[+] kart23|3 years ago|reply
> I think what's interesting here is it's likely the first instance where Tesla FSD has been involved in an accident which affected other drivers.

Autopilot has killed multiple motorcyclists and is suspected in many other cases, totaling 19 fatalities. This isn't the first; our regulatory bodies are just incredibly slow to act on this.

https://arstechnica.com/cars/2022/08/tesla-faces-new-probes-...

https://www.theverge.com/2022/7/27/23280461/tesla-autopilot-...

[+] toast0|3 years ago|reply
> It will be interesting to see whether Tesla comes in to provide monetary support for proving the legal case that Tesla FSD is not at fault, or whether the Tesla driver (and his insurance) will be left to fend for themselves.

> In the short term I could see Tesla not supporting the driver and absolving itself via fine print/TOS, etc.

> But the long-term effect of not legally supporting drivers in Tesla FSD accidents will be that new customers won't trust this $10,000 upsell that's highly profitable for Tesla.

Tesla publicly disparages people who died relying on their products, and refuses to cooperate with the NTSB. I'd expect nothing less in this case. Somehow that hasn't been a big factor in sales.

[+] justapassenger|3 years ago|reply
It’s very unlikely this is the first time FSD has caused an accident for others. People using FSD let it do reckless things just to see “if it’ll figure it out.” You can cause an accident without being involved in it, especially if you drive in an unpredictable way.

Like this one: if the driver had overridden the sudden braking and moved forward, the cars behind him could still have crashed.

That’s the thing about testing on public roads: there are many ways you can affect other road users.

[+] Alex3917|3 years ago|reply
> I think what's interesting here is it's likely the first instance where Tesla FSD has been involved in an accident which affected other drivers.

In a pileup like this it's basically never the fault of the front car, unless maybe they are purposely causing the accident for insurance fraud or something. Maybe the driver will get cited for failing to maintain the minimum speed, but legally this isn't much different than if someone backed into the Tesla while it was parked in a parking garage.

[+] janalsncm|3 years ago|reply
That would be very interesting. Stand behind your product. If FSD becomes a public nuisance, these cars will quickly become uninsurable, or worse.

I was thinking today about the Southwest disaster, not only for customers but for the company’s reputation. I know a great way to win it back: cash. Promise it won’t happen again, but if it does, offer best-in-industry cash compensation. Prove that your company gives a shit. I will be very disappointed if they expect time alone to heal this.

[+] ClumsyPilot|3 years ago|reply
> In the short term I could see Tesla not supporting the driver and absolving itself via fine print/TOS, etc.

Imagine there is an 'autopilot' gun; you buy it, and it comes with a contract that says you take full responsibility for the gun.

Then it shoots me and kills me before you have a chance to react.

The prosecutor will go after the manufacturer. If the manufacturer wrote code that kills me, then you, and any contract you signed, are not even relevant.

You cannot contract away criminal responsibility. Otherwise I could contract away all my responsibilities to a random homeless guy.

[+] enslavedrobot|3 years ago|reply
FSDbeta is not enabled on highways. It's not clear to me that it's even possible to be on FSD in that tunnel.

Interestingly, the article is careful to say that the driver "claims" it was on FSDbeta.

More to this story.

[+] modeless|3 years ago|reply
FSD does not activate on freeways. This is not FSD. It's the same Autopilot that has been in use for many years.
[+] Animats|3 years ago|reply
Is there enough info yet to know if the lane change was initiated automatically? That's apparently possible. Tesla support site:

Auto Lane Change

To initiate an automated lane change, you must first enable Auto Lane Changes through the Autopilot Controls menu within the Settings tab. Then when the car is in Autosteer, a driver must engage the turn signal in the direction that they would like to move. In some markets depending on local regulations, lane change confirmation can be turned off by accessing Controls > Autopilot > Customize Navigate on Autopilot and toggle ‘Lane Change Confirmation’ off.

[+] gundmc|3 years ago|reply
I believe the car was in "Full Self Driving" and not "Autopilot"?
[+] sebastianconcpt|3 years ago|reply
This blind trust in technology is very, very wrong. We are part of the system, and things should always be designed to keep us as a powerful fallback when the system degrades.
[+] herodotus|3 years ago|reply
I have a VW Golf and it has what it calls "adaptive cruise control". What makes it different from my older (simple) cruise control is that it will slow down and speed up again as necessary without my intervention. For example, if I set the speed to 70 mph and it gets close to a car ahead that is only going 65 mph, it will slow down and maintain what the software believes is a safe distance. Similarly, if a car in an adjacent lane changes lanes in front of me, it will slow down if necessary.
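The behavior described above can be sketched as a simple control rule (a toy model under stated assumptions, not VW's actual implementation): hold the set speed unless the gap to the lead car falls below a desired time gap.

```python
def acc_target_speed(set_speed, lead_speed, gap_m, time_gap_s=2.0):
    """Toy adaptive-cruise rule. Speeds in m/s, gap in meters.

    If there is no lead car, cruise at the set speed. If the gap is
    shorter than the desired time gap, slow to match the lead car.
    """
    if lead_speed is None:  # no car ahead detected
        return set_speed
    desired_gap = lead_speed * time_gap_s  # meters of gap at the lead's speed
    if gap_m < desired_gap:
        # too close: follow the (slower) lead car instead of the set speed
        return min(set_speed, lead_speed)
    return set_speed

# Set to ~70 mph (31.3 m/s) behind a car doing ~65 mph (29 m/s), 40 m ahead:
print(acc_target_speed(31.3, 29.0, 40.0))   # slows to match the lead car
print(acc_target_speed(31.3, None, 0.0))    # open road: holds set speed
```

Part of why this feels unintuitive, per the comment above, is that the override point depends on `desired_gap`, a quantity the driver never sees, unlike plain cruise control where the rule is simply "the car holds one speed."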

I do use it, but less than I did the old system: I just do not find it relaxing because I cannot really grasp intuitively when I need to override it. On standard cruise control, it was obvious to me when I needed to take over. Therefore I am more rather than less vigilant than I was with the old system.

I don't want to be too hard on the Golf: it has other safety features I really like, such as lane assist and automatic braking. But I am not a fan of the adaptive control, and I think the article helped me understand why: it's a Level 2 problem!

[+] nailer|3 years ago|reply
FTA: > the system is called Full Self-Driving

Also FTA quoting Tesla (https://images-stag.jazelc.com/uploads/theautopian-m2en/repo...):

> It does not turn a Tesla into a self-driving car

Is it self driving or not?

[+] capableweb|3 years ago|reply
Systems where we try to "outsource" the piloting of a vehicle seem to have a big problem with naming. "Autopilot" in planes does not mean that the plane pilots itself automatically; it just means you can pilot "less" than with 100% manual control.

Tesla seems to be copying this: "Self-Driving" doesn't actually mean the car will drive by itself, but that the driver can drive "less" than before.

Deceptive marketing at best, fatal at worst.

[+] mschuster91|3 years ago|reply
There is one additional factor in play: the drivers following the Tesla clearly didn't keep their distance. Normally, at least here in Germany, drivers are supposed to keep enough distance from the car in front that even if something like a brake defect forces it to a complete stop, or a truck blows a tire, they do not crash into it.
[+] witheld|3 years ago|reply
They weren’t following the Tesla; it moved into their lane.
[+] trabant00|3 years ago|reply
> if the Level 2 paradigm was flipped, where the driver was always in control, but the semi-automated driving system was doing the monitoring, and was ready to take over [...] but would be less sexy in that the act of driving wouldn’t feel any different

I think this is the most important point of the article, and it's largely ignored here in the comments, which seem to focus mostly on who was to blame for this specific accident.

We know the strengths and weaknesses of both humans and tech at this point in time. Humans are overall better decision makers but aren't 100% focused 100% of the time. Tech gets confused a lot but is never tired or inattentive. So if your goal is safety, you let the human drive and have the tech take over in emergency situations when the human is not reacting, which is what most car manufacturers do right now. Letting the tech drive and expecting the human to provide a perfect reaction time every time the tech fails plays to the weaknesses of both. That is focusing on cool marketing at the expense of safety.

[+] yoden|3 years ago|reply
> doing the monitoring, and was ready to take over

This isn't even uncommon. Almost every Honda sold for a while has been an L2 system that will take over in certain ways if the car believes a crash is imminent, such as a car suddenly braking.

> but would be less sexy

It's so less sexy people don't know that millions of vehicles are sold this way...

[+] 93po|3 years ago|reply
If there are fewer accidents per X miles with FSD, then I don't see how you can claim anything is at the expense of safety.
[+] zestyping|3 years ago|reply
Tesla is primarily at fault for deceptively naming the function "Full Self Driving". It is indefensible mendacity.

I do not understand why the company has not already been sued into oblivion for an obvious lie that has killed people.

[+] roguecoder|3 years ago|reply
Video understanding engineer here.

Tunnels and underpasses are the worst. They are a pain in the ass, because shadows mess with all the edge detection and motion models and anything else visual. Humans compensate by thinking "I'm in a tunnel: things are weird." But without a reasoning model that can take context into account, the computer is stuck.

In the video from behind, you can see the shadow on the floor of the tunnel ahead of the car, which it carefully stops just before it would have "hit." A person would notice that EVERY OTHER car had driven straight through the thing it thought was an obstacle, but that is also context this car isn't going to take into account.
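The shadow problem can be shown with a one-dimensional toy example (a sketch only, not any vendor's real perception stack): a hard shadow boundary produces an intensity step just like the edge of a dark physical object, so a naive gradient threshold cannot tell them apart without context.

```python
# 1-D brightness profile along the road surface (0 = dark, 255 = bright).
sunlit = [200] * 5
shadow = [40] * 5    # hard shadow cast at the tunnel mouth
obstacle = [40] * 5  # a dark object on the road produces similar values

def edge_positions(row, threshold=50):
    """Flag positions where brightness jumps by more than `threshold`,
    the way a naive gradient-based edge detector would."""
    return [i for i in range(1, len(row))
            if abs(row[i] - row[i - 1]) > threshold]

# Both profiles trigger the detector at exactly the same position.
print(edge_positions(sunlit + shadow))    # -> [5]
print(edge_positions(sunlit + obstacle))  # -> [5]
```

The pixel data is identical in both cases; only context (sun angle, "I'm entering a tunnel," other cars passing through unharmed) disambiguates them, which is the point made above.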

[+] sokoloff|3 years ago|reply
Interesting observation.

I worked on autonomous vehicles (in vision) at Daimler in 1991. During one of our test sessions, on drying pavement, the vehicle abruptly slammed on the brakes and refused to proceed past a point where the vision system could see a set of horizontal edges on the pavement, symmetric about a centerline. It tightly fit our (hand-coded, it being 1991) model of a car ahead. We had to revert to manual control, drive back to our staging area, and wait for the track (a set of runways and taxiways at a disused airbase) to finish drying.

Obviously, the state of the art has improved significantly since then, but some fundamental risk of misinterpretation could easily remain.

[+] dkjaudyeqooe|3 years ago|reply
Is this sort of problem an argument for lidar or similar? I'm assuming you're describing a camera only system.
[+] somedude895|3 years ago|reply
There are so many tunnels in Switzerland that using AP is a huge pain. But at least it keeps me attentive and ready to take over at all times.
[+] jayd16|3 years ago|reply
This is an honest question: why was the pileup so bad if driving conditions were ideal? Were the next six drivers not keeping a safe distance or paying attention?
[+] MBCook|3 years ago|reply
Have you watched the video? It's freeway speeds, a decently busy freeway, at a tunnel entrance that's curving. Plus the car switched to the passing lane and abruptly stopped.

This wasn’t going straight on a highway stopping in the same lane the driver was already in.

I can understand how the conditions probably made it worse.

[+] r0m4n0|3 years ago|reply
I think it’s mostly caused by the folks making abrupt lane changes as the Tesla slows down. You can see people start to swerve into other lanes instead of just slowing down (they are looking in their side mirrors to avoid the slow car and sometimes even accelerate). I’ve had this happen: you’re cruising at speed, the person in front of you swerves to avoid a slow car, and now you lose all of your time to react by changing lanes as well.

I’d even wager that if that pickup hadn’t swerved and everyone had just slowed down, it wouldn’t have piled up.

Side tangent, I love watching car crash videos. Really interesting to see how the system breaks down and people make mistakes. I spend hours on YouTube sometimes :)

[+] amf12|3 years ago|reply
Honestly, when you're driving, you sometimes only see the car in front of you. You might see it slowing down first, and you start slowing down too, or continue because there is enough distance anyway. Then the car in front of you stops abruptly or crashes. Now you panic and brake hard. Maybe you still crash into the car in front. Even if you stop in time, the hard braking causes the car behind you to crash into you.

TL;DR: Sometimes the 3-second following distance just isn't enough, even if someone is paying attention, because they can only see the car in front of them.
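The arithmetic supports this. A rough stopping-distance sketch (assuming textbook values: ~1.5 s reaction time, ~7 m/s² hard braking) shows that a 3-second gap can be shorter than the full distance needed to stop when the car ahead halts abruptly rather than braking normally:

```python
def stopping_distance(speed_ms, reaction_s=1.5, decel=7.0):
    """Distance covered during reaction time plus braking to a stop,
    using the standard v^2 / (2a) kinematics for the braking phase."""
    return speed_ms * reaction_s + speed_ms ** 2 / (2 * decel)

v = 29.0  # ~65 mph, in m/s
following_gap = v * 3.0               # the "3-second rule" gap
print(round(following_gap, 1))        # -> 87.0 (meters)
print(round(stopping_distance(v), 1)) # -> 103.6 (meters)
```

The 3-second rule normally works because the car ahead also needs braking distance; when it stops near-instantly (a crash or a sudden phantom brake), the full ~104 m stopping distance exceeds the ~87 m gap.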

[+] beeforpork|3 years ago|reply
A good analysis. The vigilance problem it describes is actually well known, but it is ignored, supposedly in the hope that humankind is on a path to fully autonomous self-driving and that we need this phase of experimentation to advance the technology.

Completely autonomous self-driving cars (without any steering wheel, so that even incapacitated or clueless people, say drunk, in labor, or children, could 'drive') do indeed seem like a good solution. (Except that we need less individual traffic, for environmental reasons.) Unfortunately, the problem is technologically very hard, and the current interim solutions will stay for a while.

[+] threatofrain|3 years ago|reply
Vigilance would be hard in this scenario because the Tesla changed lanes and then suddenly braked. My vigilance would have been focused on whether I was about to hit something or make an inappropriate merge; I wouldn't have expected a sudden brake.
[+] aseerdbnarng|3 years ago|reply
I’m always curious how different insurers cover the use of hands-free driving. Would anyone still buy the FSD feature if they thought insurers would reject accident damage claims? It feels like it’s sitting on sketchy ground.
[+] osigurdson|3 years ago|reply
I think the insistence on having data for every possible situation for training purposes is indicative of the problem. Humans require only a small amount of training and can extrapolate it to many situations.
[+] ajross|3 years ago|reply
This is really very wrong. Human beings cause exactly this kind of accident every day. There is absolutely nothing remotely "non-human" about cutting someone off and stopping in traffic.[1] Just go check /r/idiotsincars for an hour to see much, much worse.

[1] In fact, this is so unlike FSD's behavior that I still think it's more likely that it will turn out not to have been in use at all. The only evidence at hand is one sentence in a police report that the police themselves state was unvalidated. How easy would it be to blame the car as an excuse?

[+] twizod|3 years ago|reply
Can someone please explain to me why the system cannot tell you why it stopped? What prevents the program from explaining why it performs certain actions?
[+] redox99|3 years ago|reply
I'm sure Tesla has logs. Also, in the visualization, when the car is slowing for something it shows that object in blue or red.
[+] aliswe|3 years ago|reply
Very common in machine learning.
[+] MadQ|3 years ago|reply
Obviously, the Tesla FSD/Autopilot made a big mistake, but why didn't the driver react? If my car slowed from 40 to 0 mph in 5 seconds and there were no obstacle in front of me, I most assuredly would step on the accelerator.

FSD/Autopilot is not what it's marketed as, but this is the responsibility of the driver, not the car.
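For scale, the braking described above (taking the commenter's numbers at face value: 40 mph to 0 in 5 seconds) works out to a moderate average deceleration, well short of a panic stop, which supports the argument that an attentive driver had time to intervene:

```python
MPH_TO_MS = 0.44704  # exact conversion factor, mph to m/s

def decel_g(v0_mph, t_s):
    """Average deceleration, in g, for a stop from v0_mph in t_s seconds."""
    return (v0_mph * MPH_TO_MS / t_s) / 9.81

# 40 mph -> 0 in 5 s: about 0.36 g, firm braking but far from the
# ~0.8-1.0 g a hard emergency stop produces.
print(round(decel_g(40, 5), 2))  # -> 0.36
```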

[+] nextstep|3 years ago|reply
This story also highlights just how little Elon Musk actually supports free speech.

Since fighting for the release of this video and publishing the story, Ken Klippenstein has been censored on Twitter through shadow bans and by his profile being hidden from search.

https://twitter.com/stevanzetti/status/1613295292283236358

[+] tinus_hn|3 years ago|reply
In other news, no car has ever seized up and stopped at an inconvenient moment before Tesla made this horrible mistake, and keeping a safe distance would not be necessary if it weren’t for those dang automated drivers. Now get off my lawn!
[+] cm2187|3 years ago|reply
From what I can tell, the car behind the Tesla didn't crash into it but stopped in time. And so did the car behind that one. It is the cars after that that crashed into each other.

While automated cars doing random things is certainly problematic, clearly the cause of the crash here isn't the Tesla; it is the other cars not respecting minimum safety distances and not being able to stop when there is a traffic jam ahead.