
Two people killed in fiery Tesla crash with no one driving

384 points | bdcravens | 4 years ago | click2houston.com

736 comments

[+] vladoh|4 years ago|reply
Some people are quoting the recent Tesla safety report [1] as evidence that Autopilot is on average much safer than a human driver. This is a classic case of Simpson's paradox [2].

At first glance it seems that Autopilot is about 4x safer than driving without any safety features (1 accident every 4.19 million miles vs. every 0.978 million miles). However, the data used to compute these stats differs in two important ways:

1. Autopilot cannot always be activated. This means that in some particularly difficult situations, the driver has to drive himself. These are, in general, the more dangerous situations.

2. If a driver disengages Autopilot to avoid an accident and re-engages it straight away on a 10-mile drive, then you will have 9.99 miles driven on Autopilot without an accident. The statistic misses the cases where the human driver intervened to avoid an accident.

This means that we are comparing the same measure (accidents) on different datasets, and therefore under different conditions. This is dangerous, because it may lead us to wrong and often opposite conclusions (see Simpson's paradox [2]).

I'm not saying that Autopilot isn't safer than a human driver, provided the driver is at the steering wheel and alert, but this data doesn't lead to that conclusion. If the driver is not sitting in the driver's seat at all, then it is certainly far more dangerous.
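To make the mixing effect concrete, here is a minimal sketch in Python with invented numbers (none of these figures come from Tesla's report): a system can look safer overall purely because its miles are concentrated on easy roads, even if the human is safer on every road type.

    # All numbers are made up for illustration, not taken from the report.
    # Per-stratum crash rates, in crashes per million miles:
    rates = {
        "autopilot": {"highway": 1 / 4.0, "city": 1 / 0.5},
        "human":     {"highway": 1 / 5.0, "city": 1 / 1.0},
    }
    # Mileage mix (million miles): Autopilot is mostly engaged on easy
    # highway miles, while the human drives a 50/50 mix.
    miles = {
        "autopilot": {"highway": 95.0, "city": 5.0},
        "human":     {"highway": 50.0, "city": 50.0},
    }

    for driver in rates:
        crashes = sum(rates[driver][road] * miles[driver][road]
                      for road in ("highway", "city"))
        total = sum(miles[driver].values())
        print(f"{driver}: 1 crash per {total / crashes:.2f} million miles")
    # autopilot: 1 crash per 2.96 million miles
    # human:     1 crash per 1.67 million miles
    # Autopilot looks ~1.8x safer overall even though the human has the
    # lower crash rate in BOTH strata; the aggregate is skewed by where
    # the miles are driven.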

[1] https://www.tesla.com/VehicleSafetyReport
[2] https://en.wikipedia.org/wiki/Simpson%27s_paradox

[+] riffic|4 years ago|reply
Just for the record, people who study the problem space of traffic safety have disavowed the word "accident", because it all too often dismisses the preventable root causes that could otherwise be learned from.

context:

* https://laist.com/2020/01/03/car_crash_accident_traffic_viol...

* https://usa.streetsblog.org/2016/04/04/associated-press-caut...

* https://chi.streetsblog.org/2021/04/05/laspatas-ordinance-wo...

It'd be nice if folks here would be mindful of the role language plays. Here's also a preemptive "intention doesn't matter": the first post I shared covers that in the section "The Semantics of Intention", which argues that the decisions have already been made, both in the design of our streets and in the choices people make behind the wheel, and that those decisions have known and changeable outcomes.

Last edit, I swear, but a good catchphrase I've seen recently that I'll be pinching is: "Accidents happen, crashes don't have to."

[+] jacquesm|4 years ago|reply
The marketing and messaging around Autopilot simultaneously argue that Autopilot is safer than a human driver and blame the driver when there is an accident.
[+] reissbaker|4 years ago|reply
I have read the criticism that the Autopilot miles aren't an apples-to-apples comparison with national averages many times. However, it cherry-picks a single number from the safety report and ignores the other reported statistics. If the explanation for Autopilot miles being so much safer than non-Autopilot miles is that people turn it off in dangerous situations, then Autopilot users would be having equal or greater numbers of crashes overall compared to the national average, just while Autopilot was off, and the crash rate without Autopilot engaged would have to be higher than the national average. Otherwise, where would the crashes go?

However, it isn't higher. The crash rate with Autopilot off (but with other safety features on) is about 4x better than the national average, and with all safety features turned off it's still 2x better.

I don't think you can explain away Autopilot's high safety record by claiming the crashes are concentrated in the non-Autopilot miles, because they aren't: while Autopilot miles are safer than non-Autopilot miles, the non-Autopilot miles are still no more dangerous than the national average (in fact, they're less dangerous).
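A quick sanity check of the "where would the crashes go" point, sketched in Python. The fraction of miles driven on Autopilot isn't published, so it's a free parameter here, and the national rate is an assumption in the same ballpark the report compares against:

    # If Tesla drivers as a whole crashed at exactly the national rate and
    # Autopilot merely displaced those crashes into non-Autopilot miles,
    # the non-Autopilot rate would be forced above the national rate.
    r_autopilot = 1 / 4.19  # crashes per million miles, from the report
    r_national = 1 / 0.5    # assumed national average (illustrative)

    for f in (0.2, 0.5, 0.8):  # assumed fraction of miles on Autopilot
        # total crashes r_national*N, of which f*N*r_autopilot are on AP
        r_off = (r_national - f * r_autopilot) / (1 - f)
        print(f"AP share {f:.0%}: non-AP rate = "
              f"{r_off / r_national:.2f}x national average")
    # AP share 20%: non-AP rate = 1.22x national average
    # AP share 50%: non-AP rate = 1.88x national average
    # AP share 80%: non-AP rate = 4.52x national average
    # The ratio is above 1 for any Autopilot share, but per the report the
    # non-Autopilot rate is ~4x BETTER than the national average, so the
    # crashes are not simply being displaced.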

Autopilot+human is considerably safer than human alone.

[+] buran77|4 years ago|reply
As I like to point out to people when they quote this self-driving statistic: student drivers have the best driving record out there. No fines, no accidents. That's because the instructor intervenes before anything goes wrong, and those interventions never show up in the record. Yet nobody would ever confuse a student driver with a good driver, even though they are probably better than current self-driving tech.
[+] anfilt|4 years ago|reply
There is also another problem with only trying to be better than the average driver: if your system is only slightly better than average, then basically 1 out of 2 people are better drivers than your software.

Autonomous driving should be multiple sigmas better than average, especially given the reaction times that are possible for a computer vs. a human.

If it's only as good as average, a large share of drivers would be safer driving themselves.

Basically, it should be as good as, if not better than, the most capable drivers. Average is way too low a bar to aim for.
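For a sense of scale, here's a small Python sketch assuming, purely for illustration, that driver skill is normally distributed; it shows what fraction of drivers would still beat a system that is k sigmas better than the mean:

    from statistics import NormalDist

    # Assume driver skill ~ Normal(0, 1) purely for illustration.
    # A system k sigmas better than the average driver is still
    # outperformed by the fraction of drivers above that point.
    for k in (0.0, 1.0, 2.0, 3.0):
        better = 1 - NormalDist().cdf(k)
        print(f"{k} sigma above average: {better:.1%} of drivers are better")
    # 0.0 sigma above average: 50.0% of drivers are better
    # 1.0 sigma above average: 15.9% of drivers are better
    # 2.0 sigma above average: 2.3% of drivers are better
    # 3.0 sigma above average: 0.1% of drivers are better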

[+] nnm|4 years ago|reply
The Tesla Safety Report is so misleading:

1. The accident rate does not take into account the driver's age, credit score, or prior safety record, or the age/value of the car.

2. Most people only turn on Autopilot when driving is easy (e.g. on a highway).

[+] Hamuko|4 years ago|reply
Does the safety report account for vehicle age and price? Because I imagine there's a difference in accident-free miles if you were to compare a new Mercedes-Benz S-Class to a 15-year-old Camry.
[+] pauljurczak|4 years ago|reply
> I'm not saying that Autopilot isn't safer than a human driver

I'm saying that Autopilot isn't safer than a human driver. The fatal accidents involving Autopilot so far, mostly crashes into stationary objects, would have been easily avoided by a marginally competent driver. Tesla is playing with fire.

[+] cookiengineer|4 years ago|reply
I totally agree with your argument.

But playing devil's advocate here, one might argue that the major benefit of autopilots is that the data from every accident that does happen gets added to the training set, so that the same accident doesn't happen again in the future.

When comparing accidents in manual vs. automated driving, the manual cases have no learning effect (let alone one that is communicated and made available to other human drivers). Automated driving, on the other hand, has the theoretical benefit, if the data is openly shared, that the untrained edge cases causing accidents go asymptotically to zero over time.

But in order to achieve that there must be a law that enforces the use of the same dataset, and that defines this dataset as public domain so that all autopilots can learn from that knowledge.

This was actually my initial hope for OpenAI. And oh boy, have I been disappointed there. Nothing changed when it comes to reproducibility of training or availability of datasets, and most of OpenAI's research seems to go into proprietary concepts.

[+] emodendroket|4 years ago|reply
The problem of people overestimating the capability of the car or just losing their attention when Autopilot is engaged could easily wipe out whatever wins you do get.
[+] nuker|4 years ago|reply
A valid statistical comparison would be the average accident rate of all cars with Autopilot vs. the same for all cars without, restricted to cars with a current market value of $30k-$50k. That would equalise many things.

I, personally, won't trust some autopilot scripts, at least not in this decade.

[+] jcoq|4 years ago|reply
I don't want to derail the conversation too much, or distract from the excellent points you have made... but how is this Simpson's paradox?

Simpson's paradox is easier to understand geometrically.

https://en.m.wikipedia.org/wiki/File:Simpson_paradox_vectors...

L1 and L2 in the diagram have smaller slopes than B1 and B2 respectively, and yet the slope of their sum is higher. It's not hard to characterize when this happens. So the canonical example is that a drug might be more effective in every sub-case (e.g. mild and severe illness) and yet appear less effective overall.
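A worked instance of that vector picture, with made-up coordinates; each vector is (miles, crashes), so its slope is a crash rate:

    # Each vector is (miles, crashes); slope = crashes / miles.
    # Made-up numbers chosen so each L vector has a SMALLER slope than
    # the corresponding B vector, yet the summed L slope is larger.
    L1, L2 = (1.0, 0.20), (9.0, 5.00)   # slopes 0.20 and ~0.56
    B1, B2 = (9.0, 2.00), (1.0, 0.60)   # slopes ~0.22 and 0.60

    def slope(v):
        return v[1] / v[0]

    def add(u, v):
        return (u[0] + v[0], u[1] + v[1])

    assert slope(L1) < slope(B1) and slope(L2) < slope(B2)
    print(slope(add(L1, L2)), ">", slope(add(B1, B2)))  # 0.52 > 0.26
    # The reversal happens because the L miles are concentrated in the
    # high-slope component and the B miles in the low-slope one.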

You, on the other hand, seem to be describing selection bias.

[+] buran77|4 years ago|reply
> Authorities tried to contact Tesla for advice on putting out the fire; it’s not clear whether they received any response.

This will become a massive issue in the years to come unless we find a way not only to drastically reduce the number of crashes but also massively improve reliability.

High-voltage battery fires are probably the worst kind of fire a regular emergency responder has to deal with, between the hard-to-extinguish fire and the risk of electric shock. They also cause massive damage to the surroundings (the actual road surface, surrounding cars, or any garage unfortunate enough to house the car at the time).

Today very few emergency responders are even trained to deal properly with such a fire, and that training is lagging badly behind the rate at which EVs are appearing on the streets.

[+] Shank|4 years ago|reply
There's a public set of first responder guides with detailed diagrams for every make and model available. While I'm not sure they have a hotline, Tesla has always tried very hard to provide accurate information on how to douse the flames, which cables to cut to disable the HV system, and how to ensure the car doesn't reignite. See: https://www.tesla.com/firstresponders (e.g., a specific guide: https://www.tesla.com/sites/default/files/downloads/2016_Mod...)
[+] underwater|4 years ago|reply
Teslas and other electric cars have been around for a decade now. It's completely reasonable to expect fire departments to have trained their staff to deal with EV crashes.
[+] remarkEon|4 years ago|reply
What's the best way to actually put them out?
[+] testfoobar|4 years ago|reply
Thermal runaway in Lithium-ion battery packs is one reason that I don't ever want an EV parked inside my garage. These fires are hard to put out.
[+] agumonkey|4 years ago|reply
if solid state batteries materialize it will be such a boon
[+] jollybean|4 years ago|reply
“They just get too used to it. That tends to be more of an issue. It’s not a lack of understanding of what Autopilot can do. It’s [drivers] thinking they know more about Autopilot than they do.”

Hmm, that seems like a rather stark contradiction.

Elon, I think, has some character flaws, and when people are dying it's not the time to be defensive. I'm one of the least naturally empathetic people I know, and even I wouldn't be talking about anything other than condolences.

Finally, he can't continue to defend the term 'Autopilot'. In public communications you're talking to the masses, the lowest common denominator, and the 'laziest mode' of even high-functioning people; you've got to use words that will shape outcomes. 'Autopilot' is just a bad choice, and he needs to change it.

[+] dm319|4 years ago|reply
I agree. People here will cite the aviation term as not referring to autonomous flying, but if you ask a regular person on the street, they think that Teslas are self-driving. This is a dangerous belief held by a lot of people, and it needs to change. However, Tesla knows it adds an intriguing cachet to the brand, so they seem reluctant to downplay it.
[+] kevincox|4 years ago|reply
> 'Autopilot' is just a bad choice

It depends on what your goals are. It is a widely known term, associated with Tesla and seen favorably. Sure, it is supremely misleading but in this case that seems like a feature.

If Tesla's goal was safety they wouldn't have shipped this feature in the first place. Instead they are aiming for luxury.

[+] mlindner|4 years ago|reply
He's defensive because the news media is talking about a product that can't be operated in the way the news media is describing. It's not physically possible. The car can't be operated with no one in the driver's seat. Autopilot shuts off automatically.
[+] blamazon|4 years ago|reply
For anyone else wondering why this is notable:

“Harris County Precinct 4 Constable Mark Herman told KPRC 2 that the investigation showed “no one was driving” the fully-electric 2019 Tesla when the accident happened. There was a person in the passenger seat of the front of the car and in the rear passenger seat of the car.”

[+] skynet-9000|4 years ago|reply
Some details in the article:

1) The police stated "no one was in the driver seat at the time of impact" of the Model S

2) Two people died in the subsequent fire from the crash into a tree

3) "it took firefighters nearly four hours and more than 32,000 gallons of water to extinguish the fire."

4) "At one point, crews had to call Tesla to ask how to put the fire out, Herman said."

5) "The owner, he said, backed out of the driveway, and then may have hopped in the back seat only to crash a few hundred yards down the road. He said the owner was found in the back seat upright."

Another link with some additional details: https://www.khou.com/article/news/local/tesla-spring-crash-f...

[+] nexuist|4 years ago|reply
Is there a particular reason why they had to fight the fire instead of waiting for it to burn itself out? IIRC this is what Tesla itself recommends if there is no danger of it spreading to nearby objects.
[+] maxharris|4 years ago|reply
People are commenting on this story without the benefit of all of the necessary facts.

1) Autopilot will not activate without lane lines on the road

2) FSD will not activate without lane lines either

3) The car was not equipped with FSD software

4) There were no lane lines on the road where this happened

https://twitter.com/WholeMarsBlog/status/1383855271710056460

[+] mrtksn|4 years ago|reply
So what's the conclusion here? Who needs to double-check? The coroner, because the event could not have happened without a driver, so the passengers must be alive and merely stationary? Or the fire department, because there was a driver in the driver's seat and they missed him?
[+] josephcsible|4 years ago|reply
Nitpick: there's a difference between "will not activate" and "will deactivate if it was already active". (For the record, my opinion is that this was entirely the driver's fault and not Tesla's fault at all.)
[+] tyingq|4 years ago|reply
Autopilot also supposedly wants the seat belt buckled and weight on the seat, right?
[+] dijit|4 years ago|reply
The article says it's unconfirmed whether the car was in auto-drive. Part of me (without any knowledge) thinks someone was showing off the auto-drive and accidentally turned it off. But more details will come out, I hope.

One thing in particular sticks out as concerning: the fire service did not know how to deal with the fire.

That's not something specific to Tesla; Tesla does not make all battery-powered cars, and the fire service should know how to suppress electrical fires.

[+] ChrisClark|4 years ago|reply
I once unhooked my belt to take off my jacket while on autopilot. It immediately started screaming at me, disabled autopilot and started slowing down gradually.

I've also heard it uses the seat sensor to do the same. So unless they found a way to bypass multiple safety features, the car wasn't on Autopilot.

[+] ec109685|4 years ago|reply
Autopilot immediately switches off if it doesn't sense pressure in the seat, which would result in tons of beeping and the car slowing down and moving to the side of the road.
[+] powderpig|4 years ago|reply
Lithium-ion fires are hard to extinguish, especially once thermal runaway sets in. There are flame-retardant products that can extinguish lithium-ion fires; Class D extinguishers can be used.

I would guess the fire crews that responded were not equipped with this type of extinguisher.

[+] Nacdor|4 years ago|reply
> “[Investigators] are 100-percent certain that no one was in the driver seat driving that vehicle at the time of impact,” Harris County Precinct 4 Constable Mark Herman said. “They are positive.”

This would only be possible if they were using the autopilot feature.

> the fire service did not know how to deal with the fire.

Tesla's advice is "let it burn":

> Tesla’s guidance suggests it’s better to let the fire burn out than continuing to try to put it out.

[+] jonnycomputer|4 years ago|reply
There are a bunch of more pertinent things to say, but I was struck by just how thoroughly demolished that car looked, and by the fact that it took 32k gallons of water to put the fire out because the battery kept re-igniting. I'm no expert on what typical high-speed crashes look like, but this seems ... problematic.
[+] websites2323|4 years ago|reply
Something in this story doesn't add up. There is no implementation of Autopilot on Tesla cars that doesn't require intervention by the driver every 15 seconds. Perhaps the driver undid his seat belt, reached behind to get something, and was then thrown from his seat in the crash?
[+] dm319|4 years ago|reply
When automation does 95% of the job, sometimes it isn't worthwhile using it because of the overrides required for the other 5%. If full concentration is required while using a driving assist, it might actually be easier to just drive the car yourself; otherwise you'll struggle to maintain the ability to intervene immediately when required.
[+] throwawayboise|4 years ago|reply
Two men ... nobody driving.

This is a "hold my beer and watch this" accident.

[+] cybert00th|4 years ago|reply
On the face of it, it appears we're putting waaaay too much trust in technologies that aren't anywhere near the levels of autonomy we expect them to have.

Whether that's an issue of the relevant car companies not communicating with their customers properly, or of their customers being ignorant (or, worse still, wilfully negligent), is probably something for the courts to decide.

Either way, it seems to me that an extended period of reflection is needed on what we as human beings are currently doing to one another with these new-fangled driving technologies.

[+] everdrive|4 years ago|reply
"Authorities said they used 32,000 gallons of water over four hours to extinguish the flames because the vehicle’s batteries kept reigniting. At one point, Herman said, deputies had to call Tesla to ask them how to put out the fire in the battery."

Wow. Have there been other electric car fires which are so difficult to put out?

[+] de6u99er|4 years ago|reply
>The owner, he said, backed out of the driveway, and then may have hopped in the back seat only to crash a few hundred yards down the road. He said the owner was found in the back seat upright.

How is it possible that the car doesn't stop immediately as soon as there's no one in the driver's seat?

[+] utopcell|4 years ago|reply
From [1]:

Bloomberg asked Urmson about Tesla's Autopilot technology—and particularly Elon Musk's claim that Tesla vehicles will soon be capable of operating as driverless taxis.

“It’s just not going to happen,” Urmson said. “It’s technically very impressive what they’ve done, but we were doing better in 2010.”

That's a reference to Urmson's time at Google. Google started recruiting DARPA Grand Challenge veterans around 2009. Within a couple of years, Google's engineers had built a basic self-driving car that was capable of navigating a variety of roads around the San Francisco Bay Area.

A couple of years later, Google started letting employees use experimental self-driving cars for highway commutes—an application much like today's Autopilot. Google considered licensing this technology to automakers for freeway driving. But the technology required active driver supervision. Urmson and other Google engineers decided there was too great a risk that drivers would become overly reliant on the technology and fail to monitor it adequately, leading to unnecessary deaths.

[1] https://arstechnica.com/cars/2021/04/the-largest-independent...

[+] tyingq|4 years ago|reply
What's really odd is that it happened about midway down a very short (< 300 meters) dead-end cul-de-sac street that you have to take a hard right turn to enter. The address of the place is in the story... look at it on a map. Really odd.
[+] leesec|4 years ago|reply
I'm not a car safety specialist or anything but I think someone should have been driving.
[+] irrational|4 years ago|reply
Did the car start driving with nobody in the driver's seat? Or did someone move out of the driver's seat after it started?