Tesla driver killed in crash with Autopilot active, NHTSA investigating

86 points | davidiach | 9 years ago | theverge.com

67 comments

[+] NeutronBoy|9 years ago|reply
> In the blog post, Tesla reiterates that customers are required to agree that the system is in a "public beta phase" before they can use it

I absolutely disagree with this, and it should not be used as a 'get-out' clause by Tesla. If you work with non-technical people on technical issues on a day-to-day basis you'll understand why: non-technical people literally don't understand what stuff like this means. They'll read it, then say 'Oh, but they installed it in my car anyway, so it must be safe', and use it anyway.

[+] gshulegaard|9 years ago|reply
Beware clickbait and intentional inflammatory posturing. I noticed The Verge selectively quoted Tesla without directly linking to the source (although it was part of Elon's tweet):

https://www.teslamotors.com/blog/tragic-loss

The full context of the agreement:

"It is important to note that Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled. When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot “is an assist feature that requires you to keep your hands on the steering wheel at all times," and that "you need to maintain control and responsibility for your vehicle” while using it. Additionally, every time that Autopilot is engaged, the car reminds the driver to “Always keep your hands on the wheel. Be prepared to take over at any time.” The system also makes frequent checks to ensure that the driver's hands remain on the wheel and provides visual and audible alerts if hands-on is not detected. It then gradually slows down the car until hands-on is detected again."

So the key here is that Tesla autopilot is a driver assist but is positioned such that the driver needs to remain alert and able to assume control of the vehicle at any moment. So when it is said, "Neither Autopilot nor the driver noticed..." it is critical to note that ultimately the driver failed to control their vehicle. Personally, I find this reasonable.

[+] dogma1138|9 years ago|reply
They can put anything they want in the EULA; it doesn't mean it would stand up to scrutiny in a court of law.

Autopilot features are in a grey area now since they are very new, but Tesla is still responsible regardless of what they claim.

Seatbelts and airbags were also in a "beta phase" at some point, and more likely than not some people were killed by a deploying airbag in its early days, or by a seatbelt with a shoddy release mechanism before everything was standardized, but you could still sue for compensation in those incidents as well.

[+] netinstructions|9 years ago|reply
Tesla's statement[1] provides some more details

> What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.

Failing to detect the white side of a trailer against a brightly lit sky is something I can see camera/image-based sensors struggling with, but not LIDAR-based sensors.

[1] https://www.teslamotors.com/blog/tragic-loss

[+] gshulegaard|9 years ago|reply
There was some discussion about this: Tesla has a forward-facing radar, but it is mounted lower than the camera, and trailers of a certain height have been shown to sit above its detection window.

So, speculatively, it is a double failure:

1) Trailer was high enough to avoid radar

2) Trailer was light enough/low contrast to avoid detection by the camera
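The double-failure hypothesis above can be sketched as a toy detection model. This is purely illustrative, assuming a naive OR-style fusion where braking requires at least one sensor to report an obstacle; all function names and thresholds here are hypothetical, not Tesla's actual software:

```python
# Illustrative sketch of the speculated double failure: if braking requires
# at least one sensor to see the obstacle, a simultaneous miss by the radar
# (trailer above its detection window) and the camera (low contrast against
# a bright sky) means no brake command is ever issued.

def radar_detects(obstacle_height_m: float, window_max_m: float = 1.0) -> bool:
    """Radar mounted low: misses obstacles sitting entirely above its window."""
    return obstacle_height_m <= window_max_m

def camera_detects(contrast: float, threshold: float = 0.2) -> bool:
    """Camera misses low-contrast obstacles (white trailer, bright sky)."""
    return contrast >= threshold

def should_brake(obstacle_height_m: float, contrast: float) -> bool:
    # Naive OR-fusion: brake if either sensor reports the obstacle.
    return radar_detects(obstacle_height_m) or camera_detects(contrast)

# A high, low-contrast trailer slips past both sensors:
print(should_brake(obstacle_height_m=1.4, contrast=0.05))  # False: no braking
```

The point of the sketch is that with OR-fusion, each sensor's blind spot is covered only if the other sensor's isn't in the same place; a trailer that is both high and low-contrast falls into the intersection of the two blind spots.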

[+] at-fates-hands|9 years ago|reply
This is going to be a major morality issue with self driving cars.

What if a deer crosses the road? Will the car hit the deer to save the life of the driver? Avoid the deer to avoid an accident? Or save the deer and run into the ditch, possibly harming the driver?

Unfortunately, this technology is not capable of making those decisions, and if the driver is ultimately responsible anyway, then maybe we should just keep it that way.

[+] growt|9 years ago|reply
I think the problem might have been that the middle of the trailer is not at ground level but at some height where the lidar data is ignored or not there.
[+] joezydeco|9 years ago|reply
How does Tesla know the driver didn't see the trailer? He was killed in the crash. Is there a cockpit voice recorder?
[+] GrinningFool|9 years ago|reply
My sympathies to the people who built the systems that make Autopilot possible. Even knowing it was statistically bound to happen eventually, this has to weigh heavily from the "damn, I should've thought of this once-in-130-million-miles corner case" perspective.
[+] jholloway7|9 years ago|reply
Unlike many other fatal crashes, however, they presumably now have data from the corner case they can use to regression test every future release of the system.
[+] themgt|9 years ago|reply
A company's "public beta phase" system named "Autopilot" just drove a customer head-first into a trailer and killed them, at least in part due to inadequate sensor hardware that seems unlikely a software update can remedy. Tesla really ought to take a step back and consider the damage they may do to their brand if they start killing customers and blaming the dead person for having trusted a Tesla product.
[+] nathanaldensr|9 years ago|reply
I wonder what the driver was doing when the vehicle crashed. Tesla says that the driver is ultimately responsible for controlling the vehicle, but then provides a system that is likely to result in the driver not paying attention: sleeping, reading, or otherwise being distracted. How can Tesla and the governments that license drivers resolve this situation?
[+] cylinder|9 years ago|reply
Probably looking down at phone or the like. I don't buy that any sky would prevent an attentive driver from seeing a big rig perpendicular to them.

I've had doubts about autopilot after it was hastily released and I've seen some unnerving anecdotes on Tesla groups about it. I think Tesla needs to keep Elon in check a bit more...

[+] henrikschroder|9 years ago|reply
Any concession means that the driver somehow isn't ultimately responsible, which is a situation no one wants to end up in. It's also ridiculous.
[+] roadnottaken|9 years ago|reply
"This is the first known fatality in just over 130 million miles where Autopilot was activated," states a post on Tesla's corporate website. "Among all vehicles in the U.S., there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles. It is important to emphasize that the NHTSA action is simply a preliminary evaluation to determine whether the system worked according to expectations."
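The quoted figures are expressed as miles-per-fatality; converting them to fatalities per 100 million miles makes the comparison easier to read directionally. A minimal sketch, using only the figures quoted in Tesla's post (not independent data):

```python
# Convert Tesla's quoted miles-per-fatality figures into fatality rates
# per 100 million miles, so a higher number means a higher fatality rate.

miles_per_fatality = {
    "Autopilot (1 known fatality)": 130e6,  # "first fatality in 130M miles"
    "U.S. average": 94e6,                   # "a fatality every 94M miles"
    "Worldwide average": 60e6,              # "approximately every 60M miles"
}

for label, miles in miles_per_fatality.items():
    rate = 100e6 / miles  # fatalities per 100 million miles
    print(f"{label}: {rate:.2f} fatalities per 100M miles")
```

Note that a single Autopilot fatality gives a very wide confidence interval, and, as commenters below point out, the baselines mix road types and conditions, so the comparison is suggestive at best.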
[+] burnguy123|9 years ago|reply
From what I understand, the autopilot can only be used on highways. I would suspect that highway travel is much safer per mile in the first place. Are we comparing highway driving with autopilot to regular mixed city/highway driving?
[+] steve19|9 years ago|reply
Are users running Autopilot during hazardous conditions such as heavy rain and snow?

If not, comparing Autopilot deaths with average deaths per mile is disingenuous.

[+] techthroway443|9 years ago|reply
Should we be calling the technology Autopilot? I feel like the name implies more autonomy than it provides, inadvertently misleading people.
[+] United857|9 years ago|reply
Tesla itself refers to it as Autosteer in the in-car UI (but Autopilot in its marketing materials).
[+] ibrahima|9 years ago|reply
I can see why Volvo criticized Tesla for rolling out a system that's not fully autonomous. If drivers are willfully distracted because they think the car will handle any situation, that's a very dangerous place to be in. Though it's not clear what the driver was doing in this situation, so it's possible that this would have been a fatal crash in any other car without autopilot as well.
[+] tlrobinson|9 years ago|reply
> Tesla reiterates that customers are required to agree that the system is in a "public beta phase" before they can use it

I'm happy to let you all beta test this one for me.

[+] wldcordeiro|9 years ago|reply
So if I read it right, the trailer merged from a lane next to the Tesla? So the trucker had the Tesla in a blind spot? The article is vague.
[+] tfinniga|9 years ago|reply
More information here: https://www.levyjournalonline.com/police-beat.html

Relevant section: In a separate crash on May 7 at 3:40 p.m. on U.S. 27 near the BP Station west of Williston, a 45-year-old Ohio man was killed when he drove under the trailer of an 18-wheel semi. The top of Joshua Brown’s 2015 Tesla Model S vehicle was torn off by the force of the collision. The truck driver, Frank Baressi, 62, Tampa was not injured in the crash. The FHP said the tractor-trailer was traveling west on US 27A in the left turn lane toward 140th Court. Brown’s car was headed east in the outside lane of U.S. 27A. When the truck made a left turn onto NE 140th Court in front of the car, the car’s roof struck the underside of the trailer as it passed under the trailer. The car continued to travel east on U.S. 27A until it left the roadway on the south shoulder and struck a fence. The car smashed through two fences and struck a power pole. The car rotated counter-clockwise while sliding to its final resting place about 100 feet south of the highway. Brown died at the scene. Charges are pending.

Here is a google maps view of the accident location: https://goo.gl/maps/SSKyoxhoaxp

[+] dmoy|9 years ago|reply
Sounds to me like oncoming traffic making a left turn, but again, vague.
[+] Jemmeh|9 years ago|reply
"the highway perpendicular" it says. Neither noticed it "against a brightly lit sky".

I take that as the truck was crossing the street (from a side street) in front of the car, but because of the sun blinding the sensors and the driver, the car didn't stop.