> In the blog post, Tesla reiterates that customers are required to agree that the system is in a "public beta phase" before they can use it
I absolutely disagree with this, and it should not be used as a 'get out' clause by Tesla. If you work with non-technical people on technical issues on a day-to-day basis you'll understand why: non-technical people simply don't understand what language like this means. They'll read it, then say 'Oh, but they installed it in my car anyway so it must be safe', and use it anyway.
Beware clickbait and intentional inflammatory posturing. I noticed The Verge selectively quoted Tesla without directly linking to the source (https://www.teslamotors.com/blog/tragic-loss, although it was part of Elon's tweet):
"It is important to note that Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled. When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot “is an assist feature that requires you to keep your hands on the steering wheel at all times," and that "you need to maintain control and responsibility for your vehicle” while using it. Additionally, every time that Autopilot is engaged, the car reminds the driver to “Always keep your hands on the wheel. Be prepared to take over at any time.” The system also makes frequent checks to ensure that the driver's hands remain on the wheel and provides visual and audible alerts if hands-on is not detected. It then gradually slows down the car until hands-on is detected again."
So the key here is that Tesla Autopilot is a driver-assist feature, positioned such that the driver needs to remain alert and able to assume control of the vehicle at any moment. So when Tesla says, "Neither Autopilot nor the driver noticed...", it is critical to note that ultimately the driver failed to control their vehicle. Personally, I find this reasonable.
They can put anything they want in the EULA; it doesn't mean it would stand up to scrutiny in a court of law.
Autopilot features are in a grey area right now since they are very new, but Tesla is still responsible regardless of what it claims the system to be.
Seatbelts and airbags were also in a "beta phase" at some point, and more likely than not some people were killed by a deploying airbag in its early days, or by a seatbelt with a shoddy release mechanism before everything was standardized. You could still sue for compensation in those incidents.
> What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.
Failing to detect the white side of a tractor against a brightly lit sky is something I can see camera/image based sensors struggling with, but not LIDAR based sensors.
There was some discussion of this: Tesla has a forward-facing radar, but it is mounted lower than the camera, and trailers of a certain height have been shown to sit above its detection window.
So, speculatively, it is a double failure:
1) Trailer was high enough to avoid radar
2) Trailer was light enough/low contrast to avoid detection by the camera
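A toy sketch of how such a double failure can slip past a fusion check (the thresholds, sensor geometry, and require-both-sensors logic below are illustrative assumptions, not Tesla's actual system):

```python
# Toy model of a camera + radar fusion check.
# All values and the fusion rule are hypothetical, for illustration only.

def radar_detects(obstacle_bottom_m: float, window_top_m: float = 1.0) -> bool:
    # A low-mounted radar only sees obstacles whose underside falls
    # inside its vertical detection window.
    return obstacle_bottom_m <= window_top_m

def camera_detects(contrast: float, threshold: float = 0.2) -> bool:
    # Image-based detection needs enough contrast against the sky.
    return contrast >= threshold

def should_brake(obstacle_bottom_m: float, contrast: float) -> bool:
    # Requiring agreement from both sensors reduces false positives,
    # but one blind spot per sensor is then enough to miss entirely.
    return radar_detects(obstacle_bottom_m) and camera_detects(contrast)

# A high trailer underside (1.2 m) that is white against a bright sky
# (contrast 0.05) defeats both sensors at once:
print(should_brake(obstacle_bottom_m=1.2, contrast=0.05))  # False
```

The point of the sketch is the AND: each sensor's individual failure mode is rare, but the conjunction means their blind spots multiply into missed detections rather than covering for each other.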
This is going to be a major morality issue with self driving cars.
What if a deer crosses the road? Will the car hit the deer to save the life of the driver? Avoid the deer to avoid an accident? Or save the deer and run into the ditch, possibly harming the driver?
Unfortunately, this technology is not capable of making those decisions, and if the driver is ultimately responsible anyway, then maybe we should just keep it that way.
I think the problem might have been that the middle of the trailer is not at ground level but at some height where the lidar data is ignored or not there.
My sympathies to the people who built the systems that make Autopilot possible. Even knowing it was statistically bound to happen eventually, this has to weigh heavily from the "damn, I should've thought of this once-in-130-million-miles corner case" perspective.
Unlike many other fatal crashes, however, they presumably now have data from the corner case they can use to regression test every future release of the system.
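For instance, a recorded incident can become a permanent test fixture (the frame format, replay helper, and detector below are invented for illustration; a real pipeline would replay full sensor logs):

```python
# Hypothetical regression test built from a logged corner case.
# The frame dictionaries and detector API are invented for illustration.

def detects_obstacle(frames, detector):
    """True if the detector flags an obstacle in at least one frame."""
    return any(detector(frame) for frame in frames)

def patched_detector(frame):
    # Placeholder logic: flag an obstacle on either a radar return or a
    # large low-contrast region in the camera image.
    return frame["radar_return"] or frame["low_contrast_region"]

# Frames approximating the incident: white trailer, bright sky, no radar hit.
incident_log = [
    {"radar_return": False, "low_contrast_region": True},
    {"radar_return": False, "low_contrast_region": True},
]

# Every future release of the detector must now pass this check.
assert detects_obstacle(incident_log, patched_detector)
```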
A company's "public beta phase" system named "Autopilot" just drove a customer head-first into a trailer and killed them, at least in part due to inadequate sensor hardware that a software update seems unlikely to remedy. Tesla really ought to take a step back and consider the damage they may do to their brand if they start killing customers and blaming the dead person for having trusted a Tesla product.
I wonder what the driver was doing when the vehicle crashed. Tesla says that the driver is ultimately responsible for controlling the vehicle, but then provides a system that is likely to result in the driver not paying attention: sleeping, reading, or otherwise being distracted. How can Tesla and the governments that license drivers resolve this situation?
Probably looking down at a phone or the like. I don't buy that any sky would prevent an attentive driver from seeing a big rig perpendicular to them.
I've had doubts about Autopilot since it was hastily released, and I've seen some unnerving anecdotes in Tesla groups about it. I think Tesla needs to keep Elon in check a bit more...
"This is the first known fatality in just over 130 million miles where Autopilot was activated," states a post on Tesla's corporate website. "Among all vehicles in the U.S., there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles. It is important to emphasize that the NHTSA action is simply a preliminary evaluation to determine whether the system worked according to expectations."
From what I understand, Autopilot can only be used on highways. I would suspect that highway travel is much safer per mile in the first place. Are we comparing highway driving with Autopilot to regular mixed city/highway driving? If not, comparing Autopilot deaths with average deaths per mile is disingenuous.
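For what it's worth, here is the simple arithmetic on the figures quoted from Tesla's post (it says nothing about the highway-vs-mixed-driving confounder):

```python
# Fatality rates implied by the figures in Tesla's statement,
# converted to fatalities per billion miles for easy comparison.
miles_per_fatality = {
    "Autopilot (1 known fatality)": 130e6,
    "US average, all vehicles": 94e6,
    "Worldwide average": 60e6,
}

for label, miles in miles_per_fatality.items():
    print(f"{label}: ~{1e9 / miles:.1f} fatalities per billion miles")
# → ~7.7, ~10.6 and ~16.7 respectively
```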
I can see why Volvo criticized Tesla for rolling out a system that's not fully autonomous. If drivers are willfully distracted because they think the car will handle any situation, that's a very dangerous place to be in. Though it's not clear what the driver was doing in this situation, so it's possible that this would have been a fatal crash in any other car without autopilot as well.
Relevant section:
In a separate crash on May 7 at 3:40 p.m. on U.S. 27 near the BP Station west of Williston, a 45-year-old Ohio man was killed when he drove under the trailer of an 18-wheel semi.
The top of Joshua Brown’s 2015 Tesla Model S was torn off by the force of the collision. The truck driver, Frank Baressi, 62, of Tampa, was not injured in the crash.
The FHP said the tractor-trailer was traveling west on US 27A in the left turn lane toward 140th Court. Brown’s car was headed east in the outside lane of U.S. 27A.
When the truck made a left turn onto NE 140th Court in front of the car, the car’s roof struck the underside of the trailer as it passed under the trailer. The car continued to travel east on U.S. 27A until it left the roadway on the south shoulder and struck a fence. The car smashed through two fences and struck a power pole. The car rotated counter-clockwise while sliding to its final resting place about 100 feet south of the highway. Brown died at the scene.
Charges are pending.

Here is a Google Maps view of the accident location: https://goo.gl/maps/SSKyoxhoaxp
The statement says the truck drove "across the highway perpendicular" to the Model S, and that neither Autopilot nor the driver noticed it "against a brightly lit sky".
I take that as the truck was crossing the street (from a side street) in front of the car, but because of the sun blinding the sensors and the driver, the car didn't stop.
I'm happy to let you all beta test this one for me.