"our data shows that, when used properly by an attentive
driver who is prepared to take control at all times,
drivers supported by Autopilot are safer than those
operating without assistance"
This is nonsense. The whole core of the argument against this sort of system is that it induces inattention. The fact that Tesla consistently refuses to share full data on identical models driving with autopilot turned on, enabled but not turned on, and not enabled is extremely suggestive that the overall safety rates for autopilot on are not better.
Still we as a society usually decide not ban things that hurt only irresponsible people. There is an exception to this for children, but they are banned from driving already.
If the autopilot crashes start resulting in a number of deaths outside of the drivers car that might shift the public opinion.
Beats me why hasnt Tesla auto pilot been shut down already. If the car is on Autopilot then people will get used to it and use that time to check on mail, call a friend or simply doze off. Some of these guys will end up getting killed and 'we told you so' simply doesnt cut it.
EDIT: The accident rate is a statistic thrown by Tesla and only serves the purpose its meant to serve.
Do we have any statistics on accidents caused comparing the situation when car is on auto-pilot vs similar cars driven by similar drivers under similar conditions without autopilot ?
Some of these guys will end up getting killed and 'we told you so' simply doesnt cut it.
"I told you so" cuts it for tens of thousands of deaths caused by user error per year.
I'm not saying you're wrong about Tesla's autopilot, but it should be put in proper context. Tens of millions of lightly trained people are controlling thousand-pound steel projectiles at speeds where stopping distances are measured in hundreds of feet. Gruesome results are par for the course. I know we're all hoping that these deep-pocketed companies with fancy tech and big research budgets are going to fix it all for us, but they're not going to solve it on their first go.
Have you driven one?
I find I am far more alert and less with less cognitive fatigue when it is on autopilot.
People may misuse the technology, but they text and drive a car without it. (and have far more accidents). It is far far better. Try it.
Here's a recap of the pertinent NTSB's conclusions from the last time this happened. How many will apply again this time?
“There was sufficient sight distance to afford time for either the truck driver or the car driver to have acted to prevent the crash.”
“The Tesla’s automated vehicle control system was not designed to, and did not, identify the truck crossing the car’s path or recognize the impending crash; consequently, the Autopilot system did not reduce the car’s velocity, the forward collision warning system did not provide an alert, and the automatic emergency braking did not activate.”
“If automated vehicle control systems do not automatically restrict their own operation to those conditions for which they were designed and are appropriate, the risk of driver misuse remains.”
“The Tesla driver was not attentive to the driving task, but investigators could not determine from the available evidence the reason for his inattention.”
Doesn't appear to differ from previous Tesla autopilot accidents where the vehicle couldn't detect a tractor trailer crossing the path of travel with cameras or front facing radar. Interesting that Autopilot had only been activated for 10 seconds prior to the incident occurring.
I am not a domain expert. Front facing radar and cameras alone is clearly inadequate for forward facing object detection [1] (unless we're going to mandate radar detectable skirts and images superimposed on the sides of tractor trailers for vehicle detection). Would solid state LIDAR (essentially laser ranging I suppose, as you're not building a 360 degree point cloud) solve this particular edge case? Some automakers are including laser headlights in their vehicles [2] (although I don't believe this is yet approved for the US market); would it not make sense to convert the headlights into front laser ranging sensing systems along with their illumination function?
I also found the 10 second engagement time odd. On the one hand, you might not thing it's enough time to become seriously distracted. (eg, you wouldn't be "comfortable" with the semi-autonomous operation yet).
On the other hand, the engagement of autopilot might have been caused by a desire to explicitly NOT pay attention, however briefly. think: "ok, now let me read this text from the person I'm on my way to meet."
I have found myself occasionally using autopilot to augment my driving when I'm doing something that inherently decreases focus on the road, however briefly; eg, taking my sunglasses out of the glovebox. I still drive the same way I normally would in any other car when reaching for something in the glovebox, so it's easy to think "any enhanced safety is better than none".
On the other hand, it could lead to a false sense of security where you pay 10% less attention than doing the same task in a non-augmented car. Or you get comfortable doing such things, so you do them more frequently than truly necessary. (for example, you might just shield your eyes from the sun with your hand rather than grabbing sunglasses in a non-augmented car).
I don't think we can conclude that cameras are inadequate for forward object detection. Only that the processing systems used so far were. This can be a matter of training of the neural networks and the amount of processing power used in the computing units.
Radar is limited, as long as trailers in the US are allowed to have such high gaps to the ground. A lower bar on the sides would both help the collision detection systems as well as reduce the crash itself.
It seems like this is a case where the autopilot thought that the truck was a stationary object. Based on my reading of the press release, it seemed like the truck was turning across the car's path, which should look like a stationary "bridge" to the model and subsequently ignored.
This seems to be a hard problem where you don't want the model to false positive on a stationary object on the side of the road and slam on the brakes.
> Based on my reading of the press release, it seemed like the truck was turning across the car's path, which should look like a stationary "bridge" to the model and subsequently ignored.
So if I understand you properly, an adversarial attack against Tesla autopilots would be to suspend a ladder across the road at windshield height?
"Vulnerable to being clotheslined" seems like a bit of an oversight.
I am beginning to get frustrated with news surrounding both Tesla autopilot crashes, and self driving car crashes. I understand that we are seeing lots of news about it because it is new, and people are scared that self driving cars are going to kill people. But can you imagine if an article trended every time someone got into an accident while using cruise control?
This is about surrender responsibility to a piece of software and trust it 100% with your life and/or your family members lives or take responsibility and engage in driving the car. The news, and legitimately so, are covering the unusual - surrender your abilities, judgments and consequently and directly your life, to faulty saftware and hardware.
I have said this to my friend, so far all autopilot accidents resulted in Tesla drivers death, but there is a person out there, hopefully not a child, whose car Tesla will ram on autopilot and their death will put an end to this stupid advertising/naming with fine print excuse.
[+] [-] femto113|6 years ago|reply
> "our data shows that, when used properly by an attentive driver who is prepared to take control at all times, drivers supported by Autopilot are safer than those operating without assistance"
This is nonsense. The whole core of the argument against this sort of system is that it induces inattention. The fact that Tesla consistently refuses to share full data on identical models driven with Autopilot turned on, enabled but not turned on, and not enabled at all is extremely suggestive that the overall safety rate with Autopilot on is not better.
(edited for formatting)
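For concreteness, the apples-to-apples comparison being asked for here is crashes per million miles on the same model, split three ways by Autopilot status. A minimal sketch in Python, with every number invented purely for illustration, since Tesla has not published per-cohort figures:

    # Minimal sketch of a per-cohort crash-rate comparison. All numbers are
    # invented for illustration; Tesla has not published per-cohort figures.
    crash_counts = {"autopilot_on": 13, "enabled_not_on": 41, "not_enabled": 29}
    miles_driven = {"autopilot_on": 2.1e9, "enabled_not_on": 5.6e9, "not_enabled": 3.0e9}

    for cohort, crashes in crash_counts.items():
        rate = crashes / (miles_driven[cohort] / 1e6)  # crashes per million miles
        print(f"{cohort}: {rate:.4f} crashes per million miles")

Without a denominator for each cohort, a single headline rate cannot support the "safer than unassisted" claim.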
[+] [-] yourMadness|6 years ago|reply
Still, we as a society usually decide not to ban things that hurt only irresponsible people. There is an exception for children, but they are already banned from driving.
If Autopilot crashes start resulting in a number of deaths outside of the driver's car, that might shift public opinion.
[+] [-] agumonkey|6 years ago|reply
It beats me why Tesla's Autopilot hasn't been shut down already. If the car is on Autopilot, people will get used to it and use that time to check their mail, call a friend, or simply doze off. Some of these guys will end up getting killed, and 'we told you so' simply doesn't cut it.
[+] [-] modi15|6 years ago|reply
EDIT: The accident rate is a statistic thrown out by Tesla and only serves the purpose it's meant to serve.
Do we have any statistics comparing accidents when the car is on Autopilot vs. similar cars driven by similar drivers under similar conditions without Autopilot?
[+] [-] dkarl|6 years ago|reply
> Some of these guys will end up getting killed, and 'we told you so' simply doesn't cut it.
"I told you so" cuts it for the tens of thousands of deaths caused by user error per year.
I'm not saying you're wrong about Tesla's Autopilot, but it should be put in proper context. Tens of millions of lightly trained people are controlling multi-thousand-pound steel projectiles at speeds where stopping distances are measured in hundreds of feet. Gruesome results are par for the course. I know we're all hoping that these deep-pocketed companies with fancy tech and big research budgets are going to fix it all for us, but they're not going to solve it on their first go.
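For a sense of scale on those stopping distances: total distance is roughly reaction distance plus braking distance, d = v*t_react + v^2 / (2*mu*g). A quick sketch, where the 1.5 s reaction time and mu = 0.7 (dry asphalt) are assumed round numbers:

    # Rough stopping-distance estimate: reaction distance + braking distance.
    # t_react = 1.5 s and mu = 0.7 (dry asphalt) are assumed round numbers.
    MPH_TO_FPS = 5280 / 3600   # mph -> feet per second
    G = 32.2                   # gravitational acceleration, ft/s^2

    def stopping_distance_ft(speed_mph, t_react=1.5, mu=0.7):
        v = speed_mph * MPH_TO_FPS
        return v * t_react + v**2 / (2 * mu * G)

    for mph in (30, 50, 70):
        print(f"{mph} mph: ~{stopping_distance_ft(mph):.0f} ft to stop")

At 70 mph this works out to roughly 390 feet, consistent with "measured in hundreds of feet."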
[+] [-] hwillis|6 years ago|reply
Have you driven one?
I find I am far more alert and have less cognitive fatigue when it is on Autopilot.
People may misuse the technology, but they also text while driving a car without it (and have far more accidents). It is far, far better. Try it.
[+] [-] tristanb|6 years ago|reply
[+] [-] whttheuuu|6 years ago|reply
[+] [-] bdamm|6 years ago|reply
[+] [-] mfatica|6 years ago|reply
[+] [-] lawguy|6 years ago|reply
Here's a recap of the pertinent NTSB conclusions from the last time this happened. How many will apply again this time?
“There was sufficient sight distance to afford time for either the truck driver or the car driver to have acted to prevent the crash.”
“The Tesla’s automated vehicle control system was not designed to, and did not, identify the truck crossing the car’s path or recognize the impending crash; consequently, the Autopilot system did not reduce the car’s velocity, the forward collision warning system did not provide an alert, and the automatic emergency braking did not activate.”
“If automated vehicle control systems do not automatically restrict their own operation to those conditions for which they were designed and are appropriate, the risk of driver misuse remains.”
“The Tesla driver was not attentive to the driving task, but investigators could not determine from the available evidence the reason for his inattention.”
https://www.ntsb.gov/investigations/AccidentReports/Reports/...
[+] [-] toomuchtodo|6 years ago|reply
This doesn't appear to differ from previous Tesla Autopilot accidents in which the vehicle couldn't detect a tractor trailer crossing the path of travel with cameras or front-facing radar. Interesting that Autopilot had been activated only 10 seconds prior to the incident.
I am not a domain expert. Front-facing radar and cameras alone are clearly inadequate for forward object detection [1] (unless we're going to mandate radar-detectable skirts and images superimposed on the sides of tractor trailers for vehicle detection). Would solid-state LIDAR (essentially laser ranging, I suppose, as you're not building a 360-degree point cloud) solve this particular edge case? Some automakers are including laser headlights in their vehicles [2] (although I don't believe these are yet approved for the US market); would it not make sense to give the headlights a front laser-ranging function along with their illumination function?
[1] https://www.caranddriver.com/features/a24511826/safety-featu...
[2] https://www.osram.com/am/specials/trends-in-automotive-light...
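For reference, the laser ranging proposed here reduces to time-of-flight arithmetic rather than any particular sensor's API: a pulse travels out and back, so distance = c * dt / 2. A toy sketch:

    # Time-of-flight laser ranging: the pulse travels out and back,
    # so distance = c * round_trip_time / 2.
    C = 299_792_458.0  # speed of light, m/s

    def range_from_round_trip(t_seconds):
        return C * t_seconds / 2.0

    # A return arriving 200 ns after emission implies a target ~30 m ahead.
    print(f"{range_from_round_trip(200e-9):.1f} m")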
[+] [-] rconti|6 years ago|reply
I also found the 10-second engagement time odd. On the one hand, you might not think it's enough time to become seriously distracted (e.g., you wouldn't be "comfortable" with the semi-autonomous operation yet).
On the other hand, the engagement of Autopilot might have been caused by a desire to explicitly NOT pay attention, however briefly. Think: "OK, now let me read this text from the person I'm on my way to meet."
I have found myself occasionally using Autopilot to augment my driving when I'm doing something that inherently decreases focus on the road, however briefly; e.g., taking my sunglasses out of the glovebox. I still drive the same way I normally would in any other car when reaching for something in the glovebox, so it's easy to think "any enhanced safety is better than none."
Then again, it could lead to a false sense of security where you pay 10% less attention than you would doing the same task in a non-augmented car. Or you get comfortable doing such things, so you do them more frequently than truly necessary (for example, you might just shield your eyes from the sun with your hand rather than grabbing sunglasses in a non-augmented car).
[+] [-] whttheuuu|6 years ago|reply
The issue isn't with the sensors, but rather the system reacting to sensor data.
[+] [-] _ph_|6 years ago|reply
I don't think we can conclude that cameras are inadequate for forward object detection, only that the processing systems used so far have been. This can be a matter of the training of the neural networks and the amount of processing power in the computing units.
Radar is limited as long as trailers in the US are allowed to have such high gaps to the ground. A lower bar on the sides would both help collision detection systems and reduce the severity of the crash itself.
[+] [-] localhost|6 years ago|reply
It seems like this is a case where Autopilot thought the truck was a stationary object. Based on my reading of the press release, the truck was turning across the car's path, which would look like a stationary "bridge" to the model and subsequently be ignored.
This seems to be a hard problem where you don't want the model to false positive on a stationary object on the side of the road and slam on the brakes.
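One plausible version of that failure mode, sketched as the general filtering idea rather than Tesla's actual code: automotive radar measures closing speed, and stacks commonly discard returns whose implied ground speed is near zero so the car doesn't brake for bridges and overhead signs. A trailer crossing perpendicular to the direction of travel has almost no radial velocity of its own, so it measures exactly like a stationary object:

    # Toy illustration of stationary-target filtering. This is NOT Tesla's
    # implementation; it shows why a radar stack that discards stationary
    # returns (to avoid braking for bridges and overhead signs) can also
    # discard a trailer crossing perpendicular to the direction of travel.

    def is_braking_candidate(ego_speed, closing_speed):
        # ground_speed = closing_speed - ego_speed along our travel axis;
        # a bridge, an overhead sign, or a trailer crossing at 90 degrees
        # all measure ~0 here. The 1.0 m/s threshold is assumed.
        ground_speed_along_axis = closing_speed - ego_speed
        return abs(ground_speed_along_axis) > 1.0

    ego = 30.0  # ego vehicle speed, m/s
    print(is_braking_candidate(ego, closing_speed=30.0))  # bridge or crossing trailer -> False
    print(is_braking_candidate(ego, closing_speed=45.0))  # oncoming vehicle -> True

Raising that threshold trades phantom-braking false positives against misses like this one.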
[+] [-] Majromax|6 years ago|reply
> Based on my reading of the press release, the truck was turning across the car's path, which would look like a stationary "bridge" to the model and subsequently be ignored.
So if I understand you properly, an adversarial attack against Tesla autopilots would be to suspend a ladder across the road at windshield height?
"Vulnerable to being clotheslined" seems like a bit of an oversight.
[+] [-] unknown|6 years ago|reply
[deleted]
[+] [-] frenchie4111|6 years ago|reply
I am beginning to get frustrated with the news surrounding both Tesla Autopilot crashes and self-driving car crashes. I understand that we are seeing lots of news about it because it is new, and people are scared that self-driving cars are going to kill people. But can you imagine if an article trended every time someone got into an accident while using cruise control?
[+] [-] jocker12|6 years ago|reply
This is about either surrendering responsibility to a piece of software and trusting it 100% with your life and/or your family members' lives, or taking responsibility and engaging in driving the car. The news, and legitimately so, is covering the unusual: surrendering your abilities, your judgment, and consequently and directly your life, to faulty software and hardware.
[+] [-] djanogo|6 years ago|reply
I have said this to my friend: so far, all Autopilot accidents have resulted in the Tesla driver's death, but there is a person out there, hopefully not a child, whose car a Tesla will ram on Autopilot, and their death will put an end to this stupid advertising/naming-with-fine-print excuse.