> my daughter—as well as all of your children—could become just another anomaly in the path toward progress.
Wouldja think of the children! I'm astonished that this argument-enhancing cheat-code still flies in 2019. It should be as obvious as "these N-units of clickbait". If he's so concerned about his daughter, why did he take her out on a freeway where several lives are lost per year? Why does he allow his precious daughter to breathe the smog from millions of automobiles in the LA Basin? I mean, think of the children! breathing all that NOx into their little-children lungs! We need regulation!
The game is stacked against autonomous tech, since detractors can point to any incident in which AT is at fault, but AT supporters can't point to the lives saved by safer decisions, faster reaction times, lower emissions, and the absence of emotion, which reduces stress and road-rage incidents.
But since those gains are invisible, and people are emotional, hit pieces like this will always be popular.
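The asymmetry is easy to show with arithmetic. Here's a back-of-the-envelope expected-value sketch; every number is hypothetical, purely to illustrate why the gains stay invisible:

```python
# Hypothetical illustration of the invisible-gains problem.
# All rates are made up; the real numbers are exactly what's disputed.
human_fatal_per_100m_miles = 1.2   # assumed human-driver fatality rate
auto_fatal_per_100m_miles = 0.9    # assumed rate with autonomous assistance
miles_in_100m_units = 50           # 5 billion miles driven

human_deaths = human_fatal_per_100m_miles * miles_in_100m_units
auto_deaths = auto_fatal_per_100m_miles * miles_in_100m_units

print(f"expected deaths, human drivers: {human_deaths:.0f}")
print(f"expected deaths, autonomous:    {auto_deaths:.0f}")
print(f"statistical lives saved:        {human_deaths - auto_deaths:.0f}")
# Each autonomous death makes headlines; the saved lives are statistics.
```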
It is well documented that Tesla regularly releases features which make wrong decisions and put drivers into dangerous situations. The latest case happened literally five days ago and is already well documented by Tesla owners (and not some imaginary evil short-sellers):

1. https://np.reddit.com/r/teslamotors/comments/burimz/psa_emer...
2. https://np.reddit.com/r/teslamotors/comments/bulra4/emergenc...
3. https://teslamotorsclub.com/tmc/threads/emergency-lane-depar...
It's worth mentioning that this feature, called ELD (Emergency Lane Departure), can't be disabled permanently: it re-enables itself after each trip, and the driver is expected to disable it manually every time. It can also kick in at any time, even while the driver is actively steering!
So maybe the author was wrong to invoke his kid, but his point is essentially correct.
How is this behaviour OK on a public road -- "I made it half way into the shoulder and the emergency lane departure kicked in and yanked me back into the lane, almost hitting the car I was passing as I fought with the automatic steering"?
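To make the complaint concrete, here's a toy state model of a setting that resets to ON every trip. This is purely illustrative, not actual Tesla code; it only mirrors what the owner reports describe:

```python
# Toy model of a safety feature whose "off" setting does not persist,
# matching owner reports about Emergency Lane Departure.
class EmergencyLaneDeparture:
    def __init__(self) -> None:
        self.enabled = True       # defaults to ON

    def start_trip(self) -> None:
        self.enabled = True       # the disable does not survive a new trip

    def disable(self) -> None:
        self.enabled = False      # holds only until the next trip starts

eld = EmergencyLaneDeparture()
eld.disable()                     # driver turns it off...
eld.start_trip()                  # ...but the next trip silently re-arms it
print(eld.enabled)
```

The failure mode is exactly this: safe operation depends on the driver remembering to repeat a manual step every single trip.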
I'm sorry, but you are invoking a strawman by claiming the article makes a "think of the children" argument. This is one specific child in one specific case of narrowly avoided harm. The example is one clear case of the technology not being ready, not the kind of appeal you describe.
> I'm astonished that this argument-enhancing cheat-code still flies in 2019.
I'm not. Preying on instincts is a classic tactic, one that will continue to work until the Allied Mastercomputer implements human extinction in a fit of machine-driven spite.
This is yet another hit piece by Wall Street short sellers. Autonomous vehicles may kill people occasionally, but far less frequently than human-driven vehicles.
Probably a hit piece, but if we're to analyze the policy changes required here to mitigate and adjust for risk effectively (changes which could be compelled by any insurance company with worthwhile risk adjusters):
- Insurance rate deltas for never-on v. ever-on "beta" automated driving features. This absorbs risk both for the insurance companies as well as on behalf of the driver.
- Visible and blatant notices by companies trialing self-driving features that such features may result in insurance rate adjustments if used as the features themselves pose not-yet-quantifiable risk.
- Mandatory reporting by car companies as to who enabled beta self driving features (accepted the risk) to send to insurance companies if needed e.g. in the event of a collision.
Don't know which of these would happen first, but frankly, insurance should be leading the way on appropriately derisking beta-testing self-driving features since it's... probably an inevitability that these will be tried by drivers on public roads.
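The rate-delta idea above can be sketched in a few lines. The 1.15 multiplier is a made-up placeholder, not actuarial data; the point is only the mechanism, a surcharge while beta risk is unquantified:

```python
# Sketch of a risk-adjusted premium delta for "beta" driver-assist
# features. The multiplier is hypothetical, not derived from claims data.
def adjusted_premium(base_premium: float,
                     beta_features_enabled: bool,
                     beta_risk_multiplier: float = 1.15) -> float:
    """Annual premium, surcharged if beta self-driving features were ever on."""
    if beta_features_enabled:
        return round(base_premium * beta_risk_multiplier, 2)
    return base_premium

print(adjusted_premium(1200.0, False))  # never-on driver keeps the base rate
print(adjusted_premium(1200.0, True))   # ever-on driver absorbs the delta
```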
Maybe not in the beta, as mentioned in the article where there may have been a risk of imminent collision.
But in the stable version? Yes, it should. Your daughter is not more important than, say, the newborn twins riding in another car that yours is about to hit.
Otherwise, you believe her existence is worth sacrificing N people over -- or hell, even a few million people -- and that ain't right.
While I don't disagree with regulating this tech, this is a ridiculous take. Please stop acting like Autopilot is more dangerous than all the other drivers on the road not paying attention.
If you really fear for her safety in a car, stop putting her in a car; or maybe put her in a Tesla because it's probably safer than the car you're driving.
I didn't sign up to be on the road with vehicles that can't control their speed, like Toyota's fatal "unintended acceleration" bug. Likewise I don't want to accept risk from drunk or distracted human drivers. But them's the brakes; Autopilot isn't dramatically different.
I'm interested in this stuff, but Tesla's "quarterly reports" are just ~3-sentence summaries.
Is this the extent of these reports? (maybe I'm missing something). It would be helpful to see a list of the actual incidents they're using to calculate these numbers.
Sure, autopilot does better than the average car on the road.
But there are parameters to control for. For example, the average car on the road is 10 years old, it's probably not very well maintained, and on average costs <$20k.
You should compare the safety of a Tesla to the safety of a similarly aged, similarly priced, and similarly equipped car, for example one with active safety mechanisms.
Plus, IMO the false confidence that Autopilot instills enables dangerously negligent behaviour from drivers.
Want to see the safest car? NHTSA publishes an annual driver fatality report, and there are car models that have never had a fatal crash. See the 2009-2014 XC90 for example.
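A tiny sketch of why the baseline matters. All of these rates are hypothetical placeholders, not real data; they only show how the chosen comparison class drives the headline number:

```python
# Hypothetical fatality rates per billion miles, to show how the chosen
# baseline changes the conclusion. None of these figures are real.
fleet = {
    "average US car (10+ yrs old)": 12.0,
    "comparable new luxury sedan with ADAS": 5.0,
    "Tesla on Autopilot": 4.5,
}

tesla = fleet["Tesla on Autopilot"]
naive_ratio = fleet["average US car (10+ yrs old)"] / tesla
fair_ratio = fleet["comparable new luxury sedan with ADAS"] / tesla

print(f"vs the average fleet: {naive_ratio:.1f}x safer (misleading)")
print(f"vs a comparable car:  {fair_ratio:.2f}x safer (honest baseline)")
```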
Consumer Reports (May 2019): "In practice, we found that the new Navigate on Autopilot lane-changing feature lagged far behind a human driver’s skills. The feature cut off cars without leaving enough space, and even passed other cars in ways that violate state laws [...] As a result, the driver often had to prevent the system from making poor decisions."
https://www.consumerreports.org/autonomous-driving/tesla-nav...
He doesn't like contingency written down and measured, so he is advocating for hidden variables with randomized content: better to keep the status quo and keep getting hit by [drunk/sleepy/angry] people.
So his evidence that the car was on Autopilot is that the driver didn't have his hands on the wheel? Plenty of people do that in cars with no driver assist at all.
Fear is a powerful emotion and we shouldn't let it govern our lives. Statistics say that something bad will happen to someone at some point.
Unless Teslas are causing an _increase_ in car collisions, I see no reason to be this upset about it.
The author's fear, _of something that didn't happen_, has gripped them. It would be one thing if his daughter had been killed in a collision with a Tesla operating on Autopilot, but it didn't even happen. This person shouldn't even be driving at all.
anextomp | 6 years ago:
> Tesla Cost-Cutting Measures so Hardcore Employees Are Bringing Toilet Paper From Home
> Tesla's Wall Street Romance Is Over
> Florida Garage Fire Engulfs Tesla, Bentley, Rolls-Royce, Porsche, and More
> Tesla’s Navigate on Autopilot 'Raises Serious Safety Concerns,' Consumer Reports Says
> Tesla’s Walls Are Closing In As Musk Says Survival Requires ‘Hardcore’ Measures
buryat | 6 years ago:
It's not production-ready, so it's a legit concern.
andrewtbham | 6 years ago:
https://www.tesla.com/VehicleSafetyReport
https://hcai.mit.edu/human-side-of-tesla-autopilot/
creaghpatr | 6 years ago:
No shit, that would be the job of the parent.
See: Pedophrasty
https://medium.com/incerto/pedophrasty-bigoteering-and-other...
AimForTheBushes | 6 years ago:
391 Ohio traffic deaths YTD, all of them human-caused.