item 11778726

Tesla Model S adaptive cruise control crashes into van

77 points | aresant | 9 years ago | youtube.com

105 comments

[+] simonsarris|9 years ago|reply
This is a known issue. Bottom right of page 64:

https://www.teslamotors.com/sites/default/files/Model-S-Owne...

> Warning: Traffic-Aware Cruise Control may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead. Always pay attention to the road ahead and stay prepared to take immediate corrective action. Depending on Traffic-Aware Cruise Control to avoid a collision can result in serious injury or death.

[+] codemac|9 years ago|reply
Good to hear that Tesla knows it's an issue.

Bad to hear that they released it knowing this was an issue, because I wouldn't expect myself to be able to react to this.

Humans are not capable of handling these types of nuances well at speed. Being an active driver and being an inactive driver that needs to stay alert enough to catch the few cases where you run into "known issues" is just not going to work well.

As I'm typing it I'm suddenly having a wave of guilt for all kinds of software bugs I've written where the answer was "it's a known issue".

[+] ucaetano|9 years ago|reply
> Depending on Traffic-Aware Cruise Control to avoid a collision can result in serious injury or death.

What's the point of it, then? If you can't depend on it, you need to stay constantly alert, with your hands on the wheel and your foot ready to hit the brakes.

Which essentially defeats the purpose of Traffic-Aware Cruise Control.

[Edit: a user was kind enough to point out that Tesla's Autopilot ISN'T an autopilot and should NOT be relied upon to avoid accidents; it does indeed require full attention while driving]

[+] NamTaf|9 years ago|reply
If anyone thinks that such a warning automatically absolves Tesla of responsibility for ensuring this sort of thing doesn't happen, they're grossly mistaken. My big rant 2 weeks ago [1] covered exactly this. Real-world design means that you don't just slap a disclaimer or warning sticker on and expect that to be enough. You must, as the designer, anticipate the misuse of your design that its proper use encourages, and work to eliminate that capacity for misuse.

In the same way that modern web server systems try to sanitise database inputs rather than just blaming the developer for not knowing best practices, real-world engineering is all about understanding how the system may be misused and designing to eliminate the capacity for misuse.
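
The sanitising analogy can be made concrete. A minimal illustrative sketch (none of this code is from the thread): a parameterized query designs the misuse out of the API, instead of warning the developer to be careful.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

user_input = "alice' OR '1'='1"  # hostile input

# Unsafe: string interpolation lets the input rewrite the query,
# so the WHERE clause becomes always-true and matches every row.
unsafe = conn.execute(
    "SELECT count(*) FROM users WHERE name = '%s'" % user_input
).fetchone()[0]

# Safe: a parameterized query treats the input strictly as data,
# never as SQL, so the hostile string matches nothing.
safe = conn.execute(
    "SELECT count(*) FROM users WHERE name = ?", (user_input,)
).fetchone()[0]

print(unsafe, safe)  # 1 0
```

The point of the analogy: the safe API doesn't rely on the developer reading a warning in the manual; it removes the capacity for misuse entirely.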

Yes, you'll never eliminate misuse completely, but the courts are littered with examples of companies that didn't bother to try and so couldn't defend against a claim of negligence in their design engineering process.

Simply saying in a manual that 'hey, this may cause you to crash in these circumstances - pay attention!' isn't enough.

[1]: https://news.ycombinator.com/item?id=11680576

[+] jessriedel|9 years ago|reply
Is there a technical explanation anywhere for why this is so hard for the system to handle? I get that if a car ahead of you suddenly swerves out of the way, revealing a stopped vehicle that was completely blocked from view, the vehicle (like a human) can't react until it sees it. But it's clear the vehicle had time to react here and didn't do so until the crash was imminent.
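
A common technical explanation, offered here as an assumption since nothing in the thread confirms it for Tesla specifically, is that radar-based ACC deliberately discards returns whose ground speed is near zero, because stationary radar returns are overwhelmingly roadside clutter (signs, bridges, parked cars) rather than obstacles in the lane. A toy sketch of that filtering logic:

```python
# Toy illustration (assumed behavior, not Tesla's actual algorithm) of why
# radar-based adaptive cruise control may ignore a stopped vehicle ahead.

def ground_speed(own_speed_mps, relative_speed_mps):
    """Radar measures relative (Doppler) speed; add own speed to get ground speed."""
    return own_speed_mps + relative_speed_mps

def is_tracked(own_speed_mps, relative_speed_mps, clutter_threshold_mps=2.0):
    """Keep a return as a potential lead vehicle only if it is actually moving."""
    return abs(ground_speed(own_speed_mps, relative_speed_mps)) > clutter_threshold_mps

# Moving lead car: we drive 30 m/s, closing at 5 m/s -> lead moves at 25 m/s: tracked.
print(is_tracked(30.0, -5.0))   # True
# Stopped van: we drive 30 m/s, closing at 30 m/s -> ground speed 0: discarded
# as clutter, exactly like a bridge pylon would be.
print(is_tracked(30.0, -30.0))  # False
```

Under this assumption the behavior in the video and the manual's 50 mph warning are two sides of the same trade-off: braking for every stationary return would mean phantom braking at every overpass.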
[+] FussyZeus|9 years ago|reply
I wouldn't even classify this as an issue. This is CRUISE CONTROL, not auto-pilot. It's meant to manage your speed, not your heading, and the heading is what needed adjusting. The driver was clearly not paying attention, or he would've followed the car the Tesla was likely tracking for speed.

I don't think the tech is at fault here at all; if you were using regular cruise control in this scenario, nothing would've changed.

I mean if the CC slammed on the brakes every time it passed a bridge pylon or a stationary car THAT would be broken.

[+] jonasty|9 years ago|reply
If you have to stay aware of road conditions, with a risk of serious injury or death if you don't, what is the purpose of an auto-pilot feature?

I suppose you could go hands free because in the event of a road hazard you might just have to brake (although coming to a stop on a highway is dangerous - the person behind you might not stop).

For me, the value of an auto-pilot feature is the time it frees up. One can respond to emails, read, or field a call without risking their own safety or anyone else's.

[+] Alupis|9 years ago|reply
What we have here, is a car that behaves like a T-Rex. It can see and react to moving obstacles, but as soon as something stands still, it loses all perception.

In all seriousness, how is this an issue at all?

It clearly sees obstacles (like a wall on a curve) and takes corrective action. How does it not do something (even if just applying the brakes) when there's an obvious obstruction right in front of it?

[+] abhi3|9 years ago|reply
I can't imagine the amount of anxiety it would cause drivers if they knew this. I'd rather just drive myself.

It's like a microwave saying that most of the time it'll work fine but it may blow up the house sometimes, so please pay constant attention and stop it immediately if you see smoke.

[+] outworlder|9 years ago|reply
This would be a bad situation even for a human driver.
[+] honkhonkpants|9 years ago|reply
There are a bunch of these videos on YouTube. The best part about them is watching the Tesla Owners' Club (Internet Comment Subdivision) crawl out of the woodwork to explain away why the software worked perfectly and if only the meatbag behind the wheel had read section 74.3(f) of the operating instructions they would OF COURSE have known about this.
[+] aggie|9 years ago|reply
While I would hope automakers don't push the technology before it's ready, and relying on disclaimers does seem more CYA than an actual safety measure, it's easy to be defensive when an issue like this has a high probability of being treated irrationally. A meatbag in a conventional vehicle getting into an accident doesn't make the news. An autonomous vehicle completing a trip without incident doesn't make the news. The relative accident rates of autonomous vehicles and meatbag-piloted vehicles don't make the news. These are sensitive times for public acceptance of the technology.
[+] imtringued|9 years ago|reply
The Tesla has a huge screen in the dash. Why not force the driver to read or listen to a summary of the most important sections before they engage cruise control?
[+] bargl|9 years ago|reply
This is another case of regulation being outpaced by technology, and I feel the current law doesn't properly cover it. My opinion is that if a car manufacturer provides a service that controls the braking, acceleration, or steering of your car in an autonomous manner, they should be responsible for any accidents that arise because of it.

It is not reasonable to expect humans sitting behind the wheel of a car that covers 90% of situations to pay attention for the other 10%. This sort of situation puts the human at ease and gives them false confidence in the vehicle.

Even if the fine print SAYS they are in control, the car is adding a layer between the human and the road that delays their reaction a little more. Because humans aren't trained to drive in a buggy ass car, they aren't trained to respond to this sort of thing.

Liability for these sorts of accidents should fall on Tesla, and they should not be making their drivers guinea pigs. If a driver wants access to this autonomous mode, they should have to sign a release, because by default Tesla should be responsible. Again, the law hasn't caught up with this.

I also want to say I love what Tesla is doing, but I think the bad PR from irresponsibly integrating autonomous driving when it isn't ready (or when they don't have the proper sensor bank to make it work) is going to delay public acceptance of self-driving cars in general.

Tesla's self-driving car is to Google's self-driving car as the Blue Origin landing was to SpaceX's landing. But in general people don't know or see the difference; they're just cars that drive themselves, or rockets that land. Also, Google's car is typically driven with a trained driver.

I also want to say I'm a huge proponent of self driving cars.

[+] FussyZeus|9 years ago|reply
The technology in question is cruise control, not auto pilot or auto steer or anything of the sort. Mercedes has had radar-enabled cruise control since the '80s, and the principle is the same: the vehicle keeps pace with traffic, and if you watch the video, that's exactly what it did. The driver retains full control of the vehicle, and it's clear from the sound of the conversation that the driver wasn't paying attention to the road; ergo, 100% the driver's fault.
[+] cjensen|9 years ago|reply
I'm not an owner, but even I know that the adaptive cruise control is for use on clear highways and can't cope with much.

This was a situation where the driver plainly needed to take over, had plenty of time to take over, and the issue was sufficiently obvious that even a non-alert driver who was looking forward would see they needed to take over.

There's an argument to be made that relying on partial automation is a fundamentally bad idea because the driver will not become sufficiently alert in time to save themselves. This is not an example of such a situation.

[+] soccerdave|9 years ago|reply
As the driver said in his comments, having seen the system stop 1,000 times in the past gave him a false sense of security that it was actually going to stop for him like it was supposed to.
[+] Aelinsaar|9 years ago|reply
It's still a great track record, but I can't shake my misgivings about automotive autopilot beta testing being an open-road thing. This seems like begging for trouble, with the biggest issue being the occasional ambiguity about a momentary brake-tap turning a system off.
[+] holyoly|9 years ago|reply
Adaptive cruise control is not autopilot. It does not steer for you. It's not in beta, it's been a production feature on high end cars for 10 or so years.

Regular cruise control has always deactivated with a momentary brake tap. Been this way since cruise control started coming on cars.

[+] heptathorp|9 years ago|reply
I cannot wait to be killed by some idiot putting 100% faith in his Tesla to do his job of driving for him.
[+] pavlov|9 years ago|reply
If you're worried about getting killed by other drivers, you should simply avoid driving a car. Hundreds of thousands of people die that way every year, and 0% of those cases have involved Teslas so far.
[+] toomuchtodo|9 years ago|reply
Would you prefer an 18 year old snapchatting at 107 MPH in a Mercedes with a pregnant passenger? Resulting in permanent brain damage for her crash victim?

https://www.washingtonpost.com/news/morning-mix/wp/2016/04/2...

The point is that no matter how many accidents Autopilot gets into, it's safer than human drivers. And it will only improve.

Autopilot could kill ~30K people a year, and that would still be fewer than the number of people killed by human drivers in the United States alone.

[+] nahual|9 years ago|reply
Probably less likely than being killed by an idiot putting 100% faith in their driving ability.
[+] ntrepid8|9 years ago|reply
Mine picks up the stationary vehicles 9 out of 10 times, but you do have to watch it. This happens most frequently at stop-lights on roads where the speed limit is above 55 MPH and I'm approaching the stationary car stopped at the light.
[+] SFJulie|9 years ago|reply
Too much assistance makes people less aware of danger, especially when there are slight perturbations to the handled case:

It happens in the airline industry (where pilots forget some basics because they use autopilot too much).

It happens in science when people lean too much on black-magic statistical metadata (and forget to check the details in slight differences of inputs).

But be aware that more automation in other dangerous industries will have the same effect: mass transport, nuclear plants, chemical plants, medical practice, energy grids.

Well, isn't coding with Google & SO and/or frameworks a kind of growing autopilot for most coders?

[+] bryanlarsen|9 years ago|reply
Read the YouTube comments -- the OP has responded to many questions there.
[+] whamlastxmas|9 years ago|reply
More accurate title: Tesla successfully emergency brakes to gently tap the rear end of a van instead of ramming it in what would be a potentially fatal accident
[+] purpleidea|9 years ago|reply
Who's got a mirror? Video is private.
[+] ethbro|9 years ago|reply
From the sounds, it seemed to detect it at the last moment. Can Model S owners confirm?

Also, there are variable settings for braking, right? Has anyone tested what happens if they're intentionally set too low to prevent accidents?

So many experiments I'd run with a Model S and an EM mocked up crash pad...

[+] donald123|9 years ago|reply
apparently tesla drivers are just beta testers
[+] chipperyman573|9 years ago|reply
They made that pretty clear when they released a beta version of the software.
[+] sschueller|9 years ago|reply
The car appears to accelerate as soon as the car in front drives around the van. I would think it would wait a second to reevaluate the situation before rapidly accelerating into the van.
[+] arprocter|9 years ago|reply
I noticed a recent Merc TV ad had a disclaimer along the lines of '*vehicle may not automatically brake in all circumstances'
[+] revelation|9 years ago|reply
Cruise control in the left lane? Not paying attention? Ticks all the boxes.
[+] dsfyu404ed|9 years ago|reply
Cruise control that will AUTOMATICALLY DETECT AND FOLLOW THE SPEED LIMIT(!!!) in the LEFT(!!!) lane.

HR would have a field day with all this box checking

[+] iamleppert|9 years ago|reply
Wow, what kind of awful sensor technology and low sampling rate are they using for this? You could probably build something that worked better with an Arduino.

Vehicles equipped with these features are actually more dangerous, based upon the preliminary data. I'm all for new technology, but if you want to use something that is clearly "beta" -- and that's putting it lightly, as it's clear there are definite hardware/sensor/sampling problems here, not software ones -- you should be required to carry high-risk insurance.

It's not fair to the rest of us drivers who have to be on the road. We did not agree to be Tesla's beta testers too, and it's only a matter of time before someone is seriously injured or killed by one of these "features".

I'm sure someone here will chime in and say, "but it's safer than a human driver, than a reckless teenager". To that I say: that's why teenagers' insurance rates are so high! We need to increase the insurance rates here to reflect the risk this reckless, half-baked autonomy presents to both the driver and the general public.

[+] firethief|9 years ago|reply
> You could probably build something that worked better with an Arduino.

What are you waiting for?