item 22329480

heynk | 6 years ago

When I was starting my engineering career, it was right around when the first DARPA challenges had started. The hype was beginning, and my optimism towards technology was strong. I thought the predictions and timelines would be correct, and I still feel strongly that self-driving will be safer than humans in the long term.

Recently, I bought a newer Subaru, with EyeSight. It has adaptive cruise and lane keep assist. The LKA is fine - it'll beep if you sway outside of a lane, and automatically adjusts the steering, but it won't keep you centered. It's more of a safety thing, and it works well from that perspective.

The adaptive cruise is really good. It's camera based, and I have had zero problems with it. It works well at night and in pouring rain. It'll even stay pretty close to the car ahead of you if you turn the "tolerance" all the way down. I'm always impressed.
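To make the behavior concrete, here's a toy sketch of how that kind of gap keeping might work — this is not Subaru's actual algorithm, and the gains and caps are invented; it just shows the idea that the car chases your set speed until the gap to the lead car shrinks below the chosen following distance (the "tolerance"):

```python
# Toy sketch of adaptive-cruise gap keeping (NOT Subaru's actual algorithm).
# Accelerate toward the set speed, but back off proportionally when the
# measured gap to the lead car falls below the chosen following distance.

def acc_command(set_speed, own_speed, gap, target_gap, k_speed=0.5, k_gap=0.8):
    """Return a crude acceleration command in m/s^2."""
    speed_error = set_speed - own_speed      # how far below the set speed we are
    gap_error = gap - target_gap             # negative when following too closely
    if gap_error < 0:
        return max(-5.0, k_gap * gap_error)  # brake, capped at -5 m/s^2
    return min(2.0, k_speed * speed_error)   # otherwise chase the set speed
```

Turning the "tolerance" all the way down corresponds to a smaller `target_gap`, so the controller tolerates a shorter gap before it brakes.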

Since I've had this car, I've thought a lot more about the practical implementation details of actual self-driving. I more often notice situations when driving that are seriously complex.

The more I think about it while I'm driving, the more I realize how fucking hard self-driving would be.


slavik81|6 years ago

I tried out a relative's Subaru on a 6h drive over the holidays. I really liked the adaptive cruise control for following behind folks who were not keeping a consistent speed. I just set it to a reasonable value at the maximum following distance and stopped worrying about my speedometer.

However, at one point the guy in front of me turned off onto a small side road. It was at night, and I don't think the car realized he had moved into a turn-off lane. It slammed on the brakes. I probably went from 90kph to 40kph before it realized I was not going to hit that car.

I completely failed to react to the situation. I was worried my erratic braking would cause an accident behind me, but in the moment, I didn't know how to stop it. That was not a type of emergency I had considered or prepared for.

bdamm|6 years ago

Yeah, this is interesting. My Tesla Model 3 behaves similarly, so I'm often ready to punch the accelerator. In the Tesla, that's how you solve the problem: the driver's push on the accelerator countermands the AI's decision to slow down, and the car follows the driver's direction.

Where it gets dicey is the scenario where the "imminent collision" (hazards on, seatbelts tightened) detection is triggered, and the driver continues to push hard on the accelerator. Tesla has a fairly lengthy statement in the manual about this scenario. The bottom line is there are all kinds of heuristics at play that may or may not result in an override depending on the specific sequence of events.
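A hypothetical, highly simplified version of the kind of heuristic described above might look like this — this is not Tesla's actual logic (the manual only sketches it in prose), just the shape of the decision:

```python
# Hypothetical, highly simplified pedal-conflict heuristic -- NOT Tesla's
# actual logic. Illustrates who wins when driver and system disagree.

def resolve_pedal_conflict(driver_throttle, ai_wants_to_brake,
                           imminent_collision):
    """Decide whose command wins when the driver and the system disagree."""
    if imminent_collision:
        # Emergency braking generally wins, though per the manual the
        # outcome can depend on the specific sequence of events.
        return "brake"
    if driver_throttle > 0 and ai_wants_to_brake:
        # Ordinary phantom-braking case: the driver's accelerator input
        # countermands the system's decision to slow down.
        return "accelerate"
    return "brake" if ai_wants_to_brake else "coast"
```

The dicey part is exactly the first branch: whether a hard press on the accelerator should ever override emergency braking.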

wffurr|6 years ago

Same thing, first time I used the Subaru EyeSight system. Now I know to pay attention for that particular failure mode, and override with the gas pedal and a little steering.

Definitely surprised the heck out of me though the first time the car slowed way down on the interstate because the car ahead of me pulled onto the off-ramp.

Shivetya|6 years ago

I see the current Tesla system in a similar light: it does very well in everyday, common situations, and some of what it does is damn good; driving down country roads with curves and such is exhilarating but still safe.

The current Tesla system does give you a much clearer idea of what the car sees around it, but there's still no option to see everything it records; there are ways to get this footage, but it's not something every driver can do.

My TM3 goes in at the end of the month for the hardware 3.0 upgrade, which will allow it to process more of what it sees and also relay that to me. The difference in what others have shown of what the car relays back to the driver exposes just how much information has to be processed.

Then comes the real issue: the hard decisions are ones we make all day, driving by exception. We make so many choices that are exceptions to the rules that we're numb to it; it's nearly subconscious.

Then there's the other issue: other drivers. Not just people who drive badly, but those who will go out of their way to cause self-driving cars problems. With the number of people on the road, you'll encounter them with too much regularity. More might pop up if regulation comes down demanding that self-driving or semi-autonomous cars obey all traffic laws, especially speed limits. On some roads I drive, just obeying the limit is enough to provoke rage in other drivers.

ghaff|6 years ago

I'm inclined to think that improving these systems to full autonomy on many highways, in many weather conditions, is fairly realistic on a maybe 5-10 year timeline. That would actually be pretty nice, and potentially a big win for safety.

The problem with widespread L4/5 is that you need to get to a car that can literally drive itself between 2 points on a map with a high degree of reliability, in a wide range of weather conditions, on roads of varying conditions, with unexpected/unmapped obstacles that may require doing something technically illegal to get around, without human help. And that, as you say, seems really hard.

skywhopper|6 years ago

Ultimately, the right thing for most carmakers to focus on at this point is situational awareness and safety features, gradually expanding the range of situations in which the car can prevent a crash.

Put it this way: If a driverless car would be safer than human drivers, then that would imply that all the necessary technology would already exist to allow humans to be the driver while the car still keeps them out of deadly situations. If such tech is not possible to develop, then it seems unlikely that true driverless tech (which would need to combine that safety tech with a lot of other technology) will happen.

rootusrootus|6 years ago

Earlier today I was driving on the highway with autopilot (I have a Model 3) and came to a section where the road is angled in such a way and the pavement is old enough that there is a fair amount of standing water. Driving manually, I steer to the right or left slightly to avoid the ruts filled with an inch or two of standing water. Autopilot, on the other hand, was perfectly happy to blast right through it.

That's the kind of weird edge case that makes me think we're farther from real self-driving than most people want to admit. I'd be hard pressed to define exactly how I'd tell the computer to avoid that. Maybe the answer is that it can't deal with that until it results in hydroplaning, and then it reacts however it can.

catalogia|6 years ago

That's a particularly insidious circumstance since standing water can conceal hazards from any vision system these cars or humans have. I would expect self-driving cars to refuse to drive through water in any circumstance. There could be a large pothole in the puddle that would ruin your car.

Worse than a mere car-destroying pothole, what if the flooded portion of the road no longer existed at all? That's a common enough occurrence that student drivers are generally warned about it specifically, warned to never drive across flooded sections of roadways because your car might fall into 10 feet of water without warning. If a self-driving car doesn't avoid a scenario we teach teenagers to be wary of, I don't think it deserves to be called self-driving.

Johnny555|6 years ago

You have local information that cars don't yet have (but they probably will someday). Cars can send detailed road conditions to a central database, or communicate with each other, so the car in front of you can say "watch out, there's a big pothole 8 inches from the left lane line" and your car will try to avoid it.
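As a sketch of what such a car-to-car hazard report could carry — the field names, units, and threshold here are all invented for illustration, not any real V2X message format:

```python
# Sketch of a car-to-car hazard report; fields and units are invented
# for illustration, not taken from any real V2X message standard.
from dataclasses import dataclass

@dataclass
class HazardReport:
    lat: float            # WGS-84 latitude of the hazard
    lon: float            # WGS-84 longitude
    kind: str             # e.g. "pothole", "standing_water"
    lane_offset_m: float  # lateral offset from the left lane line, metres

def should_adjust(report, own_lane_offset_m, margin_m=0.5):
    """Plan a small lateral adjustment if the hazard lies in our track."""
    return abs(report.lane_offset_m - own_lane_offset_m) < margin_m
```

A receiving car would compare the reported lane offset against its own position in the lane and nudge left or right only when the hazard actually falls in its path.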

What do human drivers do there when they're unfamiliar with the road? It seems like the autopilot should at least be able to do no worse than they do.

learc83|6 years ago

>Recently, I bought a newer Subaru, with EyeSight. It has adaptive cruise and lane keep assist. The LKA is fine - it'll beep if you sway outside of a lane, and automatically adjusts the steering, but it won't keep you centered. It's more of a safety thing, and it works well from that perspective.

I have a 2020 Subaru, and it has lane centering on top of that. On the highway, with clear lane markings, it comes very close to driving itself. It won't slow down to handle curves on its own, though.

close04|6 years ago

Today saying cars have "self driving capabilities" is like saying you're fluent in 3 words of a language. They have advanced driver assists but the insistence on the "self driving" terminology tricks enthusiasts and less tech savvy people alike into a false sense of confidence in the tech. Sometimes all the way to their deaths.

OnlineGladiator|6 years ago

> The adaptive cruise is really good. It's camera based, and I have had zero problems with it. It works well at night and in pouring rain. It'll even stay pretty close to the car ahead of you if you turn the "tolerance" all the way down. I'm always impressed.

I checked, assuming it actually used radar, and you're right. They seem to use a stereo camera system. Neat.

https://www.subaru.com/engineering/eyesight.html

I'm surprised it works well in bad weather, but I've never tried it.

scotje|6 years ago

It's not impervious to bad weather, but pretty resilient. I'd say in the 2 years we've had ours the system has shut off maybe 3-4 times due to one of: a) very low and direct sun angle, b) very heavy rain, c) dense fog. Which, to be fair, are all difficult conditions for a human to drive in as well.

But I agree with the parent, the suite of driver assistance features is very good, but a long way from "self driving".

LoSboccacc|6 years ago

You haven't had an issue with it. Multiple people have already died.