top | item 44625262

Coeur | 7 months ago

What that mode shows is that the Tesla thinks there is nothing there.

AFAIK what Tesla's software does is object detection (cars, people, bikes, roadsides) and only uses that. Which of course is not at all safe, since there can always be things on the road that are an obstacle but not recognized.

What it should do is model the entire world around it in 3D and consider that. But it doesn't. Cow on the road - probably not recognized. A child in an uncommon Halloween costume chilling / lying in the middle of the road - pretty damn sure not recognized, and the Tesla would just kill that child. Yep.

(I drove a Tesla Model 3 with the latest "self-driving" software for a bit a year back.)


rogerrogerr|7 months ago

> AFAIK what Tesla's software does is object detection (cars, people, bikes, road sides) and only uses that. Which of course is not at all safe since there can always be things on the road that are an obstacle but not recognized.

This is not true; in modern FSD the visualization is disconnected from the actual driving model. The visualization runs a lightweight model that labels some stuff and shows it; the actual heavy lifting is now a single model that takes pixels as input and outputs control commands. My car very clearly reacts to stuff that doesn’t appear in the visualization.
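The split described here can be sketched as two independent functions that share a camera frame but nothing else, so the policy can react to things the labeler never names. Everything below - the function names, and the toy heuristics standing in for real neural networks - is a hypothetical illustration of the architecture, not Tesla's actual code:

```python
import numpy as np

def render_visualization(frame: np.ndarray) -> list[str]:
    """Lightweight labeler used only for the on-screen display.

    Returns just the object classes it can name; anything it cannot
    classify is simply absent from the visualization.
    """
    labels = []
    if frame.mean() > 0.5:  # stand-in for a small detection network
        labels.append("car")
    return labels

def drive_policy(frame: np.ndarray) -> dict:
    """End-to-end policy: raw pixels in, control commands out.

    Note it never consults render_visualization(), so it can brake
    for something the screen shows nothing for.
    """
    brake = float(frame.std() > 0.3)  # stand-in for a learned network
    return {"throttle": 1.0 - brake, "brake": brake}

# A high-contrast frame: the labeler names nothing, yet the policy brakes.
frame = np.array([[0.0, 1.0], [1.0, 0.0]])
print(render_visualization(frame))  # []
print(drive_policy(frame))          # {'throttle': 0.0, 'brake': 1.0}
```

The point of the sketch is the data flow, not the heuristics: because the display model and the control model are decoupled, an empty visualization tells you nothing about what the driving model saw.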

Mountain_Skies|7 months ago

A while back ElectroBOOM did a test with his Tesla's self-driving and in most cases had to slam on the brakes himself when it didn't recognize obstacles he had set up for it. I've also seen videos of a Tesla's navigation screen having a difficult time with freight trains. It stopped for the trains but was very confused about what they were, showing them as various oddly shaped cars and trucks that often morphed into other oddly stretched vehicles.

maxlin|7 months ago

I've seen it stop in front of squirrels. Your simplification is simply untrue.

If you think "it is not at all safe", you should probably not go outside with your caution level. FSD miles are ~11x safer than human drivers, mind you.

dylan604|7 months ago

The sad thing, based on your telling of it, is that if it detects a small child in a weird costume in the middle of the road but doesn't recognize it as a small child, it's not at least recognizing that something is in the road that shouldn't be, regardless of what it is. That should be the minimum: recognize the road, then recognize any object within the road.

toast0|7 months ago

This approach is all backwards anyway. You don't need to detect obstacles per se, you need to detect safe conditions to proceed.

Clear roadway is safe. Following behind a detected vehicle is conditionally safe.

Cow in the road? Not a clear roadway, not a detected vehicle. Does not match any safe condition, so do not proceed.

This approach is why you see so many reports of Waymos stopped somewhere that's outside traffic rules. But it's so much better to stop safely in situations where it's not really needed than to not stop in situations where it is needed.
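The whitelist logic above can be sketched as a tiny decision function: proceed only when the scene matches a known-safe condition, and fall through to a stop otherwise. The `Scene` fields and the conditions themselves are illustrative assumptions, not any vendor's actual interface:

```python
from dataclasses import dataclass
from enum import Enum, auto

class Decision(Enum):
    PROCEED = auto()
    STOP = auto()

@dataclass
class Scene:
    """Hypothetical perception summary for one frame."""
    roadway_clear: bool            # no unexplained occupancy in the corridor
    following_known_vehicle: bool  # lead object classified as a vehicle
    safe_following_distance: bool

def decide(scene: Scene) -> Decision:
    # Whitelist of known-safe conditions; anything unmatched stops the car.
    if scene.roadway_clear:
        return Decision.PROCEED
    if scene.following_known_vehicle and scene.safe_following_distance:
        return Decision.PROCEED
    # Cow in the road: not clear, not a vehicle -> falls through to STOP.
    return Decision.STOP

# An unrecognized blob occupying the corridor matches no safe condition.
print(decide(Scene(roadway_clear=False,
                   following_known_vehicle=False,
                   safe_following_distance=False)))  # Decision.STOP
```

The design choice is that unknowns fail closed: an obstacle-detection blacklist stops only for things it can name, while this whitelist stops for everything it cannot explain - which is exactly why it over-stops in odd situations.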