Coeur | 7 months ago
AFAIK what Tesla's software does is object detection (cars, people, bikes, road edges), and it only acts on that. Which of course is not at all safe, since there can always be things on the road that are an obstacle but not recognized.
What it should do is model the entire world around it in 3D and consider that. But it doesn't. A cow on the road? Probably not recognized. A child in an uncommon Halloween costume chilling / lying in the middle of the road? Pretty damn sure not recognized, and the Tesla would just kill that child. Yep.
(I drove a Tesla Model 3 with the latest "self-driving" software for a bit a year back.)
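For illustration, a class-agnostic 3D check would treat any occupied space in the driving corridor as an obstacle, recognized or not. A minimal Python sketch, with all names hypothetical (this is not any vendor's actual code):

    import numpy as np

    def corridor_is_blocked(occupancy: np.ndarray, path_mask: np.ndarray) -> bool:
        # occupancy: boolean 3D voxel grid of the scene (True = something there),
        # built from geometry/depth rather than from object labels.
        # path_mask: the voxels the car will sweep through on its planned path.
        # A cow or a child in a costume still occupies voxels even when no
        # detector knows what it is, so this check still fires.
        return bool(np.any(occupancy & path_mask))

    grid = np.zeros((10, 10, 10), dtype=bool)
    path = np.zeros_like(grid)
    path[:, 4:6, 0:2] = True   # planned driving corridor
    grid[5, 5, 0] = True       # unlabeled obstacle sitting in the corridor
    assert corridor_is_blocked(grid, path)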
rogerrogerr | 7 months ago
This is not true; in modern FSD the visualization is disconnected from the actual driving model. The visualization runs a lightweight model that labels some stuff and shows it; the actual heavy lifting is now a single model that takes pixels as input and outputs control commands. My car very clearly reacts to stuff that doesn’t appear in the visualization.
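Roughly, that split looks like this. A minimal Python sketch, with every name hypothetical (not Tesla's actual code):

    import numpy as np

    def visualization_model(frame: np.ndarray) -> list[str]:
        """Lightweight detector: labels a few classes purely for the screen."""
        return ["car", "pedestrian"]  # stub output the UI would render

    def driving_model(frames: np.ndarray) -> np.ndarray:
        """End-to-end policy: raw camera pixels in, control commands out."""
        return np.array([0.0, 0.1])  # stub: [steering_angle, acceleration]

    def step(frames: np.ndarray):
        # The two models never feed each other, which is why the car can
        # react to something the visualization never draws.
        ui_labels = visualization_model(frames[-1])  # shown on screen only
        controls = driving_model(frames)             # actually drives
        return ui_labels, controls

    labels, controls = step(np.zeros((4, 480, 640, 3)))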
maxlin | 7 months ago
If you think it's "not at all safe", you probably shouldn't go outside at all with that level of caution. FSD miles are ~11x safer than human-driven miles, mind you.
toast0 | 7 months ago
Clear roadway is safe. Following behind a detected vehicle is conditionally safe.
Cow in the road? Not a clear roadway, not a detected vehicle. It doesn't match any safe condition, so do not proceed.
This approach is why you see so many reports of Waymos stopped somewhere that's outside traffic rules. But it's so much better to stop safely in situations where it's not really needed than to not stop in situations where it is needed.
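A minimal sketch of that default-deny logic in Python (hypothetical, not Waymo's or Tesla's actual code):

    from enum import Enum, auto

    class Scene(Enum):
        CLEAR_ROADWAY = auto()
        FOLLOWING_DETECTED_VEHICLE = auto()
        UNKNOWN_OBSTACLE = auto()  # cow, costume, anything unrecognized

    # Explicit allow-list: anything not on it falls through to "stop".
    SAFE_TO_PROCEED = {Scene.CLEAR_ROADWAY, Scene.FOLLOWING_DETECTED_VEHICLE}

    def decide(scene: Scene) -> str:
        # Default-deny: proceed only when the scene matches a known-safe
        # condition; a cow matches nothing, so the car stops.
        return "proceed" if scene in SAFE_TO_PROCEED else "stop"

    assert decide(Scene.UNKNOWN_OBSTACLE) == "stop"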