LittleTimothy | 1 year ago
So what we'll probably end up with is real self-driving Waymos all over the place and fake self-driving Teslas that "self-drive" as long as you're still really driving.
The only real concern I have is whether Musk exploits his position to shape the regulations so that Waymo and Tesla both land in the same "self-driving" bucket, where they get categorized identically and both still require drivers — essentially using the regulations to kneecap any rival that is ahead of Tesla.
The other side of it is that I think we'd all be very happy if Musk went back to just lying about his electric vehicles.
jqpabc123 | 1 year ago
Personally, I don't think so.
There are simply too many driving situations where visual input is severely limited. FSD can use all the help it can get.
jfengel | 1 year ago
I don't know if machine learning can ever match the human brain for that. The brain does a lot of fairly advanced inferences that require a deep understanding of the world and the people and things in it.
Still, I'm not sure how much additional inputs would help the ML. If you had to drive by "touch" (LIDAR), you probably shouldn't be allowed to drive. It might be useful when the visual system has failed — to stop the vehicle before it hits something — but if the visual system fails that often, then the system wouldn't be usable for any purpose.
LittleTimothy | 1 year ago