(no title)
thefounder | 11 days ago
You get used to the system working correctly, and then, when you least expect it, it does the unthinkable — and the whole world blames you for not supervising a beta software product on the road on day 300 with the same rigour you did on day one.
I can see a very direct parallel with LLM systems. Claude had been working great for me until the day it ran `git reset` on the entire repo, and I lost two days of work because it couldn't revert a file it had corrupted. This happened because I supervised it just the way you would supervise an FSD car in "bypass" mode. Fortunately it didn't kill anyone — it just cost two days of work. If there were a risk of someone being killed, I would never allow a bypass/FSD/supervise mode, regardless of how unlikely that outcome is.
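For what it's worth, a cheap safeguard before handing a repo to an agent is to snapshot everything first, so even a destructive `git reset --hard` stays recoverable. A minimal sketch (the `backup/pre-agent` branch name is just an example):

```shell
# 1) Snapshot everything, including untracked files, before the agent starts.
git add -A && git commit -m "pre-agent snapshot" --no-verify
git branch -f backup/pre-agent      # cheap pointer to the snapshot commit

# 2) If the agent later does `git reset --hard` to an older state,
#    the snapshot commit is still reachable:
git reset --hard backup/pre-agent   # restore the snapshot
# (even without the branch, `git reflog` lists the "lost" commit hashes)
```

Uncommitted work is the only thing `git reset --hard` truly destroys; anything that was committed survives in the reflog until it is garbage-collected.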
maxdo | 11 days ago
Teslas have sensors, eye trackers, etc. Is it possible to shoot yourself in the foot? Sure. But not in any way different from a human doing irrational things in the car — putting on makeup, arguing, being in love, etc.
A human being is an irrational creature that should not drive except for fun in an isolated environment. Whether it's Tesla, Waymo, or anyone else... it is good to remove humans from the road — the faster, the better.
thefounder | 11 days ago
I'm all for this, but not for replacing dumb people with dumb software. I think FSD should be treated more like airplane safety. We have the opportunity to do this right, not just find the cheapest way we can get away with.