
dojomouse | 2 years ago

> One reason is that there are some problems that a company will only encounter once it begins testing fully driverless operations. Waymo and Cruise’s problems with fire hoses and caution tape are a good example. A human driver would disengage FSD long before it got into situations like that, which means Tesla would be unlikely to have the training data necessary to train its cars to handle it properly.

The reasoning here seems flawed to me. I assume Tesla uses the periods when a human is driving as training data, so this is no constraint at all: human drivers encounter fire hoses and caution tape too, and those encounters would still show up in the fleet's logs even if FSD was disengaged at the time.
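
To make that concrete, here is a minimal, purely illustrative sketch of the idea: mining fleet logs for human-driven segments that contain rare road situations. This is not Tesla's actual pipeline; every name in it (LogSegment, driver_mode, detections, mine_training_segments) is hypothetical.

    # Hypothetical sketch: even if the autonomy stack disengages before
    # odd situations, the fleet still records them while a human drives,
    # so those segments can be mined as training data.

    from dataclasses import dataclass

    # Rare road situations we want examples of, per the quoted article.
    RARE_CLASSES = {"fire_hose", "caution_tape"}

    @dataclass
    class LogSegment:
        segment_id: str
        driver_mode: str      # "human" or "autonomous" (assumed field)
        detections: set[str]  # object classes seen in this segment

    def mine_training_segments(fleet_logs: list[LogSegment]) -> list[LogSegment]:
        """Select human-driven segments containing rare situations."""
        return [
            seg for seg in fleet_logs
            if seg.driver_mode == "human" and seg.detections & RARE_CLASSES
        ]

    if __name__ == "__main__":
        logs = [
            LogSegment("a1", "human", {"car", "fire_hose"}),
            LogSegment("a2", "autonomous", {"car"}),
            LogSegment("a3", "human", {"pedestrian", "caution_tape"}),
        ]
        for seg in mine_training_segments(logs):
            print(seg.segment_id, seg.detections)

Running this prints the two human-driven segments that contain the rare classes, which is the commenter's point: disengagement doesn't delete the data, it just changes who was driving when it was recorded.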
