Humans also don’t have 8 eyes facing every direction at all times, and they get drunk/tired/impatient/angry etc. The reality is the entire argument is silly: the two are very different, and the Musk/Karpathy argument is misrepresented here. Saying humans only use vision was a response to “it’s not possible with only vision”, not a claim that human vision is good enough and there’s no need to do better. The 8-camera surround is leaps better than human vision; where it lacks is processing the signal, which the human brain does better. But if you have better inputs (we already do) and you believe you can one day match on the processing part, you’ll one day get a much better result. One that’s suited to the vision-based roads we have now and scales to literally anywhere, not geo-constrained like Waymo.
mola|2 years ago
logifail|2 years ago
Indeed, but humans also have an incentive to drive well, embodied by local traffic police and local laws, and even before passing their driving test they're made aware of the penalties for not driving well (which, let's remind ourselves, range from "mild ticking off"/"pay $$$" through "forfeit driving licence for a time" all the way to "forfeit liberty for a time").
Where are these incentives for self-driving algorithms?
If your algo breaks the law to a sufficient level, is someone (something?) prevented from driving for a time? Is that really going to be just that one vehicle, or should it be all vehicles with that same algo? If something really bad happens, who is charged; in the worst case, who might end up going to jail?
We all know CEOs tend to believe "this time it's different", that they're special, and that the annoying rulebook is to be viewed as guidance at best. VW/Martin Winterkorn, anyone?
ben_w|2 years ago
Surely the equivalent is the reward during training?
> If your algo breaks the law to a sufficient level, is someone (something?) prevented from driving for a time? Is that really going to be just that one vehicle, or should it be all vehicles with that same algo? If something really bad happens, who is charged; in the worst case, who might end up going to jail?
Personal opinion:
The algorithm should learn from the fleet and be shared by the fleet; therefore all accidents should be treated like aircraft crashes and investigated extremely thoroughly, with the goal of eliminating the root cause.
If that cause was the CEO demanding corners be cut to boost shareholder value, then jail them; if it's that the algorithm had, say, never seen a flying shark drone[0] before, misclassified it as something it needed to take evasive manoeuvres to avoid, and that led to a crash, then perhaps not (except that anything I can suggest probably should already be on their list of things to check for, so even then perhaps it would still be a CEO-at-fault example…)
[0] https://www.amazon.com/RiToEasysports-Control-Inflated-Infla...
sudosysgen|2 years ago