gortok | 1 month ago
Folks in this thread are comparing Waymo to human driving as some sort of expectation-setting threshold: if humans can’t be perfect, why should we expect machines to be?
We don’t expect humans to be perfect. When a human breaks the law we punish them. When they are sued civilly and found liable, we take their money/property.
There’s also a sense of self-preservation that guides human decision making that doesn’t guide computers.
Until we account for the agency that comes with accountability, and for the self-preservation instinct that keeps humans from driving someone else onto a light rail track, it’s a false equivalence to say we can’t expect machines to be better than humans. We should expect exactly that, so long as we give them human agency without human accountability, and so long as they lack any instinct to preserve themselves or others.
shputil | 1 month ago
I, as a reasonably normal driver, am not personally at much risk if some other driver decides to drive onto the rails. That won't be true if I'm in a Waymo, where there's nothing I can do about its bugs.
And I don't blame people who are skeptical that Waymo will be properly punished. In fact, do you suppose they were punished here?
xnx | 1 month ago
There's some truth to this, but a machine failure can be patched for all cars at once. There's no effective way to patch a problem across all human drivers.
bicepjai | 1 month ago