ska|3 years ago
Isn't that why lots of jurisdictions place constraints on the learning driver (e.g. graduated licensing of some sort) and/or visibility requirements (e.g. the car has to display a "learner" sticker of some sort), so that other drivers know?
maximus-decimus|3 years ago
We should require a test of one's ability to drive, against some basic standards, before issuing a license. We should also come up with rules for what happens if someone does not adhere to those standards, as well as a mechanism of enforcement if they violate them.
d23|3 years ago
They probably are to some extent, and you know where the liability would lie if they are responsible should something bad happen. What about this case? There is no sense of responsibility, or realization of the danger they are introducing at scale.
sifar|3 years ago
Even when they actually admit that they have failed at it [0]. I am not sure if they are aware of the doublespeak in this admission.
[0] Failure to realize a long-term aspirational goal is not fraud.
philjohn|3 years ago
I suppose everyone should just assume a Tesla is about to do something silly and drive defensively ...
witheld|3 years ago
Normally, each person puts one new driver on the road per lifetime.
When you beta test a baby driving robot, you're now at two new drivers per lifetime! And the Tesla doesn't seem to be learning faster than a human!