cabalamat | 1 year ago
So if an AI can't change its weights after deployment, it's not really an AI? That doesn't make sense.
As for the other criteria, they're so vague I think a thermostat might apply.
stubish | 1 year ago
A learning thermostat would apply, say one that uses historical records to predict changes in temperature and preemptively adjusts. And it would be low risk and unregulated in most cases. But attach it to a self-heating crib or a premature-baby incubator and it would jump to high risk, and you might have to prove it is safe.
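The "learning thermostat" described above can be sketched in a few lines. This is purely hypothetical code, assuming the simplest possible form of "prediction": a per-hour running average of historical readings, used to switch the heater on before the temperature actually drops.

```python
# Hypothetical sketch of a "learning" thermostat: it records historical
# temperatures keyed by hour of day, forecasts the next reading from the
# running average, and preemptively heats when the forecast is below the
# setpoint. Class and method names are illustrative, not from any product.
from collections import defaultdict


class LearningThermostat:
    def __init__(self, setpoint: float):
        self.setpoint = setpoint
        self.history = defaultdict(list)  # hour of day -> observed temps

    def record(self, hour: int, temperature: float) -> None:
        """Store an observation for later prediction."""
        self.history[hour % 24].append(temperature)

    def predict(self, hour: int) -> float:
        """Forecast the temperature at a given hour from historical averages."""
        readings = self.history[hour % 24]
        if not readings:
            return self.setpoint  # no data yet: assume nothing needs doing
        return sum(readings) / len(readings)

    def heater_on(self, upcoming_hour: int) -> bool:
        """Preemptively heat if the forecast falls below the setpoint."""
        return self.predict(upcoming_hour) < self.setpoint


# Usage: feed in a few days of readings, then ask about the coming hours.
t = LearningThermostat(setpoint=20.0)
for temp in (14.0, 15.0, 13.0):  # three cold mornings at 06:00
    t.record(6, temp)
t.record(14, 24.0)               # one warm afternoon

print(t.heater_on(6))   # cold forecast, so heat preemptively: True
print(t.heater_on(14))  # warm forecast: False
```

Trivial as it is, the same device "learns" from data, which is exactly the kind of criterion the comments above are arguing could sweep it into scope.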
sofixa | 1 year ago
As long as the thermostat doesn't control people's lives, that's fine.
logifail | 1 year ago
Quite.
One wonders if the people who came up with this have any actual understanding of the technology they're attempting to regulate.
zelphirkalt | 1 year ago