top | item 35342799

jbenjoseph | 2 years ago

>By recycling its own training material, AI will get less meaningful results over time.

I read this a lot, and it sounds intuitive on the surface, but I don't see how it's justified. Everything that exists in the Universe, for example, results from the application of very simple rules. That suggests it is computation, not information, that matters for intelligence and complexity. It ought to be possible to create a superintelligence from a few bytes of training data, given enough compute.
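
The "simple rules" intuition has a concrete instance: Wolfram's Rule 110 elementary cellular automaton is known to be Turing-complete, yet its entire rule table fits in a single byte. A minimal sketch (illustrative only, not from the thread):

```python
# Rule 110: complex, Turing-complete behavior from a one-byte rule table.
RULE = 110  # the whole "program" is this single byte

def step(row):
    """Apply Rule 110 once, treating cells beyond the edges as 0."""
    padded = [0] + row + [0]
    out = []
    for i in range(1, len(padded) - 1):
        # Pack the 3-cell neighborhood into an index 0..7,
        # then look up the corresponding bit of RULE.
        idx = (padded[i - 1] << 2) | (padded[i] << 1) | padded[i + 1]
        out.append((RULE >> idx) & 1)
    return out

def run(width, steps):
    """Evolve from a single live cell; return all rows."""
    row = [0] * width
    row[width // 2] = 1  # a single live cell is the entire "input data"
    history = [row]
    for _ in range(steps):
        row = step(row)
        history.append(row)
    return history

if __name__ == "__main__":
    for row in run(31, 15):
        print("".join("#" if c else "." for c in row))
```

From one live cell and one byte of rule, the pattern grows and never repeats within this window, which is the point the comment is gesturing at: the richness comes from iterated computation, not from the size of the input.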

yencabulator | 2 years ago

Current approaches to ML largely fall in the camp of throwing enough "found data" at the model and hoping for the best. The exceptions are mostly games, like AlphaGo, where the ML can play against itself.

For generative AI, hallucinations will poison the well. And they're not random, so the same or similar hallucinations will pop up all over and reinforce each other.
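
The feedback loop described above can be shown in miniature (a toy sketch, not anyone's actual training setup): repeatedly fit a normal distribution to samples drawn from the previous generation's own fit, and the estimated spread drifts toward zero as each generation's sampling error compounds.

```python
# Toy "model collapse": train each generation only on the
# previous generation's synthetic output and watch diversity shrink.
import random
import statistics

def fit(samples):
    """'Train' a model: estimate mean and stddev from data."""
    return statistics.mean(samples), statistics.pstdev(samples)

def generate(mu, sigma, n, rng):
    """'Generate' a synthetic dataset from the current model."""
    return [rng.gauss(mu, sigma) for _ in range(n)]

def generations(n_samples=20, n_generations=200, seed=0):
    rng = random.Random(seed)
    data = generate(0.0, 1.0, n_samples, rng)  # "real" data, sigma = 1
    sigmas = []
    for _ in range(n_generations):
        mu, sigma = fit(data)
        sigmas.append(sigma)
        data = generate(mu, sigma, n_samples, rng)  # train on own output
    return sigmas

if __name__ == "__main__":
    s = generations()
    print(f"sigma after 1 generation: {s[0]:.3f}")
    print(f"sigma after {len(s)} generations: {s[-1]:.3f}")
```

No step here is biased toward any particular value; the collapse comes purely from estimating a distribution from a finite sample of its own output, generation after generation.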

chongli | 2 years ago

Computers are electrical automatons that process binary signals and little else. Without a human at the end of the chain to interpret these signals (whatever representation they’ve been given), there is no meaning whatsoever.

So an AI that simply recycles its own input ad infinitum might produce something, but it won't be meaningful to us humans. Hence, it's unlikely to be useful apart from the novelty of it.

plokiju | 2 years ago

What is it about humans that makes only them capable of meaning? And how would you define meaning here?