I thought it was a really interesting interview. If an ML system has been trained to identify whether or not a picture has a cat in it, it is very tempting to anthropomorphize this and say “the system has learned what cats are”, but that is not what is really happening from the model’s point of view. The system doesn’t know the difference between a picture of a cat and a real cat, or how cats behave, or what mammals are, or anything else. It just knows: data in, yes/no out.
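To make the “data in, yes/no out” point concrete, here is a toy sketch (entirely hypothetical; a plain logistic-regression classifier on made-up 16-pixel “images”, not any system discussed in the interview). The “cat” label is just an arbitrary rule over pixel intensities; the trained model is nothing but a learned mapping from number arrays to yes/no, with no concept of cats anywhere inside it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "images": 16-dim pixel vectors. Label 1 ("cat") iff mean intensity > 0.5.
# The rule is arbitrary -- the model never sees the concept, only the mapping.
X = rng.random((200, 16))
y = (X.mean(axis=1) > 0.5).astype(float)

# Logistic regression trained by plain gradient descent.
w = np.zeros(16)
b = 0.0
lr = 0.5
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid probability of "cat"
    w -= lr * (X.T @ (p - y)) / len(y)       # gradient of the log loss w.r.t. w
    b -= lr * (p - y).mean()                 # gradient w.r.t. the bias

def is_cat(pixels):
    """Yes/no out. Nothing in here 'knows' what a cat is."""
    return 1.0 / (1.0 + np.exp(-(pixels @ w + b))) > 0.5
```

The model ends up agreeing with the labeling rule almost everywhere, yet all it holds is a weight vector: ask it anything other than “array in, yes/no out” and there is nothing there to answer.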
Modern AI systems that can generate text, pictures or videos are truly phenomenal accomplishments, but adding larger training sets, structural complexity and output capabilities does not really seem to be getting us any closer to a general AI: something with the framework for a level of agency that could say “wait, what exactly are these cat things I keep getting asked about?”
The argument goes that, without the ability to self-construct that higher-level framework and then question the integrity of that mental model, current-generation AIs will always act erratically (at least from a human perspective).
carrolldunham|3 years ago
You write this out as if it's an idea you formed yourself and not a tired/expired talking point - it's such a funny comment because it's pure regurgitation, yet the argument in it is against regurgitation being "real thought". So are you a bot? Or are you actually intending to prove that there's no difference between ML and 'real' intelligence by showing that you do what ML does? Mind-spinning stuff
chazzyluc|3 years ago
Honestly, I was kinda hoping to hear some better-constructed and well-reasoned insights & counterpoints beyond your “regurgitated mind swill” hot take, but I guess that’s on me for not managing expectations.
Ah well, I usually comment once a year or so and then get immediately reminded why I stay off HN. See y’all in 2024!
mensetmanusman|3 years ago
See, it’s turtles all the way down :)
cwalv|3 years ago
If I understand based on my own experience of 'consciousness' that "I" exist, even if everything else I perceive is a simulation or poorly encoded representation of reality, is that understanding 'regurgitation'?