(no title)
yathaid | 1 year ago
There are only two objective measurements needed:
- is it making progress towards its goal?
- is it able to acquire capabilities it didn't have previously?
I am not sure if even the first one is objective enough.
Dismissing the argument without stating why you aren't convinced just comes across as a form of AI Luddism.
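Concretely, a rough sketch of what checking those two measurements could look like (all names here are hypothetical placeholders, not any real API):

    def goal_progress(agent, goal_score, history):
        # Measurement 1: is the agent's score on its stated goal better
        # than every score it has achieved so far?
        score = goal_score(agent)
        improved = bool(history) and score > max(history)
        history.append(score)
        return improved

    def new_capabilities(agent, task_suite, previously_solved):
        # Measurement 2: which tasks can the agent solve now that it
        # couldn't solve on any earlier evaluation?
        solved = {name for name, task in task_suite.items() if task(agent)}
        return solved - previously_solved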
randomNumber7|1 year ago
I think something that only learns to reproduce text cannot become an intelligent actor.
It's necessary to act in an environment with feedback.
And while it of course depends on the definition of intelligence, the article is about the Gödel machine, which is a fancy word for an AGI.
ben_w|1 year ago
We don't know the extent of our ignorance about intelligence.
> I think something that only learns to reproduce text cannot become an intelligent actor.
> It's necessary to act in an environment with feedback.
Ok, but text adventures are a thing, so that doesn't rule out learning from text.
And RLHF has humans as part of the environment giving feedback (that's the H and the F in RLHF).
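A minimal sketch of that point, assuming purely hypothetical model/environment objects (none of this is a real library call): even a pure text loop counts as acting in an environment, as long as something, whether the environment or a human rater, scores the output.

    def text_feedback_loop(model, environment, steps=1000):
        # The model only ever sees and produces text, yet it still acts in
        # an environment: each output is scored by the env or a human rater.
        observation = environment.reset()                   # e.g. a text-adventure prompt
        for _ in range(steps):
            action = model.generate(observation)            # text out
            observation, reward = environment.step(action)  # text + feedback in
            model.update(action, reward)                    # learn from the feedback (the F in RLHF)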
whatshisface|1 year ago
ben_w|1 year ago
Sure, you could also say that GPT-4 passing the Bar tells you it can answer the kinds of questions on the Bar exam, without that extending to the kinds of questions actual lawyers need to handle. Goodhart's law still applies, if that was your point?