
FINDarkside | 8 months ago

I think it's a very widely accepted definition, and there are really no competing definitions either, as far as I know. While some people might think AGI means superintelligence, that's only because they've heard the term but never bothered to look up what it means.


simonw|8 months ago

OpenAI: https://openai.com/index/how-should-ai-systems-behave/#citat...

"By AGI, we mean highly autonomous systems that outperform humans at most economically valuable work."

AWS: https://aws.amazon.com/what-is/artificial-general-intelligen...

"Artificial general intelligence (AGI) is a field of theoretical AI research that attempts to create software with human-like intelligence and the ability to self-teach. The aim is for the software to be able to perform tasks that it is not necessarily trained or developed for."

DeepMind: https://arxiv.org/abs/2311.02462

"Artificial General Intelligence (AGI) is an important and sometimes controversial concept in computing research, used to describe an AI system that is at least as capable as a human at most tasks. [...] We argue that any definition of AGI should meet the following six criteria: We emphasize the importance of metacognition, and suggest that an AGI benchmark should include metacognitive tasks such as (1) the ability to learn new skills, (2) the ability to know when to ask for help, and (3) social metacognitive abilities such as those relating to theory of mind. The ability to learn new skills (Chollet, 2019) is essential to generality, since it is infeasible for a system to be optimized for all possible use cases a priori [...]"

The key difference appears to be around self-teaching and metacognition. The OpenAI one shortcuts that by focusing on "outperform humans at most economically valuable work", but the others make that ability to self-improve key to their definitions.

Note that you said "AI that will perform on the level of average human in every task" - which disagrees very slightly with the OpenAI one (they went with "outperform humans at most economically valuable work"). If you read more of the DeepMind paper it mentions "this definition notably focuses on non-physical tasks", so their version of AGI does not incorporate full robotics.

bluefirebrand|8 months ago

Doesn't the "G" in AGI stand for "General" as in "Generally Good at everything"?

neom|8 months ago

I think the G is what really screws things up. I thought it meant "as good as the general human", but upon googling, it has a defined meaning among researchers. There appears to be confusion all over the place tho. Three readings are in circulation:

General-Purpose (Wide Scope): It can do many types of things.

Generally as Capable as a Human (Performance Level): It can do what we do.

Possessing General Intelligence (Cognitive Mechanism): It thinks and learns the way a general intelligence does.

So, for researchers, general intelligence is characterized by: applying knowledge from one domain to solve problems in another, adapting to novel situations without being explicitly programmed for them, and having a broad base of understanding that can be applied across many different areas.

adastra22|8 months ago

Yes, but “good at” here has a very limited, technical meaning, which can be oversimplified as “better than random chance.”

If something can be better than random chance in any arbitrary problem domain it was not trained on, that is AGI.