trashtester|1 year ago
What you call "AI" is usually named AGI. LLMs are already a kind of AI, just not general enough to fully replace all humans.
We don't know whether full AGI can be built with current technology (like transformers) given enough scale, or whether one or more fundamental breakthroughs are needed beyond scale alone.
My hypothesis has always been that AGI will arrive roughly when compute power and model size match the human brain. That means models of about 100 trillion parameters, which is not that far away now.
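As a rough sanity check on that number: a common neuroscience estimate is ~100 trillion synapses in a human brain, and the comment implicitly maps one synapse to one parameter. That mapping is an assumption, not an established equivalence, but a quick back-of-envelope sketch shows what such a model would cost in memory alone:

```python
# Back-of-envelope check of the "100 trillion parameters ~ human brain" claim.
# Assumptions (not established facts): ~1e14 synapses per human brain, and a
# rough one-synapse-to-one-parameter mapping.

SYNAPSES = 100e12           # ~100 trillion synapses (common rough estimate)
PARAMS = 100e12             # model size hypothesized in the comment

BYTES_PER_PARAM_FP16 = 2    # half-precision weights, no optimizer state

weight_memory_tb = PARAMS * BYTES_PER_PARAM_FP16 / 1e12
print(f"fp16 weights alone: {weight_memory_tb:.0f} TB")  # 200 TB

# For scale: GPT-3 had ~175e9 parameters.
scale_vs_gpt3 = PARAMS / 175e9
print(f"~{scale_vs_gpt3:.0f}x the size of GPT-3")
```

Even ignoring training compute, just holding the weights in fp16 would take about 200 TB, several hundred times the size of GPT-3, so "not that far away" depends heavily on how hardware scaling continues.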
chx|1 year ago
We absolutely do know, and the answer is such a resounding no that it's not even funny.