jlei523 | 2 years ago
Also, it seems like the letter was faked?
Anyways, it doesn't matter if it's faked. It does make people notice that it's a real problem now.
The general consensus among data scientists on when each AI breakthrough will happen has almost always been wrong. AI breakthroughs tend to happen earlier than the consensus predicts. I don't see AGI predictions as any different.
sph | 2 years ago
We need to solve AGI now? Don't get me wrong, it would be an incredible scientific breakthrough, but I cannot see how AGI would solve our current societal and human problems. In fact, it would just upturn our entire world.
AGI is a tool we want, not the solution we need.
And I agree with OP anyway. It is pure hype. Thinking we are closer to AGI because of LLMs is like thinking we are closer to the Moon because we have summited Everest.
jlei523 | 2 years ago
No, we need to solve the problems of AI alignment, regulation, etc. now.
https://arxiv.org/pdf/2209.00626.pdf