0xy | 6 days ago
It is correct to call it mathematically impossible: the people making AGI claims rely on advances that are not even theoretical, that have not yet been discovered, and whose mere possibility many scientists question.
LLMs have hard and soft limits all over the place that prevent AGI. You aren't going to train-and-loop your way to AGI, because the compute does not exist and will not exist.
My 4+ year point was for a single memory fab. Increasing capacity by merely 5% (a generous assumption) takes 4 years and $10bn. At that rate, the path to AGI in the current paradigm starts to look like it costs effectively infinite dollars and takes effectively infinite years of build-out.
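To make the build-out arithmetic concrete, here is a rough back-of-envelope sketch using the figures above (one fab adds ~5% capacity, costs ~$10bn, takes ~4 years). The 10x capacity target and the parallel-build factor are illustrative assumptions, not numbers from this comment:

```python
import math

# Back-of-envelope scaling estimate from the figures above:
# one new memory fab adds ~5% capacity, costs ~$10bn, and takes
# ~4 years to build. Target multiplier and parallelism below are
# illustrative assumptions.
GAIN_PCT_PER_FAB = 5     # % capacity added per fab
COST_BN_PER_FAB = 10     # USD billions per fab
YEARS_PER_FAB = 4        # build time per fab

def scale_out(target_multiplier: int, parallel_fabs: int = 1):
    """Fabs, cost ($bn), and calendar years to reach
    target_multiplier x today's capacity."""
    extra_pct = (target_multiplier - 1) * 100          # 10x -> +900%
    fabs = math.ceil(extra_pct / GAIN_PCT_PER_FAB)
    waves = math.ceil(fabs / parallel_fabs)            # serial build waves
    return fabs, fabs * COST_BN_PER_FAB, waves * YEARS_PER_FAB

print(scale_out(10))      # -> (180, 1800, 720): serial build
print(scale_out(10, 10))  # -> (180, 1800, 72): ten fabs at once
```

Even granting ten fabs built in parallel, 10x capacity is on the order of $1.8 trillion and seven decades under these assumptions, which is the point about "effectively infinite" build-out.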
Even with a transformational efficiency breakthrough, you still have hard limits all over the place. Where are you going to store all the data? Memory constraints again.