item 38118312 | rrsp | 2 years ago
What I'm saying is: if it is possible to train transformers to achieve AGI, why hasn't it happened yet? What's the limitation that will be overcome in the next 5 years?
famouswaffles | 2 years ago
Because training takes time (months), money, and hardware. It's not some instantaneous process.
Nobody knows the "magic number" of model size and data needed for "AGI", so people train increasingly large models.
Bigger models are being trained right now, and they will continue to be until they stop getting better.
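The "no known magic number" point can be made concrete with neural scaling laws, which is the closest thing the field has to predicting what bigger models buy you. A minimal sketch, using the Chinchilla-style loss form L(N, D) = E + A/N^α + B/D^β with the fitted constants reported by Hoffmann et al. (2022); the constants and example sizes here are illustrative assumptions, not part of the thread, and the formula predicts pretraining loss only, saying nothing about where "AGI" sits:

```python
def predicted_loss(n_params: float, n_tokens: float,
                   E: float = 1.69, A: float = 406.4, B: float = 410.7,
                   alpha: float = 0.34, beta: float = 0.28) -> float:
    """Approximate pretraining loss L(N, D) = E + A/N^alpha + B/D^beta.

    N = parameter count, D = training tokens. Constants are the
    Chinchilla paper's fitted values (Hoffmann et al., 2022) and are
    used here purely for illustration.
    """
    return E + A / n_params**alpha + B / n_tokens**beta

# Loss keeps falling as models and data grow, but only toward the
# irreducible floor E -- there is no threshold marked "AGI" anywhere.
small = predicted_loss(1e9, 20e9)     # ~1B params, 20B tokens
large = predicted_loss(70e9, 1.4e12)  # ~70B params, 1.4T tokens
print(f"{small:.3f} -> {large:.3f}")
```

This is exactly why labs keep scaling: the curve only tells you that more parameters and more tokens lower the loss smoothly; it can't tell you in advance which capabilities appear at which loss.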