item 45577079

agubelu | 4 months ago

Even assuming that's the case, everyone's acting like throwing more GPUs at the problem is somehow gonna get them to AGI


atleastoptimal | 4 months ago

Far more is being done than simply throwing more GPUs at the problem.

GPT-5 required less compute to train than GPT-4.5. Data quality, RL, architectural improvements, and so on all contribute to the rate of improvement we're seeing now.

4gotunameagain | 4 months ago

The very idea that AGI will arise from LLMs is ridiculous at best.

Computer science hubris at its finest.