The excitement isn't about the model's capabilities; it's about how efficiently it was created. One of the major lessons in AI over the last couple of years was that scale mattered: you wanted to throw more and more compute at a problem, and that belief has translated into incredible share prices for Nvidia and incredible investments in data centres and energy generation. If it turns out we didn't actually need such incredible scale to get these results, and we were just missing some fairly basic efficiency optimizations, then the entire investment cycle into Nvidia, data centres and energy generation is going to whipsaw in an incredible way.