item 40708759

rajman187 | 1 year ago

Meta is already working on this [1], though I'm not sure it can replace NVIDIA for training large models within that time frame. The ecosystem around NVIDIA's chips is what gives them a huge competitive advantage: not having to build entire libraries and optimize a ton of code goes a long way toward adoption and retention.

N.B.: there are other third-party competitors like Cerebras [2], who offer all-in-one solutions for their giant wafers along with libraries and data centers, but I'm not sure the behemoths would migrate to those offerings either.

[1] https://ai.meta.com/blog/next-generation-meta-training-infer...

[2] https://www.cerebras.net/
