
Mike_12345 | 2 years ago

Yes, price tracks quality here, because these models require supercomputing resources to train. GPT-3 required hundreds of Tesla GPUs running for several weeks. That's millions of dollars just for hardware (the GPUs cost about $15k each), not including power.
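
A quick back-of-envelope on the hardware bill, taking the $15k-per-GPU figure from the comment (the GPU counts below are illustrative, not claimed by anyone in the thread):

    # Hardware cost alone, before power, networking, or hosting.
    GPU_UNIT_COST = 15_000  # USD per GPU, figure from the comment above

    for n_gpus in (300, 1_000, 10_000):
        cost_m = n_gpus * GPU_UNIT_COST / 1e6
        print(f"{n_gpus:>6} GPUs -> ${cost_m:.1f}M in GPUs alone")
    # Even 300 GPUs is $4.5M of hardware before any electricity is paid for.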

sashank_1509 | 2 years ago

You’re right, but I’d just like to add: GPT-3 probably required thousands of GPUs. OpenAI is known to have one of the largest clusters (16k+ A100 GPUs), and most of them were used for its major model training runs.
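
A rough sanity check on the scale, as a Python sketch. The inputs are assumptions not stated in this thread: GPT-3's reported training compute of ~3.14e23 FLOPs (3,640 petaflop/s-days, per the GPT-3 paper), a V100 mixed-precision tensor-core peak of 125 TFLOPS, and ~30% effective utilization, which is a typical ballpark for large-scale training:

    # How many GPU-days does ~3.14e23 FLOPs of training take?
    TOTAL_FLOPS = 3.14e23  # reported GPT-3 training compute (GPT-3 paper)
    PEAK_FLOPS = 125e12    # assumed V100 tensor-core peak, FLOP/s
    UTILIZATION = 0.30     # assumed effective utilization at scale

    gpu_days = TOTAL_FLOPS / (PEAK_FLOPS * UTILIZATION) / 86_400
    for n_gpus in (500, 1_000, 3_000):
        print(f"{n_gpus:>5} GPUs -> ~{gpu_days / n_gpus:.0f} days")
    # 500 GPUs -> ~194 days; 3,000 GPUs -> ~32 days. Under these assumptions,
    # "several weeks" of training points at thousands of GPUs, not hundreds.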