
declaredapple | 1 year ago

> companies like OpenAI have had access to large quantities of H100 for a few months now and Sora is being presented

From what I could tell from Nvidia's recent presentation, Nvidia works directly with OpenAI to test its next-gen hardware. IIRC they had some slides showing throughput comparisons between Hopper and Blackwell, suggesting they used OpenAI's workload for testing.

H100s have only been generally available (without a long waitlist) for several months, but all the big players already had them a year ago.

I agree with you, but I think you might be 1 generation behind.

> OpenAI used H100’s predecessor — NVIDIA A100 GPUs — to train and run ChatGPT, an AI system optimized for dialogue, which has been used by hundreds of millions of people worldwide in record time. OpenAI will be using H100 on its Azure supercomputer to power its continuing AI research.

March 21, 2023 https://nvidianews.nvidia.com/news/nvidia-hopper-gpus-expand...


GaggiX | 1 year ago

Very interesting. I guess it does make sense that GPT-4 was also trained on the Hopper architecture.