
Ask HN: Are Nvidia H100s that good?

4 points| jonathanlei | 1 year ago

I was playing around with some different GPUs yesterday and put all of the results here: https://www.tensordock.com/benchmarks

I tried a vLLM inference workload and a ResNet training workload. The H100 consistently outperforms the A100 by about 45% to 80%, but it isn't that much faster…

Which workloads would see the biggest speedup? I'm really not seeing 3x+ on vLLM or simple training workloads.
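One way to reason about where a 3x+ gap would show up is a rough roofline estimate: a kernel is limited by either compute or memory bandwidth, and the H100's advantage over the A100 is much larger in peak compute than in bandwidth. The sketch below is illustrative only; the peak numbers are approximate figures from the public datasheets (dense BF16 tensor throughput and HBM bandwidth for the SXM parts), and the FLOP/byte counts for the two workloads are assumptions, not measurements.

```python
# Roofline sketch: kernel time ~= max(compute time, memory-traffic time).
# Approximate datasheet peaks (assumptions, SXM parts, dense BF16):
#   A100: ~312 TFLOPS, ~2.0 TB/s HBM bandwidth
#   H100: ~989 TFLOPS, ~3.35 TB/s HBM bandwidth
A100 = {"flops": 312e12, "bw": 2.0e12}
H100 = {"flops": 989e12, "bw": 3.35e12}

def kernel_time_s(flops, bytes_moved, peak_flops, peak_bw):
    """Time is bounded by whichever of compute or memory traffic takes longer."""
    return max(flops / peak_flops, bytes_moved / peak_bw)

def speedup(flops, bytes_moved):
    """Estimated H100-over-A100 speedup for a kernel with this FLOP/byte mix."""
    t_a100 = kernel_time_s(flops, bytes_moved, A100["flops"], A100["bw"])
    t_h100 = kernel_time_s(flops, bytes_moved, H100["flops"], H100["bw"])
    return t_a100 / t_h100

# Compute-bound kernel (big GEMM, high arithmetic intensity): ~3.2x
print(f"compute-bound: ~{speedup(flops=1e12, bytes_moved=1e9):.1f}x")
# Memory-bound kernel (e.g. per-token LLM decode, low intensity): ~1.7x
print(f"memory-bound:  ~{speedup(flops=1e9, bytes_moved=1e9):.1f}x")
```

Under these assumed peaks, a compute-bound kernel tracks the ~3.2x FLOPS ratio, while a bandwidth-bound one tracks the ~1.7x bandwidth ratio, which is roughly the 45-80% range above. Small-batch vLLM decode is largely bandwidth-bound, so big speedups would need large-batch, compute-heavy work (or FP8, which the A100 lacks).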

5 comments


zer00eyz|1 year ago

You're doing inference, not training.

https://lambdalabs.com/gpu-benchmarks

utopcell|1 year ago

Interesting summary. For throughput/watt, nothing beats A100 40GB PCIe cards. In terms of throughput/$, 4090 cards are >8X better than the best H100.
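The throughput/$ comparison is just measured throughput divided by hardware cost. The sketch below shows the arithmetic; the throughput and price figures in it are placeholder assumptions chosen only to illustrate a >8x gap, not numbers from the parent comment's benchmark.

```python
# Hypothetical illustration of a throughput-per-dollar comparison.
# All figures below are placeholder assumptions, not measurements.

def perf_per_dollar(throughput: float, price_usd: float) -> float:
    """Throughput (any consistent unit, e.g. images/sec) per dollar of hardware."""
    return throughput / price_usd

h100_ppd = perf_per_dollar(throughput=1000.0, price_usd=30000.0)   # assumed
rtx4090_ppd = perf_per_dollar(throughput=450.0, price_usd=1600.0)  # assumed
print(f"4090 is {rtx4090_ppd / h100_ppd:.1f}x the H100 in perf/$ "
      "under these assumed figures")
```

The gap comes almost entirely from the denominator: even at a fraction of the absolute throughput, a card costing ~20x less wins easily on perf/$.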

jonathanlei|1 year ago

Hmm, I did include a training workload as the second chart. My test workload was relatively small, though, so if it spends comparatively less time on the GPU and more on the CPU (which was equal across all runs), that would be an equalizing factor.

But even looking at the Lambda Labs benchmarks, I am surprised that the H100 PCIe barely outperforms the A100 SXM, for example, when it is meant to be the replacement for the A100 PCIe. A 20% generational improvement, sure, but I would have expected more.
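The "equalizing factor" above is just Amdahl's law: if only a fraction of each training step actually runs on the GPU, a faster GPU accelerates only that fraction. A minimal sketch, with illustrative fractions that are assumptions rather than measurements from the benchmarks above:

```python
# Amdahl's-law sketch: end-to-end speedup when only part of each step
# (the GPU fraction) gets faster; the CPU/dataloader part stays fixed.

def overall_speedup(gpu_fraction: float, gpu_speedup: float) -> float:
    """End-to-end speedup when gpu_fraction of runtime speeds up by gpu_speedup."""
    return 1.0 / ((1.0 - gpu_fraction) + gpu_fraction / gpu_speedup)

# A GPU that is 3x faster on a small workload where only 60% of step
# time is GPU work: the run as a whole gets just ~1.67x faster.
print(f"60% GPU-bound: {overall_speedup(0.6, 3.0):.2f}x overall")
# The same 3x GPU on a 95% GPU-bound run: ~2.73x overall.
print(f"95% GPU-bound: {overall_speedup(0.95, 3.0):.2f}x overall")
```

So a small model with a fixed CPU/dataloading overhead can make an H100 look only marginally better than an A100, even when the GPU kernels themselves are much faster.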