item 38224446 (no title)

aschleck | 2 years ago
1979 16-bit TFLOP/s on an H100 is with sparsity. See footnote 2 on https://www.nvidia.com/en-us/data-center/h100/. You should be halving it for non-sparse FLOPs.

YetAnotherNick | 2 years ago
GP is correct. With sparsity it is 3958; 1979 TFLOP/s is without sparsity.

emu | 2 years ago
No, it is not. 3958 is the sparse FP8 number; for the comparison the ancestor post is making, you need to ignore sparsity and compare BF16 FLOPs, not FP8 FLOPs.
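The halving the commenters describe is simple arithmetic; a minimal sketch in Python, assuming the headline H100 SXM datasheet figures (1979 BF16 TFLOP/s and 3958 FP8 TFLOP/s, both quoted with 2:4 structured sparsity):

```python
# Illustrative figures from NVIDIA's public H100 SXM datasheet; the
# headline numbers assume 2:4 structured sparsity, which doubles throughput.
H100_SPARSE_TFLOPS = {"bf16": 1979, "fp8": 3958}

def dense_tflops(sparse_tflops: float) -> float:
    """Halve a with-sparsity figure to get the dense (non-sparse) rate."""
    return sparse_tflops / 2

for fmt, sparse in H100_SPARSE_TFLOPS.items():
    print(f"{fmt}: sparse={sparse} TFLOP/s, dense={dense_tflops(sparse):g} TFLOP/s")
```

So for an apples-to-apples dense BF16 comparison, the relevant number is 1979 / 2 ≈ 990 TFLOP/s, not 1979.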