item 40668952

oelang | 1 year ago

Microsoft recently announced that they run ChatGPT 3.5 & 4 on MI300 on Azure, and that the price/performance is better.

https://www.amd.com/en/newsroom/press-releases/2024-5-21-amd...

sigmoid10|1 year ago

I've used ChatGPT on Azure. It sucks on so many levels; everything about it was clearly dictated by bean counters who see X dollars for Y FLOPS with zero regard for developers. So choosing AMD here would be about par for the course. There is a reason everyone at the top is racing to buy Nvidia cards and pay the premium.

dexterac|1 year ago

"Everyone" at top is also developing their own chips for inference and providing APIs for customers to not worry about using CUDA.

It looks like the price-to-performance of inference workloads gives providers a big incentive to move away from Nvidia.