top | item 45492672

j4hdufd8 | 4 months ago

GPUs are also used to speed up inference (the math is virtually the same). You think your ChatGPT queries are running on x86 servers?

ralfn | 4 months ago

But with NVidia's profit margins, do you really think others won't offer competing chips? Google already has their own, for example.

From that perspective, the notion that NVidia will own this AI future while others such as AMD and Intel stand by would be silly.

I'm already surprised it took this long. The NVidia moat might be software, but not anything that warrants these kinds of margins at this scale. Strong price competition on inference hardware seems likely.

wqaatwt | 4 months ago

> You think your ChatGPT queries are running on x86 servers

What makes you think that? Or are all non-Nvidia GPUs x86?