anonymousDan | 3 months ago

I don't understand why people blame AI for buying up DDR5 DRAM — aren't they mostly interested in HBM? Or is fab capacity that previously made DDR DRAM being diverted to manufacturing HBM?

mistercheph | 3 months ago

Inference: you don't need GPUs for inference. Frontier labs are eking out progress by scaling up inference-time compute, since pre-training scaling has largely stalled and is giving diminishing returns (for now).