top | item 42619013

chychiu | 1 year ago

Despite the flak Nvidia gets for VRAM sizes, 5090 seems to be a decent offer. Can’t say the same about the rest of the line.

FWIW, it appears the best homelab setup is still 2-4x 3090 if you want VRAM for LLMs, but a single 5090 would likely be best in class for prosumers on less VRAM-heavy tasks such as image/video generation or deep RL research.
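The VRAM math behind that comparison can be sketched as a back-of-envelope estimate (the function name is mine; this counts weights only and ignores KV cache and activation overhead, so real usage runs higher):

```python
# Rough VRAM needed to hold an LLM's weights, ignoring KV cache
# and activation memory (actual usage is noticeably higher).
def weights_vram_gb(params_billion: float, bytes_per_param: float) -> float:
    return params_billion * 1e9 * bytes_per_param / 1024**3

# A 70B model at 4-bit quantization (~0.5 bytes per parameter):
vram = weights_vram_gb(70, 0.5)
print(round(vram, 1))  # ~32.6 GB: just over a single 5090's 32 GB,
                       # but comfortable on 2x 3090 (48 GB total)
```

This is why stacked 3090s (24 GB each) remain attractive for larger models even though a 5090 wins on raw per-card speed.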

ehsankia | 1 year ago

I'm curious, what flak are they getting? I assumed they limit gaming GPU VRAM specifically to avoid having those cards snapped up by AI users.