top | item 45792287

brutus1213 | 3 months ago

I recently got a 5090 with 64 GB of RAM (Intel CPU). I was just looking for a strong model I can host locally; if I had the performance of GPT-4o, I'd be content. Are there any suggestions, or cases where people got disappointed?

bogtog | 3 months ago

GPT-OSS-20B at 4 or 8 bits is probably your best bet. Qwen3-30B-A3B is probably the next best option. There may also exist some 1.7- or 2-bit quantization of GPT-OSS-120B.
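The arithmetic behind those picks is just weights × bits. A rough weight-only sketch (parameter counts rounded, and ignoring KV cache and activation overhead, which add several more GB):

```python
# Rough weight-only memory estimate: params * bits / 8 bytes.
# Parameter counts below are approximate round numbers, not exact.

def weight_gb(params_b: float, bits: float) -> float:
    """Approximate weight memory in GB for a model quantized to `bits` per weight."""
    return params_b * 1e9 * bits / 8 / 1e9

for name, params, bits in [
    ("GPT-OSS-20B @ 4-bit", 20, 4),
    ("GPT-OSS-20B @ 8-bit", 20, 8),
    ("Qwen3-30B-A3B @ 4-bit", 30, 4),
    ("GPT-OSS-120B @ 2-bit", 120, 2),
]:
    print(f"{name}: ~{weight_gb(params, bits):.0f} GB")
```

So the 20B fits a 32 GB card comfortably even at 8-bit, the 30B fits at 4-bit, and a ~2-bit 120B (~30 GB of weights) is right at the edge before you account for cache and context.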

p1esk | 3 months ago

The 5090 has 32 GB of VRAM. Not sure that's enough to fit this model.

svnt | 3 months ago

It should fit enough of the layers to make it reasonably performant.
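The layer-split idea (as implemented by e.g. llama.cpp's `--n-gpu-layers` option) is to keep as many transformer layers as fit in VRAM and run the rest from system RAM. A toy sketch of that arithmetic, with illustrative (not measured) layer counts and sizes:

```python
# Sketch of partial-offload arithmetic: how many layers fit on the GPU?
# Layer count and per-layer size here are illustrative assumptions.

def gpu_layers(n_layers: int, layer_gb: float, vram_budget_gb: float) -> int:
    """Number of layers that fit in the given VRAM budget."""
    return min(n_layers, int(vram_budget_gb // layer_gb))

# e.g. ~30 GB of quantized weights spread over a hypothetical 36 layers,
# with a 28 GB budget on a 32 GB card (leaving headroom for KV cache):
n = gpu_layers(n_layers=36, layer_gb=30 / 36, vram_budget_gb=28)
print(n)  # layers resident on the GPU; the remainder run from system RAM
```

Throughput then degrades roughly in proportion to how many layers fall off the card, since CPU-side layers are bound by system memory bandwidth.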