item 42927854

RandomBK | 1 year ago

Yup. I was referring to the 1.58-bit quant, which seemed to perform alright and would be the smallest real-DeepSeek model. That requires ~140GB, which is just barely doable on a 128GB RAM + 24GB VRAM setup, plus a lot of patience. Others have made it work with 64GB RAM and a fast SSD.

The true minimally-quantized DeepSeek experience will need one or possibly two 8xH100 nodes, so well upwards of $100K in CapEx.
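Both size figures can be sanity-checked with back-of-the-envelope arithmetic (a rough sketch; the ~671B parameter count for DeepSeek R1, the ~1.58 bits/weight average, and the 80GB-per-H100 figure are assumptions, and real deployments need extra room for KV cache and activations):

```python
# Rough model-size arithmetic: bytes ~= parameter_count * bits_per_weight / 8.
# Assumptions: DeepSeek R1 has ~671B parameters; the dynamic quant averages
# ~1.58 bits/weight; one H100 carries 80 GB of HBM.
PARAMS = 671e9

# 1.58-bit quant: weights alone land near the ~140GB figure above.
quant_gb = PARAMS * 1.58 / 8 / 1e9
print(f"1.58-bit weights: ~{quant_gb:.0f} GB")  # ~133 GB

# "Minimally quantized" (FP8, 8 bits/weight): weights alone already exceed
# one 8xH100 node's 8 * 80 = 640 GB, hence one to two nodes.
fp8_gb = PARAMS * 8 / 8 / 1e9
node_gb = 8 * 80
print(f"FP8 weights: ~{fp8_gb:.0f} GB vs {node_gb} GB per 8xH100 node")
```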
