top | item 44723561

pulkitsh1234 | 7 months ago

Is there any website to see the minimum/recommended hardware required for running local LLMs? Much like 'system requirements' mentioned for games.

svachalek|7 months ago

In addition to the tools other people responded with, a good rule of thumb is that most local models work best* at q4 quants, meaning the memory for the model weights (in GB) is a little over half the parameter count (in billions), e.g. a 14B model may be about 8 GB. Add some more for context, and you may want 10 GB of VRAM for a 14B model. That will at least put you in the right ballpark for which models to consider for your hardware.

(*best performance-to-size ratio; generally, if a model easily fits at q4, you're better off going to a higher parameter count than to a larger quant, and vice versa)
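The rule of thumb above can be sketched in a few lines of Python. The constants here are my assumptions, not from the thread: roughly 0.57 bytes per parameter at q4 (common q4 GGUF quants average around 4.5 bits per weight) plus a flat ~2 GB for context/KV cache and runtime overhead.

```python
def estimate_vram_gb(params_billion: float,
                     bytes_per_param: float = 0.57,
                     context_overhead_gb: float = 2.0) -> float:
    """Ballpark VRAM (GB) needed to run a q4-quantized model.

    bytes_per_param and context_overhead_gb are rough assumptions;
    real usage varies with quant scheme and context length.
    """
    return params_billion * bytes_per_param + context_overhead_gb

if __name__ == "__main__":
    for b in (7, 14, 32, 70):
        print(f"{b:>3}B model: ~{estimate_vram_gb(b):.1f} GB VRAM")
```

For a 14B model this lands at roughly 10 GB, matching the ballpark given above; it's only a starting point, since long contexts can add several more GB of KV cache.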

nottorp|7 months ago

> maybe you want 10 GB of VRAM for a 14B model

... or if you have Apple hardware with their unified memory, whatever the assholes soldered in is your limit.

CharlesW|7 months ago

> Is there any website to see the minimum/recommended hardware required for running local LLMs?

LM Studio (not exclusively, I'm sure) makes it a no-brainer to pick models that'll work on your hardware.

knowaveragejoe|7 months ago

If you have a Hugging Face account, you can specify the hardware you have, and any given model's page will then show which variants you can run on it.