andreashaerter | 6 months ago

Same for me. Big nope.

I'm investing quite a lot of time figuring out how to host LLMs locally at comparable quality without spending too much money (smaller environments will never have a large budget for this).

tygra | 6 months ago

Obviously you need decent hardware to run LLMs locally, but you don’t need a super high-end computer just to host qwen3:30b or gpt-oss:20b. Those models are already pretty solid for writing and coding.
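As a rough sketch of what that looks like in practice (assuming Ollama is installed and serving locally; the model tags match the names above, but availability in the registry is an assumption):

```shell
# Pull and run a mid-sized open model locally with Ollama
# (assumes the `ollama` CLI is installed and the ollama server is running).
ollama pull qwen3:30b     # downloads the model weights once
ollama run qwen3:30b "Summarize the tradeoffs of local LLM hosting"

# The same server exposes an HTTP API on localhost:11434,
# so other tools on the machine can query the model too:
curl http://localhost:11434/api/generate \
  -d '{"model": "qwen3:30b", "prompt": "Hello", "stream": false}'
```

A 30B-class model quantized to 4 bits fits in roughly 20 GB of memory, which is why a single consumer GPU or a unified-memory machine can be enough rather than server-grade hardware.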