item 44941409

tygra | 6 months ago

Obviously you need decent hardware to run LLMs locally, but you don’t need a super high-end computer just to host qwen3:30b or gpt-oss:20b. Those models are already pretty solid for writing and coding.
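Those model tags look like Ollama names, so assuming Ollama is the runner (an assumption, the comment doesn't say), a minimal local-hosting sketch would be:

```shell
# Sketch assuming Ollama as the local runner; requires ollama to be
# installed and enough RAM/VRAM for the quantized weights.

# Pull the model and run an interactive one-shot prompt:
ollama pull qwen3:30b
ollama run qwen3:30b "Write a Python function that reverses a string."

# Or call the local HTTP API once the server is up (default port 11434):
curl -s http://localhost:11434/api/generate \
  -d '{"model": "qwen3:30b", "prompt": "Hello", "stream": false}'
```

The same commands work with `gpt-oss:20b` by swapping the model tag.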
