top | item 44938026

tygra | 6 months ago

Interesting.

If you can run an open-source LLM locally on your own computer, completely offline, and use it for legal, finance, or medical topics, would you still say no to that?

kingstnap | 6 months ago

I can, and I have when I was required to. It was slow and gave worse results than I had hoped, probably because I don't have enough VRAM for the big open-source models, so I was using 8B ones.

I left this bit out because my original comment was getting long, but I think it's important to be respectful of others' privacy wishes. So I didn't use an API when it concerned other people.

tygra | 6 months ago

Nice.

Of course, you need decent hardware to run LLMs locally, but you don’t need a super high-end computer to host qwen3:30b or gpt-oss:20b. You don’t even need a GPU for those models, as long as you’ve got a modern CPU. And they’re already pretty solid for writing and coding.
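For anyone who wants to try this, one common route is Ollama, which serves models locally over a plain HTTP API. A minimal sketch, using the model tags mentioned above (the memory figures are rough estimates and depend on the quantization you pull):

```shell
# Download a local model with Ollama (https://ollama.com).
# The default quantized qwen3:30b weighs in around 20 GB; gpt-oss:20b is smaller.
ollama pull qwen3:30b

# Interactive chat in the terminal, fully offline once the weights are cached:
ollama run qwen3:30b

# Or query the local REST API that Ollama exposes on port 11434:
curl http://localhost:11434/api/generate -d '{
  "model": "qwen3:30b",
  "prompt": "Explain the difference between a Roth and a traditional IRA.",
  "stream": false
}'
```

Once the weights are on disk, nothing in this flow leaves the machine, which is the whole point for sensitive legal, finance, or medical questions.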