top | item 43275598

edtech_dev | 1 year ago

Running LLMs locally is more of a nice-to-have. If I can run something like DeepSeek-Coder-V2, even if it's a bit slow, I'll be happy.


hn92726819 | 1 year ago

If you have a powerful computer at home, you can also offload your AI work to it. It's still local in the sense that it's your own machine, but it does require network access.
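A minimal sketch of what that setup can look like, assuming the home machine runs an Ollama server on its default port (the LAN address and model name below are placeholders, not anything from the thread):

```python
import json
import urllib.request

def build_generate_request(host: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an HTTP request for Ollama's /api/generate endpoint.

    `host` is the LAN address of the home machine (a placeholder here);
    11434 is Ollama's default port, and the JSON fields (model, prompt,
    stream) follow Ollama's documented generate API.
    """
    url = f"http://{host}:11434/api/generate"
    body = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for a single JSON response instead of a stream
    }).encode("utf-8")
    return urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )

# Point the client at the home box instead of localhost:
req = build_generate_request("192.168.1.50", "deepseek-coder-v2",
                             "Write a hello world in C")
# urllib.request.urlopen(req) would send it; omitted here since it
# needs the server to actually be reachable.
```

Beyond the LAN you would typically put something like an SSH tunnel or VPN in front of it rather than exposing the port directly.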