top | item 40443723


psynister | 1 year ago

Check out Ollama; it's built to run models locally. Llama 3 8B runs great locally for me, while 70B is very slow. Plenty of options.
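A minimal sketch of the workflow the comment describes, assuming Ollama's standard CLI and its published `llama3:8b` / `llama3:70b` model tags (your hardware and model availability may differ):

```shell
# Install Ollama via the official install script (macOS/Linux)
curl -fsSL https://ollama.com/install.sh | sh

# Download the 8B model weights, then start an interactive session
ollama pull llama3:8b
ollama run llama3:8b

# One-off prompt instead of an interactive session
ollama run llama3:8b "Summarize the benefits of running models locally."

# The 70B variant works the same way but needs far more RAM/VRAM,
# which is why it runs slowly on typical consumer hardware
ollama run llama3:70b
```

The 8B model fits comfortably on machines with roughly 16 GB of RAM, whereas 70B generally needs a high-memory workstation or GPU to be usable.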
