item 42621845


oogali | 1 year ago

It’s doable; it’s what I use for experimenting.

Ollama + the CodeGPT IntelliJ plugin. The plugin lets you point it at a local instance.
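For anyone wanting to try this, here's a minimal sketch of the Ollama side of the setup. The model name is just an example (any local model works), and the URL is Ollama's default listen address; the CodeGPT plugin's Ollama provider gets pointed at that URL.

```shell
# Pull a code-oriented model (codellama is an example; pick any local model)
ollama pull codellama

# Start the server if it isn't already running
# (the desktop app normally starts it for you)
ollama serve

# Ollama listens on http://localhost:11434 by default; point the
# CodeGPT plugin's Ollama provider at that URL. Quick check that
# the server is up and lists your local models:
curl http://localhost:11434/api/tags
```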


mark_l_watson | 1 year ago

I also use Ollama for coding. I have a 32 GB M2 Mac, and the models I can run are very useful for coding and debugging, as well as data munging, etc. That said, sometimes I also use Claude Sonnet 3.5 and o1. (BTW, I just published an Ollama book yesterday, so I am a little biased towards local models.)

matrix12 | 1 year ago

Thanks for the book!