larwent | 1 year ago

I’ve been using something similar called Twinny. It’s a VS Code extension that connects to a locally hosted LLM of your choice via Ollama and works like Copilot.

It’s an extra step to install Ollama, so it’s not as plug-and-play as TFA, but the license is MIT, which makes it worthwhile for me.

https://github.com/twinnydotdev/twinny
