top | item 40636933

alishobeiri | 1 year ago

Definitely, it is something we are super focused on, as it seems to be a use case that is important for folks. Opening up the proxy server and adding local LLM support is my main focus for today, and I will hopefully update this comment when it is done :)

alishobeiri | 1 year ago

I just added the ability to run the proxy locally: https://github.com/squaredtechnologies/thread/commit/7575b99...

Will update once I add Ollama support too!
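For anyone curious what Ollama support would involve: Ollama exposes a local HTTP API (by default on port 11434), so a proxy mostly just needs to forward chat/completion requests to it. A minimal sketch, assuming Ollama's standard `/api/generate` endpoint and a locally pulled model (the model name here is illustrative):

```python
import json
import urllib.request

# Ollama's default local endpoint for non-streaming generation
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> bytes:
    # Request body per Ollama's /api/generate API; stream=False asks
    # for a single JSON response instead of a token stream
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def generate(model: str, prompt: str) -> str:
    # Forward a prompt to the local Ollama server and return its text
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Since everything stays on localhost, nothing ever leaves the machine, which is exactly the property the organizational-policy folks need.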

anakaine | 1 year ago

Ollama support would be amazing. There's a stack of people in organizations (data-rich places) who would likely love something like this, but who cannot get to OpenAI due to organizational policies.

namin | 1 year ago

Awesome, thank you! I'll check it out.