item 39683055

robmck | 1 year ago

We totally hear you and have been thinking through the best way to support this!

Question for you: would the app running locally be sufficient, or would the LLM also need to run on-premise?

afro88 | 1 year ago

Not OP, but I would be happy with the app running locally and providing an enterprise OpenAI API key.

robmck | 1 year ago

This is a great piece of feedback, and getting just this to work would not be too difficult. Thanks so much!

internet101010 | 1 year ago

Not the person you responded to, but for me it needs to be able to connect to the Ollama API.
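Both requests above amount to the same switch: point the client at a different OpenAI-compatible base URL. A minimal sketch of that idea, assuming Ollama's OpenAI-compatible endpoint at its default `http://localhost:11434/v1` (the helper name and model handling are illustrative, not anything the app actually ships):

```python
# Sketch: selecting between a hosted OpenAI backend (with an enterprise
# API key) and a local Ollama server. Ollama exposes an OpenAI-compatible
# API at http://localhost:11434/v1 by default, so the same client code
# can target either backend just by swapping base_url.

def make_client_config(backend: str, api_key: str = "") -> dict:
    """Return connection settings for the chosen backend (illustrative)."""
    if backend == "openai":
        return {"base_url": "https://api.openai.com/v1", "api_key": api_key}
    if backend == "ollama":
        # Ollama ignores the key, but OpenAI client libraries require one,
        # so a placeholder value is conventional.
        return {"base_url": "http://localhost:11434/v1", "api_key": "ollama"}
    raise ValueError(f"unknown backend: {backend}")
```

The resulting dict can be passed straight to an OpenAI-style client constructor, which is why supporting the enterprise-key case and the Ollama case is largely the same piece of work.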