top | item 40914167

circularfoyers | 1 year ago

Do you support local LLMs? I could only see an ENV for OPENAI_API_KEY.

Manik_agg | 1 year ago

Hey, another co-founder of Tegon here. Currently we only support OpenAI models, but we plan to add support for local models via Ollama soon.

ranger_danger | 1 year ago

llama.cpp can run as an OpenAI-compatible API server, so you would just need to change the base URL in the Tegon source.
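
A minimal sketch of that idea: llama.cpp's built-in server exposes an OpenAI-style `/v1/chat/completions` endpoint, so a request normally aimed at api.openai.com can be redirected by changing only the base URL. The port, model name, and server invocation below are illustrative assumptions, not details from the thread.

```python
# Hypothetical sketch: build the same JSON request an OpenAI client would
# send, but aim it at a local llama.cpp server. Assumes the server was
# started with something like:
#   ./llama-server -m model.gguf --port 8080
import json
import urllib.request

LOCAL_BASE_URL = "http://localhost:8080/v1"  # llama.cpp's OpenAI-compatible endpoint

def build_chat_request(prompt: str, base_url: str = LOCAL_BASE_URL) -> urllib.request.Request:
    """Construct an OpenAI-style chat completion request aimed at llama.cpp."""
    payload = {
        # llama.cpp serves whichever model it was launched with,
        # so the model name here is mostly cosmetic.
        "model": "local-model",
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Hello")
# urllib.request.urlopen(req) would then return an OpenAI-style JSON completion
# from the local server, with no OPENAI_API_KEY required.
```

Since the request shape is identical, the only change an application like Tegon would need is the base URL it sends requests to.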