circularfoyers | 1 year ago
Do you support local LLMs? I could only see an ENV for OPENAI_API_KEY.
Manik_agg | 1 year ago
Hey, another co-founder of Tegon here. Currently we only support OpenAI models, but we plan to add local model support via Ollama soon.
hrpnk | 1 year ago
There is also COHERE_API_KEY for the vector storage & search: https://github.com/tegonhq/tegon/blob/158b54af8d6f7cf4195c61...
ranger_danger | 1 year ago
llama.cpp can run as an OpenAI-compatible API server, so you would just need to change the base URL in the Tegon source.
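For context on the suggestion above: llama.cpp's bundled server exposes OpenAI-style endpoints such as /v1/chat/completions, so pointing an OpenAI client at it is largely a matter of sending the same request to a different host. A minimal sketch using only the Python standard library; the port 8080 (llama.cpp's server default) and the model name are placeholders, and whether Tegon's base URL is actually configurable is an assumption:

```python
import json
import urllib.request


def build_chat_request(base_url: str, model: str, messages: list) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request against any
    compatible server (a local llama.cpp server, in this sketch)."""
    payload = {"model": model, "messages": messages}
    return urllib.request.Request(
        url=base_url.rstrip("/") + "/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # llama.cpp ignores the API key, but OpenAI clients send one
            "Authorization": "Bearer sk-no-key-required",
        },
        method="POST",
    )


# Point at a local llama.cpp server instead of api.openai.com
req = build_chat_request(
    "http://localhost:8080",  # llama.cpp server's default port (assumption)
    "local-model",            # placeholder model name
    [{"role": "user", "content": "Hello"}],
)
print(req.full_url)  # http://localhost:8080/v1/chat/completions
```

Sending the request with `urllib.request.urlopen(req)` would then return an OpenAI-shaped JSON response from the local server, assuming one is running.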