namin|1 year ago
This seems cool! Is there a way to try it locally with an open LLM? If you provide a way to set the OpenAI server URL and other parameters, that would be enough. Is the API_URL server documented, so that a mock local one can be created? Thanks.
alishobeiri|1 year ago
Will update once I add Ollama support too!
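The swap the parent comment asks about can be sketched as follows: an OpenAI-compatible local server (Ollama exposes one at `http://localhost:11434/v1`; the host, port, and model name here are assumed defaults, not anything from this project) accepts the same request body as the hosted API, so in principle only the base URL and a dummy API key need to change.

```python
import json

# Assumed local endpoint: Ollama's OpenAI-compatible API on its default port.
API_URL = "http://localhost:11434/v1/chat/completions"

# The request body matches the hosted OpenAI chat-completions schema,
# so a client only needs its base URL (and a placeholder key) redirected.
payload = {
    "model": "llama3",  # any model pulled locally; name is illustrative
    "messages": [{"role": "user", "content": "Hello"}],
}

body = json.dumps(payload)
print(API_URL)
print(body)
```

A mock server for testing would just need to accept this POST body at that path and return an OpenAI-shaped `choices` array.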
westurner|1 year ago
> https://news.ycombinator.com/item?id=38355385 : LocalAI, braintrust-proxy; [and promptfoo, chainforge]
From "Show HN: IPython-GPT, a Jupyter/IPython Interface to Chat GPT" https://news.ycombinator.com/item?id=35580959#35584069