top | item 43262268

npace12 | 1 year ago

It's not much different from your proxy idea. It's implemented as a transformation between the internal message structure (close to the Anthropic API's) and the OpenAI message spec, and vice versa. Then it calls all the other models using the openai-node client, since pretty much everyone supports that now (OpenRouter, Ollama, etc.).
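A minimal sketch of that kind of transformation, in TypeScript since the comment mentions openai-node. The type shapes and the `toOpenAI` function are illustrative assumptions, not the actual code being described; they only show the two structural differences the comment implies: Anthropic keeps the system prompt outside the message list, and message content is an array of blocks rather than a plain string.

```typescript
// Hypothetical, simplified shapes for illustration only.
type AnthropicMessage = {
  role: "user" | "assistant";
  content: { type: "text"; text: string }[];
};

type OpenAIMessage = {
  role: "system" | "user" | "assistant";
  content: string;
};

// Convert an Anthropic-style request (separate system prompt + block-based
// messages) into an OpenAI-style message list.
function toOpenAI(
  system: string | undefined,
  messages: AnthropicMessage[],
): OpenAIMessage[] {
  const out: OpenAIMessage[] = [];
  // OpenAI expects the system prompt as the first chat message.
  if (system) out.push({ role: "system", content: system });
  for (const m of messages) {
    // Flatten the content-block array into a single string.
    out.push({
      role: m.role,
      content: m.content.map((b) => b.text).join("\n"),
    });
  }
  return out;
}
```

The resulting array could then be passed to openai-node's `chat.completions.create`, pointing `baseURL` at whichever OpenAI-compatible provider (OpenRouter, Ollama, etc.) you want to call.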

haolez | 1 year ago

Cool! The only downside would be keeping your code in sync with upstream, then.