mi12-root | 6 months ago

Thanks for the feedback! When you say “custom,” do you mean additional integrations with LLM providers, or more documentation on how to build your own custom integration? If you mean the former, we’re currently focused on stabilizing the API and reaching feature parity with FoundationModels (e.g., adding streaming). After that, we plan to add more integrations, such as Claude, Gemini, and on-device LLMs from Hugging Face.


jdmg94 | 6 months ago

There are no examples or documentation for `CustomLLM`. The README file has examples for `SystemLLM` and `OpenaiLLM`, but there's no way for us to know whether we need to bring in GGUF files, Ollama, Hugging Face, etc.