nevon | 6 days ago
There's zero percent chance I would proxy all my LLM calls, API key included, through some third-party service. However, if it were self-hostable, so that I could ensure it can only reach the LLM providers, I could see deploying it behind an LLM provider router. If it actually achieves the advertised token-use reduction, it would be worth paying for, especially in the enterprise. I'm skeptical of using it for product integrations, where prompts are tuned for effectiveness and efficiency, but for ad-hoc usage it probably doesn't matter much if the phrasing affects the results a bit.
christalingx|6 days ago