I think that's probably the shim I was referring to. It has a hardcoded context length, but either it's implemented incorrectly, Anthropic ignores it, or it's on openwebui to manage the window and it just doesn't; not sure. I noticed conversations kept getting slow, so I was starting new ones to work around it. Eventually I got suspicious and checked: I'd burned through almost $100 within a few hours. LibreChat isn't as nice in some areas, but it's much more efficient in this regard.
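To illustrate why this gets expensive: if the client resends the entire conversation history as input on every turn instead of trimming it, total input tokens grow quadratically with the number of turns. A rough back-of-envelope sketch, with entirely hypothetical numbers (500 tokens per message, $15 per million input tokens; actual prices depend on the model):

```python
# Hypothetical figures for illustration only.
TOKENS_PER_MESSAGE = 500
PRICE_PER_INPUT_TOKEN = 15 / 1_000_000  # $15 per million input tokens (assumed)

def total_input_tokens(turns: int) -> int:
    # On turn k the client resends all k messages so far,
    # so the total across a conversation is 1+2+...+turns messages.
    return sum(k * TOKENS_PER_MESSAGE for k in range(1, turns + 1))

for turns in (10, 50, 200):
    toks = total_input_tokens(turns)
    print(f"{turns:>3} turns: {toks:>10,} input tokens ≈ ${toks * PRICE_PER_INPUT_TOKEN:,.2f}")
```

At these assumed rates a long 200-turn session lands in the ~$150 range from input tokens alone, which is why an unmanaged context window drains credits so fast.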
wkat4242|1 year ago
And no, it doesn't cost extra credits, isn't ignored, and doesn't have a hardcoded context length. It works perfectly.
emptiestplace|1 year ago
Also, it's pretty easy to find unresolved bugs related to openwebui not handling context length parameters correctly. I believe I actually read something from the author saying that this parameter is effectively disabled (for non-local LLMs, maybe?).