diffeomorphism | 1 month ago
> Local backend server with full API
> Local model integration (vLLM, Ollama, LM Studio, etc.)
> Complete isolation from cloud services
> Zero external dependencies
Seems open source/open weight to me. They additionally offer a cloud-hosted version.