item 41460167

deshraj | 1 year ago

Yes, you can run Mem0 locally since we have open sourced it, but it would need some more work to get a server up and running so it can interact with Claude. GitHub: https://github.com/mem0ai/mem0

Eisenstein | 1 year ago

I think you misunderstood what the parent commenter meant. I believe they were talking about running the AI model itself locally, for example with llama.cpp, koboldcpp, or vLLM.

I checked your documentation, and the only way I can find to run mem0 is with a hosted model. You can use the OpenAI API, which many local backends support, but I don't see a way to point it at localhost. Unless I am missing something, you would need an intermediary service to intercept the OpenAI API calls and reroute them to a local backend.
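To illustrate why "pointing at localhost" is usually all that is needed: the OpenAI chat-completions protocol is plain HTTP + JSON, and local backends such as llama.cpp's server and vLLM expose the same endpoint shape, so a client only has to swap the base URL. This is a hedged sketch of that idea, not mem0's actual API; `build_chat_request`, the port number, and the model names are hypothetical.

```python
import json

def build_chat_request(base_url, model, messages):
    """Build (url, json_payload) for an OpenAI-style /chat/completions call.

    Hypothetical helper: the point is that only base_url changes between a
    hosted provider and a local OpenAI-compatible server.
    """
    url = base_url.rstrip("/") + "/chat/completions"
    payload = json.dumps({"model": model, "messages": messages})
    return url, payload

# Against the hosted OpenAI API:
hosted_url, _ = build_chat_request(
    "https://api.openai.com/v1", "gpt-4o", [{"role": "user", "content": "hi"}]
)
# Against a local vLLM/llama.cpp server (port is an assumption):
local_url, _ = build_chat_request(
    "http://localhost:8000/v1", "local-model", [{"role": "user", "content": "hi"}]
)
```

If a library hard-codes the hosted URL and exposes no base-URL setting, that is exactly when the intermediary/proxy workaround described above becomes necessary.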