top | item 46923170

gdad | 23 days ago

I experienced this as well. The built-in memory features are a black box: you can't see what's stored, you can't control when it's recalled, and it pulls random, irrelevant examples from past conversations into unrelated ones. I also use multiple models, so it was frustrating that they can't talk to each other.

The .md file approach is cool, but like you mentioned, it adds overhead and doesn't scale well.

I have been building something in this space (Maximem Vity [https://maximem.ai]) that tries to solve this more systematically: a cross-LLM, cross-app memory layer that sits in a secure cloud vault. The idea is you control what gets stored and you can summon specific pieces of it into any AI session, granularly. So instead of hoping GPT remembers that you prefer TypeScript and work at a fintech startup, you explicitly pull that context in wherever you need it.
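To make the "explicit pull" idea concrete, here's a minimal sketch of what granular, opt-in recall looks like compared to implicit memory. All names here are hypothetical for illustration; this is not Maximem's actual API.

```python
# Hypothetical sketch: a vault of labeled facts where the caller
# decides exactly which pieces get injected into a session.
# Nothing is recalled implicitly.

class ContextVault:
    """Store labeled facts; inject only what's explicitly requested."""

    def __init__(self):
        self._facts = {}

    def store(self, key, value):
        self._facts[key] = value

    def recall(self, *keys):
        # Only the requested keys come back, in the order asked for.
        return {k: self._facts[k] for k in keys if k in self._facts}

    def as_system_prompt(self, *keys):
        # Format the selected facts as lines for a system prompt.
        return "\n".join(f"- {k}: {v}" for k, v in self.recall(*keys).items())


vault = ContextVault()
vault.store("language", "prefers TypeScript")
vault.store("employer", "works at a fintech startup")
vault.store("hobby", "rock climbing")  # stored, but NOT pulled below

# Summon only the two facts relevant to this session:
prompt = vault.as_system_prompt("language", "employer")
print(prompt)
# - language: prefers TypeScript
# - employer: works at a fintech startup
```

The point of the design is that the selection step is in your hands: the "hobby" fact stays in the vault until a session actually needs it, instead of leaking in at random.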
