gdad | 23 days ago
The .md file approach is cool, but like you mentioned, it adds overhead and doesn't scale well.
I have been building something in this space (Maximem Vity [https://maximem.ai]) that tries to solve this more systematically: a cross-LLM, cross-app memory layer that sits in a secure cloud vault. The idea is that you control what gets stored, and you can summon specific pieces of it into any AI session, granularly. So instead of hoping GPT remembers that you prefer TypeScript and work at a fintech startup, you explicitly pull that context in wherever you need it.
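To make the "summon specific pieces" idea concrete, here's a toy sketch of the pattern (nothing to do with the actual product's implementation; all names here are made up): a vault of tagged facts, where you explicitly select which tags to pull into a session's context instead of relying on the model to remember.

```python
# Toy illustration of granular memory recall: a vault of tagged
# facts, from which you explicitly pull only the pieces relevant
# to the current AI session. All names here are hypothetical.
from dataclasses import dataclass, field


@dataclass
class MemoryVault:
    # maps a memory key to (text, set of tags)
    entries: dict = field(default_factory=dict)

    def store(self, key: str, text: str, tags: list):
        self.entries[key] = (text, set(tags))

    def summon(self, *tags: str) -> str:
        """Return only memories matching ALL given tags, joined as a context block."""
        wanted = set(tags)
        picked = [text for text, t in self.entries.values() if wanted <= t]
        return "\n".join(f"- {p}" for p in picked)


vault = MemoryVault()
vault.store("lang", "User prefers TypeScript.", ["coding"])
vault.store("job", "User works at a fintech startup.", ["work"])
vault.store("diet", "User is vegetarian.", ["personal"])

# Pull only coding-related context into this session's prompt:
print(vault.summon("coding"))  # -> - User prefers TypeScript.
```

The point is the explicit pull: the user (or a tool layer) decides which facts enter the prompt, rather than the model's built-in memory deciding for you.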