
Show HN: Claude Memory – Long-term memory for Claude

76 points| deshraj | 1 year ago |github.com

23 comments


twothamendment|1 year ago

I was really hoping that this was about Claude remembering that it has already crawled every page we have and downloaded every image many times over!

ggnore7452|1 year ago

Side note: I feel like ChatGPT's long-term memory isn't implemented properly. If you check the 'saved memories,' they are just bad.

deshraj|1 year ago

100% agree. I've seen similar issues with both the quality and performance of ChatGPT's memory feature.

Shameless plug: We have been working on this problem at Mem0 to solve the long-term memory problem with LLMs. GitHub: https://github.com/mem0ai/mem0

imranq|1 year ago

Nice. I think in the future this could be way better if everything was local and didn't require an API key. As far as I can tell, Mem0 is a fancy retrieval system. It could probably work pretty well locally with simpler models.
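The "fancy retrieval system" framing can be sketched in a few lines: store past snippets, then pull back the most similar ones for a new prompt. This is a toy stdlib-only illustration of the idea (bag-of-words cosine similarity instead of real embeddings), not Mem0's actual API; the class and method names are made up for the example.

```python
from collections import Counter
import math

class MemoryStore:
    """Toy long-term memory: store text snippets, retrieve the most
    similar ones for a new prompt via bag-of-words cosine similarity."""

    def __init__(self):
        self.memories = []  # list of (original text, token-count) pairs

    def add(self, text):
        self.memories.append((text, Counter(text.lower().split())))

    def search(self, query, top_k=2):
        q = Counter(query.lower().split())

        def cosine(a, b):
            dot = sum(a[t] * b[t] for t in a)
            na = math.sqrt(sum(v * v for v in a.values()))
            nb = math.sqrt(sum(v * v for v in b.values()))
            return dot / (na * nb) if na and nb else 0.0

        ranked = sorted(self.memories, key=lambda m: cosine(q, m[1]), reverse=True)
        return [text for text, _ in ranked[:top_k]]

store = MemoryStore()
store.add("User prefers Python over JavaScript")
store.add("User is allergic to peanuts")
recalled = store.search("does the user prefer python or javascript", top_k=1)
# → ["User prefers Python over JavaScript"]
```

A real system would swap the word counts for embeddings from a local model, which is roughly what "work locally with simpler models" amounts to.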

deshraj|1 year ago

Yes, you can run Mem0 locally since we have open-sourced it, but it would need some more work to get a server up and running so it can interact with Claude. GitHub: https://github.com/mem0ai/mem0

chipdart|1 year ago

I'm a long time Claude user.

Instead of long-term memory, I'd be happy if it had short-term reliability. I've lost count of the number of times this week that Claude failed to process prompts because it was down.

Tostino|1 year ago

Completely agree on the reliability front...but I don't think mentioning it on some guy's 3rd party GitHub project is going to help all that much with that.

kromem|1 year ago

Are you using mobile?

I've noticed a bug where long conversations time out on new sends on mobile because of processing time, but in reality the prompt is sent and responded to; it just doesn't show up until you leave and return to the conversation.

pigeons|1 year ago

How long has Claude been around? I didn't know there were long-time users.

quantadev|1 year ago

I always wonder what the heck people are thinking when they invent some cool AI feature and implement it for one specific LLM, since we already have the technology/libraries to make almost anything work with almost any LLM. (For you pedantic types, feel free to point out the exceptions.)

Personally I use LangChain/Python for this; that way any new AI feature I create works across ALL LLMs, and my app just lets the end user pick the LLM they want to run on. Every feature I have works on every LLM.

BoorishBears|1 year ago

I wonder what the heck you're going on about when this is literally a Chrome extension that hooks into the DOM of a specific LLM's frontend.

Doubly baffling since the underlying project does support multiple LLMs, and this is clearly just a showcase piece.

decide1000|1 year ago

Where can I download the Firefox extension?

deshraj|1 year ago

It only supports Chrome for now. I built this quickly in a few hours to solve my own problem. Happy to accept contributions to the repository if someone builds it.

shmatt|1 year ago

Just ask Claude to convert it
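For context on how small that conversion usually is: Firefox reads the same WebExtensions `manifest.json` format as Chrome, and the main required addition is a `browser_specific_settings.gecko` block (Firefox also exposes the API under the `browser` namespace, while Chrome uses `chrome`). A sketch of the manifest additions, with a placeholder extension ID:

```json
{
  "manifest_version": 3,
  "browser_specific_settings": {
    "gecko": {
      "id": "claude-memory@example.com",
      "strict_min_version": "109.0"
    }
  },
  "background": {
    "scripts": ["background.js"]
  }
}
```

The `id` and version shown are illustrative; the DOM-hooking content script itself would typically run unchanged.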