item 47132001

I turned off ChatGPT's memory

19 points | Garbage | 7 days ago | every.to

13 comments


dinkleberg|7 days ago

I'd be surprised if lots of people don't do exactly this. As soon as the memory features came out, I gave them a quick try and promptly turned them off.

I don't want to be held accountable to all of my previous ideas. I want each conversation to start fresh with the context that I provide. If I am exploring some library in one language stack and then I later want to look into something completely different, I don't want the conversation polluted by what it thinks I want based on the previous discussion.

I suppose for those who use it as a companion the memory is a core element. But when used as a tool it gives a significantly worse experience IME.

prawn|7 days ago

I always use it without logging in ‡, and make a point of starting each fresh chat in a fresh window so its perspective in one conversation isn't tainted by some random line of thinking from before. Can't stand when it makes assumptions about my intentions because I mentioned having kids before, or had talked about a different property or venture or whatever.

‡ I'd tried logging in recently and immediately it started nagging me to upgrade. Went back to using without an account and bizarrely the situation is far better.

0_____0|7 days ago

Yeah, account-level memory is a real mixed bag. I do like Anthropic's project-scoped memory; that actually is useful because you get to decide which chats are relevant to a given problem space.

OutOfHere|7 days ago

Leaving memory enabled risks introducing bias and baggage from past chats and experiences that are better left behind. However, it then becomes my responsibility to tell ChatGPT everything it needs to know in every chat.

Note that there are other sources of bias besides memory. For example, for each Custom GPT, OpenAI staff insert their own custom instructions that apply at a higher priority and are often detrimental to the output, thereby sabotaging the Custom GPT. The Custom GPTs would be better off without any of it. I have had multiple Custom GPTs that initially worked flawlessly get sabotaged by OpenAI in this way.

sshine|7 days ago

Memories inside projects are a boon: if you can assume that all your questions are related, and you instruct the engine which memories to store and how to store them (short single statements), you get a powerful tool out of it. I use it for habit tracking. The system prompt takes the style of Atomic Habits. It remembers exactly which habits I'm trying to add, what my preferences are for apps, when I sleep, and what other goals I have unrelated to my current query.

theshrike79|7 days ago

Scoped memory is a good thing. I _want_ ChatGPT to cross-reference stuff when I'm in my Home Assistant project so I don't need to keep re-explaining the basic setup every time.

But when I do a one-off question, I don't want it assuming stuff because of something I discussed with it 6 months ago.

sshine|7 days ago

Concrete memories in a generic context suck:

Every time I ask Claude a Kubernetes question, it relates it to that one time I was deploying Kubernetes on VMs. It's not helpful!

Every time I ask Claude a question related to projects, it starts mentioning where I work, as if that helps.

Asking it to forget these concrete memories sometimes just results in it remembering not to mention them.

Thoroughly useless.

I initially disabled it because it's creepy. Then I tried it out, and it just doesn't work well.

dSebastien|7 days ago

I'd rather keep memory and ideas in Markdown files and pick what to include in the context than rely on platform-specific memory features.
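The workflow is simple enough to script. As a minimal sketch (the directory layout, file names, and `build_context` helper are all hypothetical, not anything from the thread), you keep each note as its own Markdown file and concatenate only the hand-picked ones into a block to paste at the top of a fresh chat:

```python
from pathlib import Path

def build_context(notes_dir: str, picks: list[str]) -> str:
    """Concatenate hand-picked Markdown notes into one context block."""
    parts = []
    for name in picks:
        path = Path(notes_dir) / name
        # Label each note with its filename so the model can tell
        # the sources apart inside the pasted context.
        parts.append(f"## {name}\n{path.read_text()}")
    return "\n\n".join(parts)

# Hypothetical usage: paste the result at the start of a new chat.
# context = build_context("notes", ["k8s-setup.md", "project-goals.md"])
```

The point of the approach is exactly the control the comment describes: nothing enters the context unless you name it in `picks`, so one-off questions start clean.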

pllbnk|7 days ago

Not sure about ChatGPT, but Claude does that. You can see what it keeps in memory, edit the memory to your liking, and it also constantly evolves.

Hizonner|7 days ago

Yeah, "memory" just means "context pollution", at least for the general chat interface.

glimshe|7 days ago

First thing I do with every AI. I don't want a buddy, I want a better search and analysis engine.