rnjesus | 2 years ago
the good thing is that you can tell it to be more concise, less bot-like, not so apologetic, etc., and it’ll trim down its responses for as long as it can “remember” (whatever the token limit is on persistence). it’ll be great when llms have, by default, a bit of persistent memory to remember things like this
Miraste | 2 years ago
ChatGPT already has this (not sure if it's Plus-only). In your user profile, there's a "Custom Instructions" entry where you can give it persistent prompts.
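Custom Instructions behave roughly like a persistent system message prepended to every conversation. A minimal sketch of that idea, assuming the OpenAI-style chat message format (the `build_messages` helper and its parameter names are hypothetical, just for illustration):

```python
def build_messages(custom_instructions, history):
    """Prepend persistent instructions as a system message,
    mimicking how Custom Instructions are applied to each chat."""
    system = {"role": "system", "content": custom_instructions}
    return [system] + list(history)

# Example: the same preferences get injected into every new conversation.
messages = build_messages(
    "Be concise. Don't apologize.",
    [{"role": "user", "content": "Explain token limits."}],
)
# messages[0] is the persistent system message; the rest is the chat history.
```

Because the instructions travel with every request rather than living in the model, they survive across conversations without relying on the context window.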