EricMausler | 8 months ago
It's easy to forget that the conversation itself is what the LLM is helping to create. Humans will ignore or deprioritize extra information, but they also need it to get a loose sense of what you're looking for. The LLM is much more easily influenced by any extra wording you include, and loose guiding is likely to become strict guiding.
furyofantares | 8 months ago
Maybe not very often in a chat context; my experience is in trying to build agents.