(no title)
ashraful | 3 years ago
Each prompt-response pair is independent: GPT-3 has no real memory of the conversation's history. This means that when you talk to the chatbot, the entire conversation history needs to be included in the prompt to give it context.
For instance, if your first prompt is "Who is the current sitting president of the United States?" and the response is "Joe Biden," and you then ask a follow-up prompt like "How tall is he?", GPT-3 won't understand what you're referring to unless the prior conversation is included in the prompt, like so:
```
User: Who is the current sitting president of the United States?
AI: Joe Biden
User: How tall is he?
AI:
```
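Flattening the history this way can be sketched in a few lines of Python. This is a minimal illustration of the prompt format above, not any particular library's API; the model call itself is omitted, and `build_prompt` is a hypothetical helper name.

```python
# Sketch: flatten a chat history into a single completion-style prompt,
# using the "User:"/"AI:" labels from the example above.

def build_prompt(history, new_question):
    """history is a list of (user_msg, ai_msg) pairs from earlier turns."""
    lines = []
    for user_msg, ai_msg in history:
        lines.append(f"User: {user_msg}")
        lines.append(f"AI: {ai_msg}")
    lines.append(f"User: {new_question}")
    lines.append("AI:")  # trailing cue so the model completes the AI turn
    return "\n".join(lines)

history = [("Who is the current sitting president of the United States?",
            "Joe Biden")]
print(build_prompt(history, "How tall is he?"))
```

Each new turn re-sends everything, which is why cost and context pressure grow with conversation length.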
By doing this, GPT-3 can use the context of the previous conversation to generate an appropriate response. However, as the conversation history gets longer, GPT-3's ability to maintain context weakens, and it may begin to provide irrelevant or nonsensical responses.
There is a soft limit of around 2000 tokens, which translates to approximately 1500 words. When the conversation history becomes too long, GPT-3 will summarize it in a way that preserves the important information. For example, "Who is the current sitting president of the United States?" might become "Who's the president of USA" to save space.
For most users, including the entire conversation history in the prompt is feasible for up to six prompts before context begins to degrade. After that point, it may be necessary to restate previous information or use other methods to maintain context.
While ChatGPT and Bing likely use more advanced methods for retaining conversation history, the limits recently imposed on them suggest the same general principle of maintaining context still applies.
ashraful | 3 years ago
I had a long conversation with ChatGPT, asking it about a book, having it provide plot summaries, and asking about specific characters. After about 40-50 questions, I asked it things like how many questions I had asked in the conversation and what they were. Eventually I got it to print out the questions I had asked, and they had all been shortened (a lot!).