top | item 34620312

jb_s | 3 years ago

I've been trying to figure out how the ChatGPT UI manages to keep the conversation context once it exceeds the limit of what the model can ingest.

I even tried asking ChatGPT :D

gamegoblin | 3 years ago

Its context window is quite large -- 8192 tokens, where a token is roughly 4 characters. But it's quite possible they're using GPT itself to summarize the older parts of the conversation, keeping only the important bits so more of it fits.
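A minimal sketch of what that rolling-summary trick might look like. This is speculation, not anything OpenAI has confirmed: the 4-chars-per-token heuristic is the rough rule of thumb above, and `summarize` is a trivial stand-in for what would really be another call to the model.

```python
def count_tokens(text):
    # Rough heuristic from the comment above: one token ~ 4 characters.
    return max(1, len(text) // 4)

def summarize(messages):
    # Stand-in for "ask GPT to summarize the older turns":
    # here we just keep the first sentence of each turn.
    return " ".join(m.split(".")[0] + "." for m in messages)

def build_context(history, budget=8192):
    """Keep recent turns verbatim; compress older ones into one summary."""
    kept, used = [], 0
    # Walk backwards so the newest turns are preserved verbatim.
    for msg in reversed(history):
        cost = count_tokens(msg)
        if used + cost > budget:
            break
        kept.append(msg)
        used += cost
    kept.reverse()
    older = history[:len(history) - len(kept)]
    if older:
        return ["Summary of earlier conversation: " + summarize(older)] + kept
    return kept
```

So the prompt the model actually sees is always under budget: the newest turns verbatim, everything older collapsed into one summary line. A production version would also have to budget for the summary itself and re-summarize incrementally rather than from scratch.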

jb_s | 3 years ago

I was thinking something like that, bearing in mind that humans can't remember every single detail about a conversation either.