top | item 35329737


throwawaytemp29 | 2 years ago

You can run into issues with long-running threads. It only has so much context (I think around 4k words), so it doesn't retain the full context of the thread. My understanding is that it summarizes the thread and resubmits the summary as part of the prompt each time.

It took me a while to figure out why it kept forgetting things from earlier in the thread.

I’m looking forward to when you can have a bigger context or explicitly set some sort of context that is persistent.
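The strategy described above (summarize older turns and resubmit the summary with each prompt) can be sketched roughly like this. The token counter and summarizer here are stand-ins I made up for illustration (word counts as a proxy for tokens, first-sentence truncation as a "summary"); a real system would use the model's actual tokenizer and the model itself to summarize.

```python
BUDGET = 4000  # rough token budget, using a word-count proxy

def rough_tokens(text: str) -> int:
    """Approximate token count: ~4/3 tokens per word (rule of thumb)."""
    return int(len(text.split()) * 4 / 3)

def summarize(turns: list[str]) -> str:
    """Placeholder summarizer: keep only the first sentence of each turn."""
    return " ".join(t.split(".")[0] + "." for t in turns)

def build_prompt(history: list[str], new_message: str) -> str:
    """Drop the oldest turns into a summary until the prompt fits the budget."""
    kept = list(history)
    dropped = []
    while kept and rough_tokens(" ".join(kept + [new_message])) > BUDGET:
        dropped.append(kept.pop(0))  # oldest turns get summarized away
    prefix = ""
    if dropped:
        prefix = f"Summary of earlier conversation: {summarize(dropped)}\n"
    return prefix + "\n".join(kept + [new_message])
```

This also illustrates the failure mode in the comment above: anything that only survives in the lossy summary (or gets dropped from it) is effectively forgotten, even though the model appears to "remember" the thread.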


yosito | 2 years ago

Oh, that's interesting. I hadn't noticed it forgetting context, but I just asked "Do you remember the original prompt I gave you when starting this conversation?" and it said "Yes, the original prompt you gave me was: 'Hello, can you help me convert 0.05 BTC to USD?'". So, yeah, it's definitely forgetting some things.

satvikpendem | 2 years ago

32k words with GPT-4, no? That's what they said in the demo video.

Hedepig | 2 years ago

32k _tokens_. A token is approx. 3/4 of a word, so that's around 24k words.

Yiin | 2 years ago

Only with the API, and only if you're invited to it.

Hedepig | 2 years ago

> think around 4K words

4k _tokens_; 1 token is approx. 3/4 of a word, so it's roughly 3k words.
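The rule of thumb used in both corrections (1 token ≈ 3/4 of a word, i.e. ~4/3 tokens per word) is just back-of-the-envelope arithmetic, and actual counts depend on the tokenizer, but it can be written down as two tiny helpers:

```python
def tokens_to_words(tokens: int) -> int:
    """Estimate words from tokens: 1 token ~= 3/4 of a word."""
    return round(tokens * 3 / 4)

def words_to_tokens(words: int) -> int:
    """Estimate tokens from words: 1 word ~= 4/3 tokens."""
    return round(words * 4 / 3)

# Matches the figures in the thread:
# tokens_to_words(4000)  -> 3000  (the ~4k-token context)
# tokens_to_words(32000) -> 24000 (the GPT-4 32k context)
```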