top | item 32852648

turkishmonky | 3 years ago

GPT-3 Codex has a context limit of 8k tokens, which is roughly 32k characters - so you are limited to at most that much context.

Given the time it takes to return results with that much context, I wouldn't be surprised if they generally limit context to less than that, though.
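The 8k-token / 32k-character figure implies the common rough rule of ~4 characters per token. A minimal sketch of budgeting a prompt against that limit, assuming the constants and helper names below (they are illustrative, not part of any official API):

```python
# Rough prompt-budget check using the ~4 characters per token heuristic
# implied by the comment (8k tokens ~= 32k characters).
MAX_CONTEXT_TOKENS = 8_000   # assumed context limit from the comment
CHARS_PER_TOKEN = 4          # rough average for English text and code

def estimate_tokens(text: str) -> int:
    """Approximate token count from character length (heuristic only)."""
    return (len(text) + CHARS_PER_TOKEN - 1) // CHARS_PER_TOKEN

def fits_in_context(prompt: str, reserved_for_completion: int = 256) -> bool:
    """Check whether a prompt likely fits, leaving room for the completion."""
    return estimate_tokens(prompt) + reserved_for_completion <= MAX_CONTEXT_TOKENS

print(fits_in_context("x" * 30_000))  # 7,500 tokens + 256 reserved -> fits
print(fits_in_context("x" * 32_000))  # 8,000 tokens + 256 reserved -> too big
```

A real tokenizer (e.g. a BPE tokenizer) would give exact counts; the heuristic only tells you when you are safely under or clearly over the limit.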
