When asked for a brief analysis of the community discussion, ChatGPT 3.5 responded:
"It seems like the conversation you provided is discussing the idea of compressing prompts or summaries of larger documents in order to fit them into a smaller context. While the idea has potential, it appears that people have had difficulty reproducing it and that lookup tables and external storage of memory may be more effective in the long term. Additionally, there is discussion about the possibility of greatly increasing context size in the future, with the example given that there was an almost 10x jump between GPT-3.5-turbo and GPT-4 in terms of token capacity."
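As a rough illustration of the "lookup tables and external storage of memory" idea the summary mentions, here is a toy sketch in pure Python (all names and the scoring scheme are hypothetical, not from the discussion): document chunks are kept outside the prompt, and only the chunk most relevant to the current query is pulled into context, instead of compressing the whole document.

```python
def score(query: str, chunk: str) -> int:
    """Toy relevance score: count query words that also appear in the chunk."""
    query_words = set(query.lower().split())
    chunk_words = set(chunk.lower().split())
    return len(query_words & chunk_words)

def retrieve(query: str, memory: dict[str, str]) -> str:
    """Return the externally stored chunk most relevant to the query."""
    return max(memory.values(), key=lambda chunk: score(query, chunk))

# External "memory": chunks of a larger document stored outside the prompt.
memory = {
    "contexts": "GPT-4 expanded the usable context window roughly tenfold over GPT-3.5-turbo.",
    "compression": "Compressed prompts try to pack a document summary into fewer tokens.",
}

# Only the best-matching chunk is inserted into the prompt.
prompt_chunk = retrieve("how big is the context window", memory)
```

A real system would use embeddings and a vector index rather than word overlap, but the shape is the same: retrieval replaces compression when the document does not fit in context.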