item 44466439

lend000 | 8 months ago

I'm consistently amazed by how great the first response from o3-pro deep research is, and then consistently disappointed by response number 5 or so if I continue the conversation. Better context management is the most important bottleneck in LLMs, and it seems like a robust solution would involve modifying the transformer architecture itself rather than using context-limited LLMs to manage the context for other LLMs.
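For concreteness, the "LLMs managing context for other LLMs" approach the comment criticizes usually looks something like rolling summarization: once the transcript exceeds a budget, an auxiliary model call compresses older turns before the main model sees them. A minimal sketch, where `summarize` is a hypothetical stand-in for a real summarization call (here just a truncating stub) and `MAX_CONTEXT_CHARS` stands in for a token budget:

```python
MAX_CONTEXT_CHARS = 200  # stand-in for a real token budget

def summarize(text: str, budget: int) -> str:
    """Stub for an auxiliary LLM summarization call.

    A real system would send `text` to a model and ask for a summary
    that fits in `budget`; here we just truncate to keep it runnable.
    """
    if len(text) <= budget:
        return text
    return text[:budget] + "..."

def build_prompt(history: list[str], new_message: str) -> str:
    """Compress the accumulated transcript when it exceeds the budget,
    then append the new user message."""
    transcript = "\n".join(history)
    if len(transcript) > MAX_CONTEXT_CHARS:
        transcript = summarize(transcript, MAX_CONTEXT_CHARS)
    return transcript + "\n" + new_message
```

The failure mode the comment describes falls out of this design: by turn 5 or so the main model is reasoning over a lossy summary of a summary, not the original context, which is why the commenter argues the fix belongs in the architecture rather than in this kind of wrapper.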
