rishabhpoddar | 29 days ago
- Using UUIDs in the prompt (which can happen if you serialise a data structure containing UUIDs into a prompt): just don't use UUIDs, since each one burns dozens of tokens; or if you must, map them to small unique numbers (in memory) before adding them to the prompt.
- Putting everything in one LLM chat history: use sub-agents with their own chat histories, and discard each history once its sub-agent finishes.
- Structure your system prompt to maximize input-cache hits: put all the variable parts of the system prompt towards the end of it, if possible.
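A minimal sketch of the first point. The `UuidShortener` class and its method names are hypothetical, not from the comment; the idea is just to keep an in-memory bidirectional map so UUIDs never reach the prompt and small integers in the model's output can be resolved back:

```python
import uuid

class UuidShortener:
    """Hypothetical helper: maps UUIDs to small sequential integers
    so prompts carry `3` instead of a 36-character UUID string."""

    def __init__(self):
        self._to_int = {}   # uuid string -> small int
        self._to_uuid = {}  # small int -> uuid string

    def shorten(self, u):
        # Assign the next small integer the first time a UUID is seen;
        # repeated UUIDs get the same number, keeping the mapping stable.
        key = str(u)
        if key not in self._to_int:
            n = len(self._to_int) + 1
            self._to_int[key] = n
            self._to_uuid[n] = key
        return self._to_int[key]

    def resolve(self, n):
        # Reverse lookup: recover the original UUID from model output.
        return self._to_uuid[n]
```

You would run `shorten` over the data structure before serialising it into the prompt, and `resolve` over any IDs the model emits.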
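The second point can be sketched like this. `llm_call` is a stand-in for whatever model client you use (an assumption, not a real API); the key property is that the sub-agent's transcript is a local variable that dies when the function returns, so only the compact result lands in the main history:

```python
def run_subagent(task, llm_call):
    # Fresh chat history scoped to this sub-task only.
    history = [
        {"role": "system", "content": "You handle exactly one sub-task."},
        {"role": "user", "content": task},
    ]
    answer = llm_call(history)
    # The full history is discarded here; only the answer survives.
    return answer

def main_agent(tasks, llm_call):
    main_history = []
    for task in tasks:
        result = run_subagent(task, llm_call)
        # Main history grows by one compact result per sub-task,
        # not by the sub-agent's entire transcript.
        main_history.append({"role": "assistant", "content": result})
    return main_history
```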
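And the last point, as a trivial but illustrative sketch (the function and its arguments are made up for illustration). Providers cache the longest unchanged prefix of a prompt, so putting the static instructions first and the variable context (dates, user data, retrieved documents) last means the cached prefix survives across requests:

```python
def build_system_prompt(static_instructions, variable_context):
    # Static text first: the unchanged prefix across requests is what
    # prompt caching can reuse. Variable parts go last so they don't
    # invalidate that cached prefix.
    return static_instructions + "\n\n" + variable_context
```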