top | item 44833584


6Az4Mj4D | 6 months ago

As I was reading that prompt, it looked like a large blob of if/else case statements


refactor_master | 6 months ago

Maybe we can train a simpler model to come up with the correct if/else statements for the prompt. Like a tugboat.

otabdeveloper4 | 6 months ago

Hobbyists (random dudes who run LLMs locally for roleplay) have already figured out how to "soft-prompt".

This is when you use ML to optimize an embedding vector to serve as your system prompt instead of guessing and writing it out by hand like a caveman.

Don't know why the big cloud LLM providers don't do this.
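The idea is just gradient descent on the prompt embedding while the model stays frozen. A minimal sketch, with a tiny random linear layer standing in for the frozen LLM (all names here — `W`, `target`, the learning rate — are illustrative, not any real API):

```python
import numpy as np

# Toy stand-in for a frozen LLM: a fixed random projection from
# "prompt embedding" space (8 dims) to a 3-way output distribution.
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 8))   # frozen "model" weights — never updated
target = 2                    # the behaviour we want the soft prompt to elicit

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def loss_and_grad(e):
    """Cross-entropy of the frozen model's output, differentiated w.r.t.
    the soft prompt e (not the weights W)."""
    p = softmax(W @ e)
    loss = -np.log(p[target])
    grad = W.T @ (p - np.eye(3)[target])   # d loss / d e
    return loss, grad

e = np.zeros(8)               # the "soft prompt": just a trainable vector
for _ in range(500):
    _, g = loss_and_grad(e)
    e -= 0.1 * g              # plain gradient descent on the embedding

print(softmax(W @ e)[target])  # probability of the desired behaviour
```

A hand-written system prompt can only pick points in embedding space that correspond to actual token sequences; the optimized vector is free to land anywhere, which is why soft prompts can outperform the caveman approach.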

MaxLeiter | 6 months ago

This is generally how prompt engineering works:

1. Start with a prompt

2. Find some issues

3. Prompt against those issues*

4. Condense into a new prompt

5. Go back to (1)

* ideally add some evals too
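The loop above can be sketched as code. `call_llm` is a hypothetical stand-in for a real model call, and the "evals" are plain pass/fail checks on its output:

```python
def call_llm(prompt: str, query: str) -> str:
    # Hypothetical model stub: returns a canned answer, uppercased only
    # when the prompt demands it. A real loop would call an actual LLM.
    answer = "paris"
    return answer.upper() if "UPPERCASE" in prompt else answer

# (query, check) pairs — step 2's "find some issues", made executable.
EVALS = [
    ("Capital of France?", lambda out: out == "PARIS"),
]

def run_evals(prompt):
    return [(q, check(call_llm(prompt, q))) for q, check in EVALS]

prompt = "Answer concisely."                       # 1. start with a prompt
for _ in range(5):                                 # 5. go back to (1)
    failures = [q for q, ok in run_evals(prompt) if not ok]
    if not failures:
        break
    # 3 + 4: prompt against the observed issues, condensed into a new prompt
    prompt += " Respond in UPPERCASE."

print(prompt)
```

The evals are the part that makes the loop converge instead of regress: without them, step 3 can silently break cases that the previous prompt handled.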