top | item 34812084


hideo | 3 years ago

LLM n00b here.

My 2c - Prompts are the input that you send to LLMs to get them to give you output. In general LLMs are large black boxes, and the output you get is not always great. The output can often be significantly improved by changing the input. Changing the input usually involves adding a ton of context - preambles, examples, etc.

A lot of the work of prompt rewriting is like boilerplate generation. It's very reusable, so it makes sense to write code to generate prompts. Prompt Engine is basically a way of making that prompt-rewriting work reusable.

Code Engine seems to be a way of rewriting prompts for LLMs that generate code in response to text prompts.

Chat Engine is the same for LLMs that generate chat/conversational responses.
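The "boilerplate" idea above can be sketched as a toy prompt builder: a preamble plus few-shot examples get prepended to each new input. This is a hypothetical illustration of the pattern, not the actual Prompt Engine API:

```python
# Toy prompt builder illustrating reusable prompt boilerplate.
# Hypothetical names -- not the real Prompt Engine interface.

def build_prompt(description, examples, user_input):
    """Assemble a full prompt from a preamble, few-shot examples,
    and the new input -- the reusable part of prompt rewriting."""
    parts = [description, ""]
    for inp, out in examples:
        parts.append(f"Input: {inp}")
        parts.append(f"Output: {out}")
        parts.append("")
    parts.append(f"Input: {user_input}")
    parts.append("Output:")
    return "\n".join(parts)

prompt = build_prompt(
    "Convert English to Python one-liners.",
    [("print hello", "print('hello')")],
    "sum a list xs",
)
print(prompt)
```

The point is that only `user_input` changes per request; the description and examples are generated once and reused across every call to the model.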


abc20230215 | 3 years ago

Midjourney does not have contextual memory, but it does have a feature to always add a given suffix to any prompt. I guess this is a more powerful variant of the same sort of concept. I wonder who will "win" - specialised models or a single configurable one...