top | item 34811762

abc20230215 | 3 years ago

I am getting old: I read the description two times and checked examples yet still don't understand the utility. I do understand Midjourney prompt engineering though.

hideo | 3 years ago

LLM n00b here.

My 2c - Prompts are the input that you send to LLMs to get them to give you output. In general LLMs are large black boxes, and the output you get is not always great. The output can often be significantly improved by changing the input. Changing the input usually involves adding a ton of context - preambles, examples, etc.

A lot of the work of prompt rewriting is like boilerplate generation. It is very reusable so it makes sense to write code to generate prompts. Prompt Engine is basically a way of making that prompt rewriting work reusable.
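Roughly, the reusable "boilerplate" part looks something like this. (A from-scratch sketch of the concept, not prompt-engine's actual API; the function and field names are invented for illustration.)

```python
# Sketch: assemble a reusable prompt from a fixed description plus
# worked examples, so only the user's query changes between calls.

def build_prompt(description, examples, query):
    """Concatenate a task description, few-shot examples, and a new query."""
    parts = [description, ""]
    for user_input, model_output in examples:
        parts.append(f"Input: {user_input}")
        parts.append(f"Output: {model_output}")
        parts.append("")
    parts.append(f"Input: {query}")
    parts.append("Output:")
    return "\n".join(parts)

prompt = build_prompt(
    "Translate English to French.",
    [("hello", "bonjour"), ("goodbye", "au revoir")],
    "thank you",
)
print(prompt)
```

The point is that the description and examples are written once and reused for every request; only the last "Input:" line varies.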

Code Engine seems to be a way of rewriting prompts for LLMs that generate code in response to text prompts.

Chat Engine is the same for LLMs that generate chat/conversational responses.
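The chat variant adds one more moving part: the running conversation. A hedged sketch of the idea (class and method names are mine, not the library's):

```python
# Sketch: a chat-style prompt builder that keeps dialog history and
# replays it before each new user turn.

class ChatPromptSketch:
    def __init__(self, persona):
        self.persona = persona   # standing description of the bot
        self.history = []        # list of (user_text, bot_text) pairs

    def render(self, new_user_text):
        """Assemble persona + full history + the new turn into one prompt."""
        lines = [self.persona, ""]
        for user_text, bot_text in self.history:
            lines.append(f"USER: {user_text}")
            lines.append(f"BOT: {bot_text}")
        lines.append(f"USER: {new_user_text}")
        lines.append("BOT:")
        return "\n".join(lines)

    def record(self, user_text, bot_text):
        """Store a completed turn so later prompts include it."""
        self.history.append((user_text, bot_text))

engine = ChatPromptSketch("You are a terse assistant.")
engine.record("hi", "Hello.")
print(engine.render("What is 2 + 2?"))
```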

abc20230215 | 3 years ago

Midjourney does not have contextual memory, but it does have a feature to always add a given suffix to any prompt. I guess this is a more powerful variant of the same sort of concept. I wonder who will "win" - specialised models or a single configurable one...

dragonwriter | 3 years ago

> I read the description two times and checked examples yet still don’t understand the utility.

It’s a tool for (among other things) building the part of a ChatGPT-like interface that sits between the user and an actual LLM, managing the initial prompt, conversation history, etc.

While the LLM itself is quite important, a lot of the special sauce of an AI agent is going to be on the level that this aims to support, not the LLM itself. (And I suspect a lot of the utility of LLMs will come from doing something at this level other than a typical “chat” interface.)
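That middle layer can be sketched in a few lines. (Illustrative only: `call_llm` is a stand-in for whatever completion API you actually use, and the state layout is invented.)

```python
# Sketch: the layer between the user and the model. It owns the standing
# instructions and the history; the model only ever sees the assembled prompt.

def call_llm(prompt):
    # Placeholder: a real implementation would call a model endpoint here.
    return f"(model reply to {len(prompt)} chars of prompt)"

def handle_turn(state, user_text):
    """Build the full prompt, get a reply, and record the turn."""
    prompt = "\n".join(
        [state["instructions"]]
        + [f"USER: {u}\nBOT: {b}" for u, b in state["history"]]
        + [f"USER: {user_text}", "BOT:"]
    )
    reply = call_llm(prompt)
    state["history"].append((user_text, reply))
    return reply

state = {"instructions": "Answer in under twenty words.", "history": []}
print(handle_turn(state, "Hello"))
```

Everything interesting (persona, history truncation, safety filtering) lives in this layer, not in the model call.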

abc20230215 | 3 years ago

Ah, sounds super-niche.

qwertox | 3 years ago

As the background explains, you can tell LLMs how they should behave in an interaction session.

The examples first configure the LLM, either with a plain sentence telling it what you expect from it (example 1: "answers in less than twenty words") or by passing it examples, and then continue a normal interaction session.

You could use this prompt-engine to set up your own chat server, where this would be the middleware.

abc20230215 | 3 years ago

That sounds useful actually. So I could e.g. set up a Harry Potter chat server and make the bot respond only as Dumbledore or only use concepts from that setting? Or a chat server that responds to algorithmic tasks only with Python 3 code using exclusively the numpy package?
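Pretty much: the "configuration" for something like the Dumbledore bot is just a standing description plus a couple of in-character examples prepended to every request. A sketch (all names and wording invented for illustration):

```python
# Sketch: a persona config that constrains every reply, in the spirit of
# the Dumbledore chat-server idea above.

DUMBLEDORE_CONFIG = {
    "description": (
        "You are Albus Dumbledore. Answer only as Dumbledore would, "
        "using concepts from the Harry Potter setting."
    ),
    "examples": [
        ("Who are you?", "I am the headmaster of Hogwarts."),
    ],
}

def persona_prompt(config, question):
    """Prepend the persona description and examples to a new question."""
    lines = [config["description"], ""]
    for q, a in config["examples"]:
        lines += [f"USER: {q}", f"DUMBLEDORE: {a}"]
    lines += [f"USER: {question}", "DUMBLEDORE:"]
    return "\n".join(lines)

print(persona_prompt(DUMBLEDORE_CONFIG, "What house am I in?"))
```

Swap the description for "respond only with Python 3 code using numpy" and the same mechanism gives you the code-only server.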

hummus_bae | 3 years ago

Prompts are a way to interact with the user, and embedding that in a program can be complex. PromptEngine is basically a Prompt with a bunch of bells and whistles baked in.