top | item 45597218


dayvid | 4 months ago

Prompt engineering is a transitory phase. The next step is embedding it into existing tools, so that the 80-90% of common prompt patterns are built into the UI (or into contextual UI designed around how a user actually uses the product).
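A minimal sketch of what "working prompt patterns into the UI" could look like (all names here are hypothetical, just to illustrate the idea): each UI action wraps a canned prompt template, parameterized by the user's current context, so the user never types a prompt at all.

```python
# Hypothetical sketch: UI actions as canned prompt templates.
# The user clicks a button; the app fills in their current selection.

PROMPT_TEMPLATES = {
    "summarize": "Summarize the following text in three bullet points:\n\n{selection}",
    "fix_grammar": "Rewrite the following text with correct grammar, preserving meaning:\n\n{selection}",
    "explain": "Explain the following text to a non-expert:\n\n{selection}",
}

def build_prompt(action: str, selection: str) -> str:
    """Turn a UI action (a button click) plus the user's context
    into the full prompt sent to the LLM."""
    return PROMPT_TEMPLATES[action].format(selection=selection)
```

The LLM call itself is elided; the point is that the prompt is fixed by the product, and only the context varies.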


isoprophlex | 4 months ago

Yeah, if you mean that LLMs are currently used with UIs on top that allow "too much magic", I agree.

Free-form chat is pretty terrible. People just want the thing to (smartly) take actions. One or two buttons that do the thing, no prompting involved, are much less complicated.

gubicle | 4 months ago

The whole point is the prompt (plus a static set of system prompts). If your whole function as a human is clicking one of a set of buttons to trigger an AI action, then you are automatable in a few lines of code (and the AI is supposedly better than you at deciding which button to click anyway).
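The "automatable in a few lines of code" part can be sketched literally (a toy heuristic with hypothetical names, standing in for whatever routing logic or LLM call would pick the button in practice):

```python
# Toy sketch: if the human's only job is picking one of N buttons,
# a trivial router (or the LLM itself) can pick instead.
import re

def pick_action(user_text: str) -> str:
    """Crude stand-in for 'the AI decides which button to click'."""
    if len(user_text.split()) > 100:
        return "summarize"      # long input: probably wants a summary
    if re.search(r"\b(what|why|how)\b", user_text.lower()):
        return "explain"        # looks like a question
    return "fix_grammar"        # default action
```

In a real product this router would itself be a model call, which is exactly the argument: the button layer adds nothing the prompt layer couldn't do.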

There are thousands of wrappers around LLMs masquerading as AI apps for specialized use cases, but the real performance of these apps is bottlenecked only by the underlying LLM, and their UIs generally just get in the way of the direct LLM access/feedback loop.

To work with LLMs effectively, you need to understand how to craft good prompts and how to read and debug the responses.