top | item 43742982
or_am_i | 10 months ago

The analogy would have been correct if prompting didn't influence the output (which I hope you agree is not the case).

And yes, the model keeps changing under you -- much like a horse changing under a jockey, forcing them to adapt. Or like Formula 1 drivers switching between different car brands.

You can absolutely improve the results by experimenting with prompting, by building a mental model of what happens inside the "black box", by learning what kinds of context it has or doesn't have, how (not) to overburden it with instructions, etc.
