top | item 32767864

Tenobrus | 3 years ago

The main interesting thing about GPT-3 is that prompt engineering turns it into an incredibly general tool. This summary was likely generated with a prompt something like:

Here's an article about x: <text of sample article>

A quick summary of the article, focusing on the main relevant points and keeping critical detail: <sample summary>

as an example, then repeating that template with the real article and leaving the summary blank for the model to complete.

That's the "few shot learning" from the original paper. It could also be that the model is good enough at summarizing specifically that you don't need an example, just the right framing prompt around the article text. Either way, that kind of prompt engineering is how you get "text completion" to perform basically any text processing task, generate code, play tic-tac-toe, and so on.
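A minimal sketch of that prompt assembly in Python (the function name and exact template wording are illustrative assumptions, not the commenter's actual setup):

```python
def build_few_shot_prompt(example_article, example_summary, article):
    """Build a one-shot summarization prompt of the kind described above.

    The example article/summary pair demonstrates the task; the real
    article is then appended in the same template with the summary left
    blank, so the model's text completion becomes the summary.
    """
    template = (
        "Here's an article: {article}\n\n"
        "A quick summary of the article, focusing on the main relevant "
        "points and keeping critical detail: {summary}"
    )
    example = template.format(article=example_article,
                              summary=example_summary)
    # Repeat the pattern with the real article and an empty summary slot.
    query = template.format(article=article, summary="")
    return example + "\n\n" + query
```

The same scaffolding works for other tasks (translation, code generation, tic-tac-toe moves): only the template text and the demonstration pair change.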
