DubiousPusher | 1 year ago

Pretty good. Despite my high scepticism of the technology, I have spent the last year working with LLMs myself. I would add a few things.

The LLM is like another user. And it can surprise you just like a user can. All the things you've done over the years to sanitize user input apply to LLM responses.
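As a rough sketch of what that means in practice (the function and limits here are made up, not any particular framework): treat the model's text like any other untrusted input before it reaches your pages, logs, or downstream systems.

```python
import html
import re

def sanitize_llm_reply(reply: str, max_len: int = 2000) -> str:
    """Treat the model's text like untrusted user input."""
    # Drop control characters that can corrupt logs or terminals.
    reply = re.sub(r"[\x00-\x08\x0b\x0c\x0e-\x1f]", "", reply)
    # Escape HTML so the reply can't inject markup into a web page.
    reply = html.escape(reply)
    # Cap length so a runaway generation can't flood the UI.
    return reply[:max_len]
```

Same idea as escaping form input: the model is just one more source of strings you didn't write.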

There is power beyond the conversational aspects of LLMs. Always ask, do you need to pass the actual text back to your user or can you leverage the LLM and constrain what you return?
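One hedged sketch of that constraint: ask the model for a structured label rather than prose, then return fixed copy you control. The JSON shape and the `CANNED` table are hypothetical, not any real API.

```python
import json

# Fixed responses the product actually shows; the model never speaks
# to the user directly, it only picks one of these keys.
CANNED = {
    "refund": "Your refund request has been filed.",
    "status": "Here is your current order status.",
    "escalate": "Connecting you with a human agent.",
}

def constrained_reply(llm_output: str) -> str:
    """Parse a model reply like '{"action": "refund"}' and return our
    own text, falling back safely on anything unexpected."""
    try:
        action = json.loads(llm_output).get("action")
    except (json.JSONDecodeError, AttributeError):
        action = None
    # Anything malformed or unrecognized routes to the safe default.
    return CANNED.get(action, CANNED["escalate"])
```

The user never sees raw model text, so there's much less to sanitize and nothing off-brand to apologize for.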

LLMs are the best tool we've ever had for understanding user intent. They obsolete the hierarchies of decision trees and spaghetti logic we've written for years to classify user input into discrete tasks (realizing this and throwing away so much code has been the joy of the last year of my work).
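A minimal sketch of that kind of classifier, with `call_llm` standing in for whatever client you actually use; the label set and prompt are illustrative, and the key point is validating the answer like any other untrusted input.

```python
INTENTS = ["check_balance", "transfer", "dispute_charge", "other"]

def classify_intent(user_message: str, call_llm) -> str:
    """Replace a decision tree with one constrained classification call.
    `call_llm` is any callable taking a prompt and returning text."""
    prompt = (
        "Classify the user's request as exactly one of: "
        + ", ".join(INTENTS)
        + f"\nUser: {user_message}\nLabel:"
    )
    label = call_llm(prompt).strip().lower()
    # The model's answer is untrusted too: anything outside the
    # allowed set collapses to a safe catch-all.
    return label if label in INTENTS else "other"
```

Everything downstream dispatches on a known enum value, which is what lets the old spaghetti logic go.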

Being concise is key and these things suck at it.

If you leave a user alone with the LLM, some users will break it. No matter what you do.

umangrathi|1 year ago

This has been a really interesting read. Agreed that if you leave a user alone with the LLM, someone will break it. Hence we chose to use a large number of templates wherever suitable, rather than giving the LLM free rein to respond.

distalx|1 year ago

In my opinion, using templates can help keep responses reliable. But it can also make interactions feel robotic, diminishing the "wow" factor of LLMs. There might be better options out there that we haven't found yet.

photon_collider|1 year ago

>The LLM is like another user. And it can surprise you just like a user can. All the things you've done over the years to sanitize user input apply to LLM responses.

I really like this analogy! That sums up my experiences with LLMs as well.

Terr_|1 year ago

> The LLM is like another user.

I like to think of LLMs as client-side code, at least in terms of their risk-profile.

No data you put into them (whether via training or prompt) is reliably hidden from a persistent user, who can also force the model to output whatever they want.