
JaDogg | 1 year ago

I think LLM web applications need a big red warning (non-interactive, I don't want more cookie dialogs), like on cigarette packs.

> LLM-generated content needs to be verified.

becquerel | 1 year ago

Every LLM web app I have used has a disclaimer along these lines prominently featured in the UI. Maybe the disclaimer isn't bright red with GIFs of flashing alarms, but the warnings are there for the people who would pay attention to them in the first place.

minimaxir | 1 year ago

Unfortunately, even after two years of ChatGPT and countless news stories about it, people still don't realize that LLMs can be wrong.

Maybe there should be a bright red flashing disclaimer at this point.