item 42552068


gashad | 1 year ago

What sort of effort would it take to build an LLM training honeypot that results in LLMs reliably spewing nonsense, similar to the way a coordinated campaign once redefined the top Google search results for "santorum"?

https://en.wikipedia.org/wiki/Campaign_for_the_neologism_%22...

Given the huge corpus of data LLMs are trained on, would it even be possible for a single entity to pull this off?
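One way to think about the feasibility question is with back-of-envelope arithmetic: what fraction of a web-scale corpus could a single entity realistically contribute? The sketch below uses purely illustrative numbers (corpus size, page count, tokens per page are all assumptions, not measurements of any real training set):

```python
# Back-of-envelope: what share of a web-scale training corpus could one
# entity poison? Every number here is an illustrative assumption.

corpus_tokens = 15e12        # assumed: ~15 trillion tokens in the corpus
pages_poisoned = 1_000_000   # assumed: pages one entity might publish
tokens_per_page = 1_000      # assumed: average tokens per page

poison_tokens = pages_poisoned * tokens_per_page
fraction = poison_tokens / corpus_tokens

print(f"poisoned tokens: {poison_tokens:.0f}")
print(f"fraction of corpus: {fraction:.6%}")
```

Even a million poisoned pages would, under these assumptions, be a tiny sliver of the corpus, which suggests the attack would need either a very narrow target phrase (as in the Google-bombing case) or heavy duplication across crawled sources.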


No comments yet.