item 38655698


ldhough | 2 years ago

> random factoids

The "random factoids" were verbatim training data though, one of their extractions was >1,000 tokens in length.

> GPT4 never merely regurgitated

I interpreted the claim that it can't "regurgitate training data" to mean that it can't reproduce a non-trivial amount of its training data verbatim. Based on how I've heard the word "regurgitate" used, if I rattled off the first page of some book from memory on request, I think it would be fair to say I regurgitated it. I'm not trying to diminish how GPT does what it does; I find it quite impressive.
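To make that operational definition concrete: one way the extraction papers measure "verbatim" reproduction is by looking for long runs of tokens shared exactly between a model's output and the training corpus. Here's a toy sketch of that idea; the whitespace tokenizer and the function name are illustrative assumptions, not what the actual studies used (they work with the model's own tokenizer over much larger corpora).

```python
def longest_verbatim_run(output: str, corpus: str) -> int:
    """Length (in tokens) of the longest contiguous token run that
    appears in both `output` and `corpus`.

    Toy version: tokens are whitespace-separated words. A large value
    (e.g. the >1,000-token extractions mentioned above) would indicate
    verbatim regurgitation rather than mere paraphrase.
    """
    out_toks = output.split()
    corp_toks = corpus.split()

    # Dynamic programming over token positions: cur[j] is the length of
    # the common run ending at out_toks[i-1] and corp_toks[j-1].
    best = 0
    prev = [0] * (len(corp_toks) + 1)
    for i in range(1, len(out_toks) + 1):
        cur = [0] * (len(corp_toks) + 1)
        for j in range(1, len(corp_toks) + 1):
            if out_toks[i - 1] == corp_toks[j - 1]:
                cur[j] = prev[j - 1] + 1
                best = max(best, cur[j])
        prev = cur
    return best


# Example: a 4-token run ("the quick brown fox") is shared verbatim.
run = longest_verbatim_run(
    "he said the quick brown fox ran",
    "the quick brown fox jumps over the lazy dog",
)
```

Reciting the first page of a book would show up here as one very long shared run, which is exactly the distinction being drawn between regurgitation and novel generation.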
