
sdfgdfghj | 3 years ago

Is it safe to assume that anything which comes out of ChatGPT is not novel, but an existing concept, or an existing concept with some nuances?


fellerts | 3 years ago

No, because ChatGPT doesn't understand "concepts". It helps to understand that all it does is take your prompt and generate the words most likely to satisfy the user. That is how it was trained.

That doesn't mean it won't regurgitate preexisting material (ask it to recite the Constitution, for example), but it can definitely connect words in ways that describe novel ideas.
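To make the "most likely words" point concrete: at each step the model turns its output into a probability distribution over the vocabulary, and greedy decoding just picks the argmax. A toy sketch, with a made-up vocabulary and made-up probabilities (nothing here is the real model):

```python
# Toy next-token distribution: the vocabulary and probabilities
# are invented purely for illustration.
probs = {
    "the": 0.05,
    "novel": 0.15,
    "likely": 0.60,   # highest probability, so greedy decoding picks it
    "banana": 0.20,
}

def most_likely_token(distribution):
    """Greedy decoding: return the token with the highest probability."""
    return max(distribution, key=distribution.get)

print(most_likely_token(probs))  # "likely"
```

Real models usually sample from this distribution rather than always taking the argmax, which is where the "creativity" knob comes in.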

int_19h | 3 years ago

It has the capacity to model things behind the words, which is what "understanding concepts" ultimately means. That it is capable of doing so from merely being trained to "predict the next token" is what's remarkable about it.

https://thegradient.pub/othello/

ChatGTP | 3 years ago

Kind of: existing things, with randomness injected at a temperature of 0.8 for "creativity". I'm oversimplifying, of course, but it is kind of like that.
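The "0.8 points of randomness" is temperature sampling: divide the logits by the temperature before the softmax, then sample. Below 1.0 the distribution sharpens (more deterministic); above 1.0 it flattens (more "creative"). A self-contained sketch with invented logits:

```python
import math
import random

def sample_with_temperature(logits, temperature=0.8, rng=random):
    """Softmax sampling with a temperature knob.

    logits: dict mapping token -> raw score (made up here).
    Lower temperature sharpens the distribution; higher flattens it.
    """
    scaled = [score / temperature for score in logits.values()]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]  # subtract max for stability
    total = sum(exps)
    probs = [e / total for e in exps]

    # Inverse-CDF sampling over the softmax probabilities.
    r = rng.random()
    cumulative = 0.0
    for token, p in zip(logits, probs):
        cumulative += p
        if r < cumulative:
            return token
    return list(logits)[-1]  # guard against float rounding

# Invented toy distribution for illustration only.
logits = {"cat": 2.0, "dog": 1.5, "xylophone": 0.1}
print(sample_with_temperature(logits, temperature=0.8))
```

At a temperature near zero this collapses to greedy decoding (almost always "cat" here); crank it up and "xylophone" starts showing up more often.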

I really recommend trying to get through this if you can: https://writings.stephenwolfram.com/2023/02/what-is-chatgpt-...

It raises some questions for me about how far this current model can actually scale or evolve without becoming too far out to be useful to humans. But it can (sometimes) connect things in ways that people have likely never seen before.

Maybe if you just clicked the button all day, you'd wind up with some really impressive new games people have never seen before.

mercer | 3 years ago

yes, just like everything that is novel

azatom | 3 years ago

It depends on how good your layer is.

If the internet is the American continent, where lots of humans work, you could call the colonization a genocide of Native Americans, or stealing the land "with honor".

kqr | 3 years ago

Can we ask the same thing about humans?