top | item 39093943

ohwellhere | 2 years ago

I think this critique would only be fair if we more completely understood how we ourselves work under the hood.

The "fancy math" is a bunch of non-linear functions that encode something derived from a series of tokens. I believe we're still exploring what the derived vector spaces "mean" for LLMs, with research inspecting smaller, simplified models to try to make sense of it all.
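For the sake of concreteness, here's a toy sketch of what "non-linear functions encoding a series of tokens into a vector" can look like. This is a hypothetical illustration, not how any real LLM is implemented: the vocabulary, random weights, and the choice of averaging followed by tanh are all made up for the example.

```python
import numpy as np

# Toy illustration only (NOT a real LLM): map a token sequence to a
# vector via embeddings and a non-linear function.
rng = np.random.default_rng(0)

VOCAB = {"the": 0, "cat": 1, "sat": 2}          # hypothetical vocabulary
EMBED = rng.standard_normal((len(VOCAB), 4))    # one 4-d vector per token
W = rng.standard_normal((4, 4))                 # hypothetical weight matrix

def encode(tokens):
    """Average the token embeddings, then apply a non-linearity (tanh)."""
    x = np.mean([EMBED[VOCAB[t]] for t in tokens], axis=0)
    return np.tanh(x @ W)                       # the "derived vector"

vec = encode(["the", "cat", "sat"])
print(vec.shape)  # a point in the 4-d derived vector space
```

What such derived vectors "mean", and why nearby vectors tend to correspond to related text, is exactly the open interpretability question mentioned above.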

It's clear that LLMs can "remix" in interesting ways. "ChatGPT, write me a Shakespearean-style sonnet about how LLMs work." (https://pastebin.com/FwzqWJ5W) It's not clear to me that our own creativity and intelligence aren't the same or a similar process.

Beyond that, I'm not convinced there are definitions of meaning, knowledge, creativity, intelligence, etc., that are both useful and not defined solely by output. If I need a solution that doesn't yet exist and something creates it, I'll call that creative, regardless of process.

But I don't know anything about anything. :)

thomastjeffery | 2 years ago

Your argument is centered on a lack of understanding.

How could anyone possibly argue that God and Heaven are fiction? No one can prove they don't exist! I prefer to approach such assertions with skepticism: there is no reason for me to believe the claim is true, so I may as well believe it to be false.

People remix expressions all the time. Any good training corpus will contain countless examples. If a statistical model can recognize the general semantic patterns that are present in human-remixed expressions, then it should be able to repeat them. So that's a thing that both humans and LLMs can do. Does that mean that LLMs are interesting and useful? Yes! Does it mean we are equivalent? I don't think so.

If I could point out specifically what an LLM can't do, then I would be 99% of the way toward building something that can.