top | item 35979075


akio | 2 years ago

> I really don't understand why people ascribe emergent behaviors to these systems such as "theory of mind". It's just randomness. I'm presuming there's just a lot of psycological pareidolia going around.

Ah yes, "just randomness":

----

From the above text, generate these variations:

1) For each word in the text count the number of letters in the word and append the count to the word in parenthesis.

2) Write each word backwards.

3) Write a similarly structured text that argues the opposite of the original text.

----

1) I (1) really (5) don't (4) understand (10) why (3) people (6) ascribe (7) emergent (8) behaviors (9) to (2) these (5) systems (7) such (4) as (2) "theory (6) of (2) mind" (4). It's (4) just (4) randomness (10). I'm (3) presuming (9) there's (6) just (4) a (1) lot (3) of (2) psychological (12) pareidolia (10) going (5) around (6).

2) I yllaer t'nod dnatsrednu yhw elpoep ebircsa tnegrime sroivaheb ot eseht smetsys hcus sa ".dnim fo yroeht" s'tI tsuj ssenmodnar m'I gnimuserp s'ereht tsuj a tol fo lacigolohcysp ailodierap gniodnuora

3) I absolutely comprehend how people can accurately identify emergent behaviors in these systems like "theory of mind". It's far from random. I'm convinced that these are genuine insights being recognized.


re | 2 years ago

Did you notice that it made multiple errors for the first two variations? [Edit:] What's particularly interesting to me is the inconsistent counting of apostrophes for #1, and the handling of "theory of mind" for #2 -- though there are simpler errors besides these.
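The miscounts are easy to verify mechanically. A minimal sketch (the letters-only counting rule, dropping apostrophes, is my assumption -- which is exactly the ambiguity being pointed at):

```python
import re

def letter_count(word):
    # Count alphabetic characters only; apostrophes and punctuation dropped.
    return len(re.sub(r"[^a-z]", "", word.lower()))

# Counts the model claimed in variation #1, vs. what this rule gives.
claimed = {"really": 5, "don't": 4, "It's": 4, "psychological": 12}
for word, n in claimed.items():
    actual = letter_count(word)
    flag = "" if actual == n else "  <-- mismatch"
    print(f"{word}: claimed {n}, actual {actual}{flag}")
```

Under this rule "don't" is 4 and "It's" is 3, yet the model gave 4 for both -- the apostrophe was counted once and dropped the other time. And "really" is 6 letters, not 5.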

soulofmischief | 2 years ago

It has to do with how text is tokenized: the model sees discrete multi-character groups, not individual letters. One way to work around this is to ask the model to break each word into individual spaced characters as an intermediate step.
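A toy illustration of that (the merge list is invented for the example, not a real BPE vocabulary):

```python
def toy_tokenize(word):
    # Pretend the vocabulary has merged these common fragments,
    # loosely in the spirit of BPE. Hypothetical merges, not real.
    merges = ["under", "stand", "ness", "random"]
    tokens, rest = [], word
    while rest:
        for m in merges:
            if rest.startswith(m):
                tokens.append(m)
                rest = rest[len(m):]
                break
        else:
            tokens.append(rest[0])
            rest = rest[1:]
    return tokens

# Token view: "understand" is 2 tokens, so reasoning over tokens
# gives the model no direct access to its 10 letters.
print(toy_tokenize("understand"))   # ['under', 'stand']

# The workaround: expand to spaced characters first, so each letter
# stands alone, then count.
def count_letters_via_spacing(word):
    spaced = " ".join(word)          # "u n d e r s t a n d"
    return len(spaced.split())

print(count_letters_via_spacing("understand"))  # 10
```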

daveguy | 2 years ago

Not to mention that the third one isn't any sort of argument, just a statement of the opposite belief.