top | item 44400553

fasthands9 | 8 months ago

I think this is mostly right, but also I'm not sure I agree completely with the premise. Humans have years of conversations they've heard before they attempt to read or write. They already have a concept of what a 'dog' is before they see the word, and know what it is likely to do. Not the same with something that only sees text.

tolerance | 8 months ago

I agree with you 100%, and I'm not sure it contradicts my point that humans have a natural advantage over LLMs in the way I tried to illustrate.

My initial comment was going to make an abstract reference to how human beings are pretty much wired for reasoning from the time they're being breastfed, or at least reared in their mother's care. It has something to do with the impression I've picked up that the inheritance of a language, and subsequently literacy, starts with your mom, at least in the ideal case.

I don't know if this is a strike against humans in the whole argument about efficiency, but I don't think it is.

Computers don't have Moms. Go Moms.

techpineapple | 8 months ago

Yeah, one thing I’ve wondered about (and maybe they already do this) is finding ways to cross-encode different kinds of data: words, yes, but auditory and visual data too. The algorithms to do this might be complicated (or incomprehensible), but a lot of creativity surely comes from the interrelationship between the senses. Combine that with emotion as well, and I imagine it partly comes down to this: our writing ability isn’t limited to the collection of what we’ve read.
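The cross-encoding idea above does exist in practice (CLIP-style joint embeddings are one example). Here is a toy sketch of the core mechanism, with hand-picked weights purely for illustration; real systems learn the projections with contrastive training:

```python
import math

def project(features, weights):
    """Linearly project a per-modality feature vector into the shared space."""
    return [sum(w * f for w, f in zip(row, features)) for row in weights]

def cosine(a, b):
    """Cosine similarity between two vectors in the shared space."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

# Hypothetical per-modality encoders (weights chosen by hand, not learned).
TEXT_W = [[1.0, 0.0], [0.0, 1.0]]
IMAGE_W = [[0.9, 0.1], [0.1, 0.9]]

text_dog = project([1.0, 0.0], TEXT_W)    # embedding for the word "dog"
image_dog = project([1.0, 0.0], IMAGE_W)  # embedding for a dog photo
image_car = project([0.0, 1.0], IMAGE_W)  # embedding for a car photo

# After alignment, the word "dog" sits closer to the dog image than to the car image.
print(cosine(text_dog, image_dog) > cosine(text_dog, image_car))  # prints True
```

Once different modalities share one space like this, similarity comparisons across senses (word vs. image, word vs. sound) become ordinary vector operations, which is roughly what "cross-encoding" buys you.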

Then maybe the other thing is that rules and relationships must be encoded in a special way. In LLMs I assume rules are emergent, but maybe we have a specific rules engine that gets trained based on the emotional salience of what we read and hear.

Maybe another reason is what’s encoded in our DNA; I imagine our brain structure is fundamentally designed for some of this stuff.