I think this is mostly right, but I'm not sure I agree completely with the premise. Humans have years of conversations they've heard before they attempt to read or write. They already have a concept of what a 'dog' is before they see the word, and know what it is likely to do. Not so with something that only ever sees text.
tolerance|8 months ago
My initial comment was going to make an abstract reference to how human beings are pretty much wired for reasoning from the time they're being breastfed, or at least reared in the clutch of their mother. It has something to do with the impression I've picked up that the inheritance of a language, and subsequently literacy, starts with your mom—in ideal cases.
I don't know whether this counts as a strike against humans in the whole argument about efficiency. But I don't think it does.
Computers don't have Moms. Go Moms.
techpineapple|8 months ago
Then maybe the other thing is that rules and relationships must be encoded in a special way. In LLMs I assume rules are emergent, but maybe we have a specific rules engine that gets trained based on the emotional salience of what we read and hear.
Maybe another reason is what's encoded in our DNA, which might mean our brain structure is fundamentally designed for some of this stuff.