mzhaase | 5 days ago
LLMs miss very important concepts, like the concept of a fact. There is no "true" for them, just consensus text on the internet given a certain context. Like that recent study where LLMs gave wrong info when the biography of a poor person was in the context.
steve1977 | 5 days ago
And of course they also miss things like embodiment, mirror neurons etc.
If an LLM makes a mistake, it will tell you it is sorry. But does it really feel sorry?
red75prime | 5 days ago
And what does it mean to feel sorry? Beyond the fallible and imprecise human introspective notion of "sorry", that is. A definition that can span species and computing substrates. A deanthropomorphized definition of "sorry", so to speak.