top | item 46489985

andreyandrade | 1 month ago

  I see current AIs as tools—a sophisticated lathe, not a thinking partner. The question isn't whether it "knows" anything.

  The interesting question is: why does AI with correct information in its weights still give wrong answers? That's an engineering problem, not a metaphysics problem.

  But here's what bothers me about the "AI doesn't truly know" argument: do we? When a senior dev answers "use Kubernetes" without asking about team size or user count, are they "comprehending" or pattern-matching on what sounds authoritative? The AI failure I described is identical to what I see in human experts daily.

  Maybe the flaw isn't unique to AI. Maybe it's a mirror.

allears | 1 month ago

Why not both? It's certainly true that human 'experts' often rely on pattern-matching without fully understanding a problem. But AI has no understanding at all, so pattern matching is its only skill, whereas the human capacity for understanding isn't just greater than AI's, it's fundamentally different. In what ways? That seems to be the multi-trillion-dollar question.