Because it doesn't understand or have intelligence. It just knows correlations, which unfortunately makes it very good at fooling people. If there is anything else in there, it's because it was explicitly programmed in, like 1960s AI.
I disagree. AI in the 1960s relied on expert systems where each fact and rule was hand-coded by humans. As far as I know, LLMs learn on their own from vast bodies of text. There is some level of supervision, but it is not 1960s AI. That is also why we get hallucinations.
terminalcommand|1 year ago
Expert systems are more accurate as they rely on first-order logic.
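For anyone who hasn't seen one: the core of a classic expert system is just hand-coded facts and rules plus exhaustive inference, so every conclusion is traceable to an explicit rule (which is where the accuracy claim comes from). Here's a toy sketch in Python, propositional for brevity (real systems added variables, unification, and things like certainty factors); all the facts and rule names are made up:

```python
# Toy 1960s-style expert system: hand-coded facts and rules,
# forward chaining until no new conclusions can be derived.
# (Propositional simplification; facts/rules are illustrative.)
facts = {"bird(tweety)", "penguin(opus)"}
rules = [
    # (premises, conclusion): if all premises hold, assert the conclusion
    ({"bird(tweety)"}, "can_fly(tweety)"),
    ({"penguin(opus)"}, "bird(opus)"),
]

def forward_chain(facts, rules):
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

print(sorted(forward_chain(facts, rules)))
```

The contrast with an LLM is that nothing here is learned from data: if a fact or rule isn't typed in, the system simply can't conclude it, which makes it precise but brittle.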