cmrdporcupine | 3 days ago
But frankly I feel like the founders of Anthropic and others are victims of the same hallucination.
LLMs are amazing tools. They play back & generate what we prompt them to play back, and more.
Anybody who mistakes this for SkyNet -- an independent consciousness with instant, permanent learning, adaptation, and self-awareness -- is just huffing the fumes, and just as delusional as Lemoine was 4 years ago.
Every one of us should spend some time writing an agentic tool and managing context and the agentic conversation loop. These things are still primitive as hell. I still have to "compact my context" every N tokens, and "thinking" is just repeating the same conversational chain over and over and jamming words in.
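For anyone who hasn't written one: the loop and the compaction step really are this crude. A minimal sketch, with all names (count_tokens, compact, agent_loop) and the token budget purely illustrative stand-ins, not any real framework's API:

```python
# Illustrative agentic conversation loop with context compaction.
# Everything here is a toy: real tokenizers, models, and agent
# frameworks are more involved, but the shape is the same.

def count_tokens(messages):
    # Crude stand-in for a tokenizer: count whitespace-separated words.
    return sum(len(m["content"].split()) for m in messages)

def compact(messages, keep_last=2):
    # "Compacting context": replace older turns with a one-line
    # summary and keep only the most recent tail verbatim.
    summary = f"[summary of {len(messages) - keep_last} earlier turns]"
    return [{"role": "system", "content": summary}] + messages[-keep_last:]

def agent_loop(user_turns, token_budget=20):
    messages = []
    for turn in user_turns:
        messages.append({"role": "user", "content": turn})
        # Compact whenever the accumulated context exceeds the budget.
        if count_tokens(messages) > token_budget:
            messages = compact(messages)
        # A real loop would send `messages` to a model here and append
        # its reply; we just echo to keep the sketch self-contained.
        messages.append({"role": "assistant", "content": f"ack: {turn}"})
    return messages
```

The point of the sketch: nothing persists or self-modifies. The "memory" is a lossy summary string you jam back into the prompt, every turn replays the whole transcript, and the budget forces you to throw history away.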
Turns out this is useful stuff. In some domains.
It ain't SkyNet.
I don't know if Anthropic is truly high on their own supply, or just taking us all for fools so they can pilfer investor money and push regulatory capture.
There's also a bad trait among engineers, deeply reinforced by survivorship bias, of assuming that every technological trend follows Moore's law and exponential growth. But that applied to transistors, not everything.
I see no evidence that LLMs + exponential growth in parameters + context windows = SkyNet or any other kind of independent consciousness.
austinjp | 3 days ago
Every step on the journey towards SkyNet is worse than the preceding step. Let's not split hairs about which step we're on: it's getting worse, and we should stop that.
cmrdporcupine | 3 days ago
They'll ink deals with all sorts of nefarious parties and be involved in all sorts of dubious things while trumpeting their fake non-profit status and wringing their hands about imminent AGI and "alignment" of the created AIs.
The concern I have is not the alignment of the AIs. They're not capable of having one, no matter what role playing window dressing they put on it.
It's the alignment of Anthropic and the people who use their tools that is a concern. So far it seems f*cked.