bandrami|1 month ago
Shouldn't it have some kind of proof-of-AI captcha? Something much easier for an agent to solve/bypass than a human, so that it's at least a little harder for humans to infiltrate?

valinator|1 month ago
The idea of a reverse Turing Test ("prove to me you are a machine") has been rattling around for a while, but AFAIK nobody's really come up with a good one.

wat10000|1 month ago
Seems fundamentally impossible. From the other end of the connection, a machine acting on its own is indistinguishable from a machine acting on behalf of a person who can take over after it passes the challenge.

antod|1 month ago
The captcha would have to be something really boring and repetitive, like: on every click, translate a word from one of ten languages into English, then make a bullet list of what it means.

schoen|29 days ago
https://www.smbc-comics.com/comic/2013-06-05
https://www.smbc-comics.com/comic/captcha
which may be either funnier or scarier in light of the actual existence of Moltbook.

xnorswap|1 month ago
We don't have the infrastructure for it, but models could digitally sign all generated messages with a key assigned to the model that generated the message.
That would prove the message came directly from the LLM output.
That, at least, would be more difficult to game than a captcha, which could be MITM'd.

notpushkin|1 month ago
It doesn't really matter, though: you can ask a model to rewrite your text in its own words.

sowbug|1 month ago
That seems like a very hard problem. If you can generally prove that the outputs of a system (such as a bot) are not determined by unknown inputs to the system (such as a human), then you yourself must have a level of access to the system corresponding to root, hypervisor, debugger, etc.
So either Moltbook requires that AI agents upload themselves to it to be executed in a sandbox, or else we have a test that can be repurposed to answer whether God exists.
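The sign-each-generated-message idea above can be sketched in a few lines. This is a minimal illustration, not anyone's actual infrastructure: the model name and key are made up, and a stdlib HMAC (symmetric) stands in for the asymmetric signature (e.g. Ed25519) a real provider would use so that anyone could verify with a published public key.

```python
import hmac
import hashlib

# Hypothetical per-model signing key, held by the inference provider.
# In a real deployment this would be an asymmetric keypair with only the
# public half published; HMAC is a stdlib-only stand-in for the sketch.
MODEL_KEYS = {"example-model-v1": b"secret-key-held-by-provider"}

def sign_message(model_id: str, message: str) -> str:
    """Produce a tag binding the exact message bytes to the model's key."""
    key = MODEL_KEYS[model_id]
    return hmac.new(key, message.encode(), hashlib.sha256).hexdigest()

def verify_message(model_id: str, message: str, signature: str) -> bool:
    """Check the tag; verification fails if the text was edited at all."""
    expected = sign_message(model_id, message)
    return hmac.compare_digest(expected, signature)

msg = "Hello from the model."
sig = sign_message("example-model-v1", msg)
assert verify_message("example-model-v1", msg, sig)
assert not verify_message("example-model-v1", msg + " (edited)", sig)
```

Note that this only proves the exact bytes came out of the model's output, which is the objection raised in the thread: a human can still post a model's signed text verbatim, or have a model rewrite their own words before signing.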