
jdonaldson | 6 months ago

Humans make mistakes too. Case in point: the hallucination engine didn't tell the person to ingest bromide. It only mentioned that it had chemical similarities to salt. The human mistakenly adopted a bit of information that furthered his narrative. The humans touting it and bigging it up are still the problem.


wzdd | 6 months ago

Could you provide a source for your statements? The article says that they don't have access to the chat logs, and the quotes from the patient don't support the claim that ChatGPT never told him to ingest bromide.

thunderfork | 6 months ago

We don't have the log from this case, so we don't know what chatgippity said, whether it was "chemical similarities" or "you should consume bromide... now!"