exmadscientist | 1 month ago
Fuzzing the details because that's not the conversation I want to have: I asked if I could dose drug A1, which I'd just been prescribed in a somewhat inconvenient form, like the closely related drug A2. It screamed at me that A1 could never have that done, that it would be horrible, that I had to go to a compounding pharmacy and pay tons of money, and blah blah blah. Eventually, after thoroughly interrogating the AI, what turned up is that A2 requires more complicated dosing than A1, so you have to do it for A2, but A1 doesn't need it, so nobody does. Even though it's fine to do if for some reason it would have worked better for you. But the bot thought it would kill me, no matter what I said to it, and it wasn't even paying attention to its own statements. (It wouldn't have, by the way; nothing here is life-critical at all.) A frustrating interaction.