teeth-gnasher|1 year ago
Sure, but I wouldn’t expect deepseek to either. And if any model did, I’d damn sure not bet my life on it not hallucinating. Either way, that’s not heresy.
riskable|1 year ago
> I’d damn sure not bet my life on it not hallucinating.

One would think that if you asked it to help you make drugs, you'd want hallucination as an outcome.
lukan|1 year ago
Very funny. But no. Only a very, very small percentage of drug users want hallucinations.
Hallucinations usually happen when something has gone bad.
(So a hallucinating LLM giving drug advice might well result in real hallucinations for the user, but also permanent kidney damage.)