top | item 45686830

uncletaco | 4 months ago

You’re asking ChatGPT for advice to stop drone attacks? Does that mean people die if it hallucinates a wrong answer and that isn’t caught?

jamesmishra | 4 months ago

No, I don't need ChatGPT's help for the basics of air defense.

Military technologies are validated before they are deployed. Nobody can die from a hallucination.

But if I want to understand, say, how a particular Russian drone works, ChatGPT can help me piece together information from English, Russian, and Ukrainian-language sources.

Sometimes, though, ChatGPT's safety filter assumes I want to use the Russian drone rather than stop it, and it refuses to help.

withinboredom | 4 months ago

This happens in real life too. I’ll never forget an LT walking in and asking a random question (relevant, but one he shouldn’t have been asking of on-duty people) and causing all kinds of shit to go sideways. An AI is probably better than any lieutenant.