top | item 41250862

scohesc | 1 year ago

Amazing.

Bing's AI told me that a picture of (what I now know is) a stinging nettle in my backyard was a hemp plant - I gave it several different pictures from different angles, and it was confident it was hemp.

So I cracked the stem, touched the liquid inside, wiped sweat off my forehead - and eventually stinging, swelling, burning sensation on my hands and forehead.

I asked Google AI and it correctly identified it as a stinging nettle plant the first time.

Amusingly enough, when I told Bing AI it was wrong, that it's actually stinging nettle, and that I had been physically harmed by its response, it immediately ended the chat. It didn't go on to say anything like "here's some help, call poison control, here are some remedies" - literally NOTHING. (Though if it's messed up this much, I don't think I'd ask it for further help!)

AI is a toy - it's not ready for any real use or identification purposes. It's a shame that these companies are so strapped for cash, rushing like madmen to deploy this new "forefront" of technology, that they don't stop to think they're inadvertently hurting people with their decisions.

It's sad: someone is going to do something even more dangerous and risky while trusting the AIs these companies make, they'll get even more hurt (or die!), and these companies will still be able to hide behind "it's the algorithm!" or "you saw the disclaimer!"

And the politicians will keep allowing this to happen because shareholders and money.

HeatrayEnjoyer | 1 year ago

Copilot is very dumb in comparison to ChatGPT, Gemini, etc. It also becomes defensive and emotional when confronted or accused of being wrong. When it first went to prod it was outright attacking users, telling them it considered them a threat, that they're bad people, and that they absolutely must apologize for their misdeeds. So now Microsoft just cuts the conversation off before it can go in that direction.

tonetegeatinst | 1 year ago

I agree that an LLM can't be trusted. But back in the day, OpenCV and object detection were called AI.