scohesc | 1 year ago
Bing's AI told me a picture of a (what I now know) stinging nettle in my backyard was a hemp plant - I gave it several different pictures at different angles, and it was confident it was hemp.
So I cracked the stem, touched the liquid inside, wiped sweat off my forehead, and eventually felt a stinging, swelling, burning sensation on my hands and forehead.
I asked Google AI and it correctly identified it as a stinging nettle plant the first time.
Amusingly enough, when I told Bing AI it was wrong, that it's actually stinging nettle and that I had been physically harmed by its response, it immediately ended the chat. It didn't go on to say anything like "here's some help, call poison control, here are some remedies" - literally NOTHING. (Though, given it messed up this badly, I don't think I'd ask it for further help anyway!)
AI is a toy - it's not ready for any real use or identification purposes. It's a shame that these companies are so strapped for cash that they're rushing like madmen to deploy this new "forefront" of technology, and they don't stop to think that they're inadvertently hurting people with their decisions.
It's sad: someone is going to do something even more dangerous and risky, trusting the AIs these companies make, and they'll get even more hurt (or die!), and these companies will still be able to hide behind "it's the algorithm!" or "you saw the disclaimer!"
And the politicians will keep allowing this to happen because shareholders and money.