ryan_lane | 1 month ago
Sure, the AI isn't directly doing the scamming, but it's supercharging the ability to do so. You're making a "guns don't kill people, people do" argument here.
seizethecheese | 1 month ago
only-one1701 | 1 month ago
ryan_lane | 1 month ago
Obviously the intended use and design of AI isn't to scam the elderly, but it's extremely efficient at doing it, and has no guard rails to help prevent it.
Why is anyone allowed to make a digital copy of me, without my permission, and then use it to call my relatives? It should be illegal to use such a copy, and illegal to even generate it. Yes, it's already illegal to defraud people, but that's simply not enough at this point. The AI companies producing these models should be held liable for this form of fraud, since they're not providing any form of protection against it.
You're exactly the person that this article is satirizing.
wk_end | 1 month ago
NicuCalcea | 1 month ago
burnto | 1 month ago
criley2 | 1 month ago
Phones are also a very popular mechanism for scamming businesses. It's tough to pull off CEO scams without text and calls.
Therefore, phones are bad?
And that's before we even talk about what criminals do with money, which makes money truly evil.
only-one1701 | 1 month ago
Without Generative AI, we couldn’t…?
JumpCrisscross | 1 month ago
Phones are utilities. AI companies are not.