Exactly this. These systems are supposed to have been built by some of the smartest scientific and engineering minds on the planet, yet they somehow failed (or chose not) to think about second-order effects and what steady-state outcomes their systems will have. That's engineering 101 right there.
This phrase almost always seems to be invoked to attribute purpose (and more specifically, intent and blame) to something based on its outcomes, when it should instead be taken as a cue to stop thinking in those terms in the first place.
This is satire. Its purpose is to use exaggeration to provide comedy while also drawing attention to issues.
Obviously the intended use and design of AI isn't to scam the elderly, but it's extremely efficient at doing it, and has no guard rails to help prevent it.
Why is anyone allowed to make a digital copy of me, without my permission, and then use that to call my relatives? It should be illegal to use it and it should be illegal to even generate it. Sure, it's already illegal to defraud people, but that's simply not enough at this point. The AI companies producing these models should be held liable for this form of fraud, as they're not providing any form of protection.
You're exactly the person that this article is satirizing.
No one - neither the author of the article nor anyone reading - believes that Sam Altman sat down at his desk one fine day in 2015 and said to himself, “Boy, it sure would be nice if there were a better way to scam the elderly…”
And no one believes that Sam Altman thinks of much more than adding to his own wealth and power. His first idea was a failing location data-harvesting app that got bought. Others have included biometric data-harvesting with a crypto spin, and this. If there's a throughline beyond manipulative scamming, I don't see it.
There are legitimate applications - fixing a tiny mistake in the dialogue in a movie in the edit suite, for instance.
Do these legitimate applications justify making these tools available to every scammer, domestic abuser, child porn consumer, and sundry other categories of criminal? Almost certainly not.
Fair, but it’s an exaggerated statement that’s supposed to clue us into the tone of the piece with a chuckle. Maybe even a snicker or giggle! It’s not worth dissecting for accuracy.
only-one1701|1 month ago
the_snooze|1 month ago
rcxdude|1 month ago
irjustin|1 month ago
This is the knife-food vs knife-stab vs gun argument. Just because you can cook with a hammer doesn't make cooking its purpose.
xigoi|1 month ago
ryan_lane|1 month ago
wk_end|1 month ago
username223|1 month ago
NicuCalcea|1 month ago
rgmerk|1 month ago
burnto|1 month ago