porphyra|1 month ago
I also did scroll through the public grok feed and the AI generated bikini pics were mostly Onlyfans creators requesting their own fans to generate these pictures (or sometimes generating them themselves).
cptaj|1 month ago
You know this, but somehow you're rationalizing this game-changing fact away.
Yes, people can draw and photoshop things. But it takes time, skill, dedication, etc. This time cost is load-bearing in how society deals with the tools it has, for the same reason that, at the extreme, kitchen knives are regulated differently than nuclear weapons.
It is also trivially easy for Grok to censor this usage for the vast majority of offenders by using the same LLM technology they already have to classify content created by their own tools. Yes, it could get jailbroken, but that requires skill, time, and dedication, and it can be rapidly patched, greatly mitigating the scale of abuse.
Y-bar|1 month ago
The scale of effect and barrier to entry. Both are orders of magnitude easier and faster. It would take hours of patience and work to create even one convincing fake using Photoshop, once you had spent the time and money to acquire the tool and learn it. This creates a natural, large moat around the creation process. With Grok it takes a minute at most, with no effort or energy needed.
And then there is the ease of distribution to a wide audience: X/Grok handles that for you by automatically giving you an audience of millions.
It’s like with guns. Why bother preventing the sale of weapons to violent offenders when they could just build their own guns from high-quality steel, a precision drill, and a good CNC machine? Because scale and barrier to entry are real blockers that let a problem mostly solve itself. And sometimes a 99% solution is better than no solution.
Permit|1 month ago
It's not obvious to me that this is your position. What safeguards do you propose as an alternative to those discussed in the article?
porphyra|1 month ago
But I'm not sure if the tool itself should be banned, as some people seem to be suggesting. There are content creators on the platform that do use NSFW image generation capabilities in a consensual and legitimate fashion.
kranke155|1 month ago
But for NSFW work it dominates. It’s clearly deliberate.
ImPostingOnHN|1 month ago
When you use a service like Grok now, the service is the one using the tool (the Grok model) to generate the content, and thus the service is producing CSAM. This would also apply if you paid someone to use Photoshop to produce CSAM: they would be breaking the law in doing so.
This is setting aside the issue of Twitter actually distributing the CSAM.
ekjhgkejhgk|1 month ago
If an individual invented a tool that could generate such pictures, he'd be arrested immediately. When a company does it, it's just a whoopsie. And most people don't find this strange.
Waterluvian|1 month ago
I think this is an important question to ask despite the subject matter because the subject matter makes it easy for authorities to scream, "think of the children you degenerate!" while they take away your freedoms.
I think Musk is happy to pander to and profit from degeneracy, especially by screaming, "it's freedom of speech!" I would bet the money in my pocket that he allows it because he knows this stuff makes him more money than censoring it would. But he will of course pretend it's about 1A freedoms.
array_key_first|1 month ago
I would say lots of ways. And that's probably why I have a few knives, and zero atomic bombs.
pphysch|1 month ago
This could be easily fixed by making the generated images sent through private Grok DMs or something, but that would harm the bottom line. Maybe they will do that eventually once they have milked enough subscriptions from the "advertising".
caconym_|1 month ago
We are going to be in some serious fucking trouble if we can't tackle the issues of scale implied by modern information technology without resorting to disingenuous (or simply naive) appeals to absurd equivalences as justification for each new insane escalation.
kllrnohj|1 month ago
Last I checked, Photoshop doesn't have an "undress this person" button. "A person could do a bad thing at a very low rate, so what's wrong with automating it so that bad things can be done millions of times faster?" Seriously? Is that a real question?
But also, I don't get what your argument is, anyway. A person doing it manually still typically runs into CSAM or revenge porn laws or other similar harassment issues. All of which should be leveled directly at these AI tools, particularly those that lack even an attempt at safeguards.
glemion43|1 month ago
It could easily be solved by basic age verification.
The CSAM stuff, though, needs to be filtered out and fixed, as it breaks laws and I'm not aware of anything that would make it legal, luckily.