item 46578374

sixQuarks | 1 month ago

Users showed that Gemini and OpenAI also undress people; it's not just Grok.

hn_throwaway_99 | 1 month ago

Can you provide a link with evidence of that? I haven't seen that reported.

I'd also note in advance that there is a big difference between someone figuring out how to jailbreak Gemini or OpenAI, with the companies then responding swiftly to fix it, and what has been reported with Grok, where it was basically wide open to create those images.

Amezarak | 1 month ago

Grok has never been "wide open" to undress people. Anyone reporting that is being extremely duplicitous. Any image-to-image on Grok has stringent NSFW filters for exactly this scenario. People have worked out jailbreaks and those get dealt with.

wasabi991011 | 1 month ago

Gemini and ChatGPT conversations are private, not public. A big part of the controversy over Grok is that it's happening in public on Twitter, often as direct replies to the user whose picture is being manipulated.

ctoth | 1 month ago

This is such a weird issue for me, who is blind. Did Grok undress people, or did Grok show extrapolated images of what people might look like undressed? The "undressed people" framing makes it sound like people physically had their clothes removed. Obviously this did not happen.

But, like.

If I have like ... a mole somewhere under my clothes, Grok cannot know about that right? People will know what they themselves look like naked?

Someone looking at Grok's output learns literally nothing about what the actual person looks like naked, right?

Kinda sounds like somebody should just make something that creates this for every picture ever? Then everybody has a defense -- "fake nudes!" and the pictures are meaningless?

Amezarak | 1 month ago

This is the sort of thing that is technically correct, but misses the emotional aspect that people want to be able to control their public perception. Of course people could (and did) do this with older tools or by hand. It doesn't matter to them. And since Elon/X are the villain du jour, it's a good lever to punish them.

wasabi991011 | 1 month ago

> This is such a weird issue for me, who is blind.

I'm not sure what your mental model is for someone's visual likeness.

I'd propose a blind-inclusive analogy for what is happening on Twitter: anyone can create a realistic sexdoll with the same face and body proportions as any user online.

Doesn't that feel gross, even if the sexdoll's genitalia wouldn't match the real person's?

bryanrasmussen | 1 month ago

>If I have like ... a mole somewhere under my clothes, Grok cannot know about that right?

Unless some ex spoke about that gross mole you had on Twitter, or it showed up in some data that was scraped somewhere, no.

Not sure what the actual odds are of it knowing if you have a mole or not.

emilfihlman | 1 month ago

Yeah, the reason Google, OpenAI, etc. are silent is that their services do the same, but they "aren't the bad guys," so if they keep quiet the crisis will pass.

This of course implies that the crisis itself and persecution of Musk/Grok is politically motivated, or just based on stupidity.

mirabilis | 1 month ago

The same capabilities might be present in many available models, but I do think the public, social aspect of the usage is quite different: people can't come into my Google account and save nudified versions of my family photos directly to my Google Drive, but X generated a lot of attention because users are directly replying to or quoting other users and @ing them with the modified photos.