jmcgough | 25 days ago

Having an issue with users uploading CSAM (a problem for every platform) is very different from giving them a tool to quickly and easily generate CSAM, with apparently little-to-no effort to prevent this from happening.

timmg | 25 days ago

If the tool generates it automatically or spuriously, then yes. But if it is the users asking it to, then I'm not sure there is a big difference.

dragonwriter | 25 days ago

Well, it's worth noting that with the nonconsensual porn, child and otherwise, that it was generating, X would often rapidly punish the user who posted the prompt but leave the Grok-generated content up. It wasn't an issue of not having control; it was an issue of how the control was used.

array_key_first | 21 days ago

The most obvious difference is the sheer volume of CSAM, which does matter. It might matter most.