Having an issue with users uploading CSAM (a problem for every platform) is very different from giving them a tool to quickly and easily generate CSAM, with apparently little-to-no effort to prevent this from happening.
Well, it's worth noting that with the nonconsensual porn, child and otherwise, that it was generating: X would often rapidly punish the user who posted the prompt, but leave the Grok-generated content up. It wasn't an issue of not having control; it was an issue of how that control was used.