
zajio1am | 1 month ago

When these models are fine-tuned to allow any kind of nudity, I would guess they can also be used to generate nude images of children. There is a level of generalization in these models. So it seems to me that arguing for restrictions that could only be effectively implemented via prompt validation is just indirect argumentation against open-weight models.


chrisjj | 1 month ago

> When these models are fine tuned to allow any kind of nudity

If you're suggesting Grok is fine-tuned to allow any kind of nudity, some evidence would be in order.

The article suggests otherwise: "The service prohibits pornography involving real people’s likenesses and sexual content involving minors, which is illegal to create or distribute."