
episteme | 1 month ago

I think I hold a similar view to you and have the same question, so maybe what I’ve been thinking about will be useful to you. Everyone is upset about CSAM, but when you talk to them about it, the conversation is only ever about deepfakes.

I don’t think we can avoid a world where people can generate CSAM easily, so we have to separate the discussion into two questions: people being able to do it privately, and Grok being able to do it.

It makes sense to me that we don’t want widely used websites to contain images of CSAM that you can’t easily avoid; it’s simply repulsive to almost everyone, and that’s almost certainly a human instinct. I don’t think it needs to be much more complicated than that.

In terms of generating CSAM privately, or even sharing it with other people, I think this is a much more interesting discussion. At this point it is an open question whether it is harmful. Could this replace the abuse that happens in order to create some of the real content? Does the escalation argument hold water: will people be more likely to sexually assault children because they have access to this material? I don’t think we know enough about pedophilia to answer these questions, but given that I don’t see a way to stop people from generating this content in 2026, we really need to answer them before we decide to simply incarcerate everyone doing it.

