I agree with you, but in some jurisdictions material generated with AI and actual photographs of child abuse are treated much the same; either way, possessing either could result in what England & Wales calls a "sexual harm prevention order" (SHPO). To me, the idea that someone could be served such an order without ever possessing real CSEM (or "child porn"), let alone ever having been near a child, is rather worrying.