> How about you come back when your daughter has a fake AI nude passed around school.
Like any bad behaviour, the grown-up response should be discipline and education.
There are a million ways kids can misbehave. The idea is to get kids ready for the real world, not to pretend there's nothing bad out there.
Obviously we don't want "point and click" AI nudes in the hands of minors, or kids having their own AI accounts in the first place. Parents and educators pay for their kids' devices and internet connections. If the kids aren't being responsible, you take away the privilege until they learn about respectful behaviour.
If a kid is allowed to stay out after dark but ends up committing crimes at those times, we don't ask the government to impose a curfew on every kid. We discipline the kids involved. And that's my last comment in this thread, thank God, what a struggle.
Not the same - the barrier to entry was too high. Most people don't have the skills to edit photos in Photoshop. Grok enabled this at scale for users who are complete non-techies: anyone who could type a half-coherent sentence in English could generate and disseminate these images.
It's not hypothetical. In fact, the girl who was being targeted was expelled, not the boys who did it [1].
Those boys absolutely should be held accountable. But I also don't think that Grok should be able to quickly and easily generate fake revenge porn for minors.
Punishing kids after the fact does not stop the damage from occurring. Nothing can undo damage that has already been done, but if you stop the source of the nudes, you prevent the same damage from happening to even more girls.
They may well get in trouble, but that takes time; in the meantime the photos will have been seen by most kids in the school, and the target might get a year of bullying.
Education might be so disrupted you have to change schools.
But they are getting in trouble. However, for every one that gets in trouble, there are more who don't get discovered, or who don't get in trouble for it.
Besides, getting in trouble for something is already after the fact, the damage has been done. If it can't be done in the first place, or the barrier is too high for most, then the damage would have been prevented.
Children do dumb things and make mistakes all the time; teenagers push the boundaries as far as they can (and they have a role model in the White House now).
We fault and "fine" companies for providing products that harm society all the time.
Are you not going to consider the company providing a CSAM machine to be the major one at fault here?
I really find this kind of appeal quite odious. God forbid we expect fathers to have empathy for their sons, sisters, brothers, spouses, mothers, fathers, uncles, aunts, etc., or dare we hope that they might have empathy for friends or even strangers? It's an appeal to hypocrisy, or something like it. Sure, I know such people exist, but it throws so many people under the bus just to (probably fail to) convince someone of something by appealing to fathers' emotional overprotectiveness of their daughters.
You should want to protect all of the people in your life from such a thing or nobody.
dang|25 days ago
https://news.ycombinator.com/newsguidelines.html
exodust|25 days ago
ljsprague|25 days ago
[deleted]
wtcactus|25 days ago
[deleted]
sam-cop-vimes|25 days ago
Edit: clarified the last sentence
joe_mamba|25 days ago
Have we outsourced all accountability for the crimes of humans to AI now?
ImPleadThe5th|25 days ago
[1] https://www.nbcnewyork.com/news/national-international/girl-...
anonymous908213|25 days ago
stuaxo|25 days ago
Cthulhu_|25 days ago
But this is a recurring dilemma.
verdverm|25 days ago
BigTTYGothGF|25 days ago
"Boys will be boys", and so on. (https://en.wikipedia.org/wiki/Rape_culture)
saubeidl|25 days ago
The crime is creating a system that lets schoolboys create fake nudes of other minors.
You don't just get to build a CSAM generator and then be like "well, I never intended for it to be used that way...".
The humans running a company are liable for the product that their company builds, easy as that.
BlackFly|25 days ago