wegs|4 years ago
- Apple has over a billion devices out there.
- Child abuse is a rare problem, but with over a billion devices, there will be enough of it for a lot of newsworthy stories.
- Child pornography takes just one abused child for an arbitrary number of viewers. Arguably, by the time you're limiting the number of viewers, most of the harm has been done.
On the whole, I'm not quite sure how the Apple plan will protect actual children from rape (except to somewhat reduce the secondary harm of distribution). I can clearly see how it will protect Apple from bad press, though -- people won't use iPhones to record that.
On the other hand, an investment in education, health care, reporting, and enforcement could significantly reduce the amount of child abuse, but with 7 billion people in the world, no expense would bring it to zero. So long as it's not zero, the potential for bad press is there. Indeed, usually if something happens a few times per year, it receives more bad press than if it happens a few times per day.
Apple has every incentive to (1) be seen as doing something and (2) do things which protect its brand value. Apple has no incentive to invest in education, health care, reporting, and enforcement. Those seem like good things to do, but if anything, if a scandal comes up, those sorts of things are used to say "See, Apple knew, and was trying to buy an out."
As a footnote, if we value all children equally, a lot of this is super-cheap. This is a good movie:
https://en.wikipedia.org/wiki/Born_into_Brothels
And the problem it portrays could probably be solved with the budget of a few Apple engineers' salaries and a focused, targeted effort to identify child prostitutes, address the economic pressures that force those kids into prostitution, and get those kids into schools instead.
I'm guessing the $100k raised from this film will do more to protect kids than this whole Apple initiative will do.
themaninthedark|4 years ago
We would not accept having breathalyzers in every car.
Or to bring it closer to the child abuse problem: Would we accept cameras that take pictures of the occupants of the car to make sure that the minors in the car are not being trafficked?
frickinLasers|4 years ago
lol, that's not up to us. It's in the infrastructure bill.
https://www.mediaite.com/news/infrastructure-bill-could-requ...
HWR_14|4 years ago
Funny you would bring that up. I think the new infrastructure bill requires that for cars built after 2029 (or some other "future, but not that far" date).
throwavocado|4 years ago
You bring up the distinction between "possession offenses" (i.e., a person who has CSAM content) and "hands-on offenses" (i.e., a person who abuses children and possibly, but not necessarily, produces CSAM). Detecting possession offenses (as Apple's system does) has the second-order effect of finding hands-on offenders, because hands-on offenders tend to also collect CSAM and form large libraries of it. So finding a CSAM collection is the best way to find a hands-on offender and stop their abuse. Ideally, victims would always disclose their abuse so that the traditional investigatory process could handle it -- but child sexual abuse is special in that offenders are skilled in manipulating children and families in order to avoid detection.
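For context on the mechanism: Apple's published design hashes photos on-device before iCloud upload and compares them against a database of hashes of known CSAM, flagging an account for human review only after a threshold number of matches. Here's a rough Python sketch of that threshold-matching idea -- the hash function and database here are simplified stand-ins, and the real design uses a perceptual hash (NeuralHash) plus private set intersection rather than a plain set lookup:

    import hashlib

    KNOWN_HASHES: set[str] = set()  # stand-in for the NCMEC-derived hash database
    MATCH_THRESHOLD = 30            # Apple's stated initial threshold for human review

    def image_hash(image_bytes: bytes) -> str:
        # Stand-in only: a cryptographic hash matches byte-identical files,
        # whereas the real design uses a perceptual hash (NeuralHash) that
        # tolerates resizing and re-encoding.
        return hashlib.sha256(image_bytes).hexdigest()

    def should_flag_account(photos: list[bytes]) -> bool:
        # Count how many photos match known hashes; only crossing the
        # threshold triggers review, so isolated false positives don't.
        matches = sum(1 for p in photos if image_hash(p) in KNOWN_HASHES)
        return matches >= MATCH_THRESHOLD

The relevant point for this thread: a system like this only fires on already-known images, which is why its effect is on possession and distribution rather than directly on hands-on abuse.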
I think that the case of USA v. Rosenchein [0] is a good example because it shows the ins and outs of how the company->NCMEC->law enforcement system tends to work and how it leads to hands-on offenders. It's higher profile than most, perhaps because the defendant (a surgeon) seems to have plenty of resources for fighting the conviction on constitutional grounds (as opposed to actually claiming innocence). But the mechanism leading to the prosecution is by no means exceptional.
Caveat: Not a lawyer.
[0] https://www.anylaw.com/case/usa-v-rosenchein/d-new-mexico/11...
wegs|4 years ago
It's not at all hard to find such places. Many children are abused at scale, globally. I think few of those kids are getting filmed or turned into CSAM.
I'm also not at all sold on your claim that hands-on offenders tend to collect CSAM, but we have no way to know.
I am sold on the idea that the best way of reducing actual abuse involves some combination of measures such as:
1) Fighting poverty; a huge amount of exploitation is for simple economic reasons; people need to eat
2) Providing social supports, where kids know what's not okay, and have trusted individuals they can report it to
3) Effective enforcement everywhere (not just rich countries)
4) Places for such kids to escape to, which are safe and decent. Kids won't report if the alternative is worse
... and so on. In other words, building out a basic social safety net for everyone.
browningstreet|4 years ago
We are citizens of our country and we deserve a dignified existence. We are supposed to have rights, and they're being worn away, formally and informally, by our governments and megacorps acting like NGOs.
I'm sympathetic to the overwhelming horrors of drunks, drunk driving, violent actors, child abuse, child porn, economic crimes, etc.
I've done my calculus, and I got my vaccine and I wear my mask in the current circumstances of our pandemic. But a similar calculus fails for what Apple plans to subject a huge portion of our population to, by dint of its market share in mobile and messaging. I personally can't accept the forces at play in this Apple decision, and I'm continually baffled by those who think this is overblown.
ummonk|4 years ago
This is the thing that privacy advocates seem to ignore. Measures taken to reduce child abuse won't reduce the circulation of whatever CSAM does get created.
Some even seem to think, a la the ACLU, that viewing child abuse material is a victimless crime, and only the creators of the CSAM should be punished.