
Is macOS Look Up Destined for CSAM?

126 points | ingve | 4 years ago | eclecticlight.co

133 comments

[+] nbzso|4 years ago|reply
I have been using Little Snitch since version 2. Practically all connections from macOS to the mothership are blocked by default; I turn off the filter only for updates. I generally avoid Apple software (Preview, Photos, etc.). Macs are used only for work; my personal information and browsing stay on Linux with OpenSnitch and a minimalistic install. In the new information landscape, blind trust can be harmful.
[+] LinuxBender|4 years ago|reply
One small suggestion: consider using tcpdump on your router to make sure Little Snitch is really stopping everything. Apple and Microsoft wised up to this some time ago, and some things hook into the network stack after the application firewalls. If you spot something, please report it to the Little Snitch developers.

The amount of data captured by tcpdump can be minimized by capturing only SYN/FIN/RST packets, assuming proto 6 (TCP):

  tcpdump -ni eth0 'tcp[tcpflags] & (tcp-syn|tcp-fin|tcp-rst) != 0'

(the interface name is just an example). And if capturing to a file, one can also limit the total number of packets captured with -c:

  tcpdump -ni eth0 -w flags.pcap -c 1000 'tcp[tcpflags] & (tcp-syn|tcp-fin|tcp-rst) != 0'

This may be useful if you are also capturing proto 17 (UDP).
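To see why that capture filter works: the BPF expression tests the TCP flags byte (offset 13 in the TCP header) against the mask 0x07. A minimal Python sketch of the same test (a toy illustration, not part of tcpdump):

```python
# The BPF expression 'tcp[tcpflags] & (tcp-syn|tcp-fin|tcp-rst) != 0'
# tests the TCP flags byte against the mask FIN|SYN|RST = 0x07.
TCP_FIN, TCP_SYN, TCP_RST = 0x01, 0x02, 0x04
MASK = TCP_FIN | TCP_SYN | TCP_RST  # 0x07

def is_syn_fin_or_rst(tcp_flags_byte: int) -> bool:
    """Mimics the capture filter: keep only packets with SYN, FIN, or RST set."""
    return tcp_flags_byte & MASK != 0

print(is_syn_fin_or_rst(0x02))  # SYN -> True
print(is_syn_fin_or_rst(0x12))  # SYN+ACK -> True (SYN bit still set)
print(is_syn_fin_or_rst(0x10))  # pure ACK -> False (dropped by the filter)
```

Connection setup and teardown packets pass the mask; the bulk data transfer (pure ACK/PSH traffic) is dropped, which is what keeps the capture small.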
[+] user3939382|4 years ago|reply
I couldn't find a way to black-hole CIDR blocks in Little Snitch, which is necessary to completely silence macOS. However, if you do this at the router, you break things like iMessage on phones using the same wifi, and so forth.
[+] KarlKemp|4 years ago|reply
No, this vaguely related technology has nothing to do with whatever you are associating it with. If Apple wants to surreptitiously spy on your porn collection, they will do so, and won't need cover.
[+] nyanpasu64|4 years ago|reply
Do macOS 12.3 and later phone home with details of images and documents you open in Preview?
[+] hedgehog|4 years ago|reply
As far as I know it's purely local search. I'm guessing this is part of the development arc toward AR applications, but in the near term it solves the problem of searching Photos for "birthday party" and hopefully getting something sensible back.
[+] dev_tty01|4 years ago|reply
No. Why would you think that? It goes against everything they have stated and the designs of the software. They are heavily focused on keeping all of that on device.
[+] User23|4 years ago|reply
What's the right number of lives to destroy over false positives from an algorithm? Is it some number other than zero? Why or why not?
[+] lamontcg|4 years ago|reply
I'm more concerned with the fact that framing someone for real CSAM is nearly the perfect crime.

Contract killers have to be up close and personal in meatspace and leave an actual homicide investigation in their wake.

Someone halfway around the world could be contracted to spear-phish you, take over your phone/laptop, then cause it to download actual CSAM.

The first thing you'd know about it is when you were arrested, and then I don't think there's any kind of defense. If the hackers were very sloppy, you might be able to hire a very talented forensics team to find traces of the attack. But beyond that you're pretty screwed, and nobody will ever believe you.

It is a button that someone (without any morals and willing to take a certain degree of risk) can push to ruin your life remotely, leaving virtually no trace, and allowing virtually no defense. Nobody is ever going to believe a pedophile claiming that they got hacked.

And the intersection of this with politics and national intelligence agencies (who certainly have the skill and lack the morals) is probably a bit disconcerting on the larger stage.

[+] amelius|4 years ago|reply
You can ask the same question about self-driving cars.
[+] cmckn|4 years ago|reply
What’s the right number of lives to destroy from CSAM? There’s a middle ground between doing nothing and totalitarianism.

“Destroyed lives” from false positives are at this point hypothetical. Child abuse is not. It’s fair to be concerned about false positives and ensure the system handles such failures appropriately. It’s also fair to directly intervene in the widespread circulation of CSAM.

[+] kayodelycaon|4 years ago|reply
Okay… iOS has had on device image recognition since 2016.
[+] lilyball|4 years ago|reply
Recognizing "this is a cat" and "this is a specific painting of a cat" are different challenges though.
[+] noasaservice|4 years ago|reply
I'm fine with adult rapists of prepubescent children being sentenced to death. If you were an accomplice to that, as a videographer or similar, I'm also OK with death.

(There's a really weird social area from 13-18, with the weirdness and illegality going away at 18. Cases in this realm, especially between two similar ages, get very stupid: this is how two 16-year-olds sexting can be charged with CSAM of their own bodies. I'm avoiding that topic in this post.)

But what does this CSAM scanner do? It only catches already-produced pictures of CSAM. In other words, it catches evidence of the crime. In no other area of criminal law is the evidence itself illegal to possess. And given the strict-liability nature of these images (possession is criminal even if you didn't put them there), I'm not at all comfortable charging people for simple possession.

Even if they have urges toward age-inappropriate pornography and "CSAM", as long as they're not taking any physical action that harms humans, I'd much rather they stay in their own bedroom, alone.

Nor do I buy the gateway theory that consuming CSAM leads to producing it by raping children. This smacks of DARE drug propaganda and its gateway theory (which is complete bullshit).

And we also already have harder situations that have been deemed legal: SCOTUS held that Japanese manga hentai featuring schoolgirls (obviously under 18, sometimes by quite a lot) is completely, 100% legal. Again, SCOTUS focused on the First Amendment and the fact that no children were harmed in its production.

And that leads to what happened just a few days ago. With the (badly done) Zelenskyy deepfake, when can we expect 18-year-old women with very petite bodies being deepfaked into 10-13 year olds? In those cases, we could attest that everyone in the production was of legal age and provided ongoing consent. Will this fall under the same ruling as hentai?

Tl;dr: I'm for decriminalizing simple possession of CSAM. I'm for the death penalty for child rapists and child sexual assault. But this I can see going very, very bad, by easily overscoping "CSAM" to the cause of the day.

[+] CharlesW|4 years ago|reply
> But what does this CSAM scanner do? It only catches already-produced pictures of CSAM.

You say this as if it's bad to identify people who are distributing or collecting known child pornography. Are you recommending that companies implement technologies which go beyond this by not depending on a corpus of existing materials?

[+] KennyBlanken|4 years ago|reply
> I'm fine with child (prepubescent) rapists whom are adults to be sentenced to death. If you were an accomplice to that, as a videographer or similar, I'm also OK with death.

The reliability of the criminal justice system, particularly in the US, is abhorrent. There's a long history of false convictions, particularly affecting people in minority outgroups and the mentally disabled; we've executed adults so mentally incapacitated they were below a 10-year-old in terms of mental capacity. The death penalty is highly immoral.

There are tens of thousands of black men still in jail because they were basically the most convenient way for a police department to "solve" the murder or rape of a white woman and help their case clearance rates. Police, prosecutors, and forensic "experts" were complicit. "Hair analysis" is just one example of the pseudo-science nonsense.

In Boston, a forensic chemist falsified thousands of test results, and somehow this escaped notice despite her having a productivity level far above virtually any other forensic chemist's.

Or, if you're not exceedingly gullible: her supervisors obviously knew what she was doing and didn't care, because she made their lab look great and prosecutors got lots of open-and-shut cases.

[+] selimthegrim|4 years ago|reply
Are you a fan of giving said rapists the incentive to murder too, making prosecuting them that much harder?
[+] symlinkk|4 years ago|reply
Rambling post, hard to follow or understand
[+] WinterMount223|4 years ago|reply
What’s with the obsession with CP? I agree that it’s morally wrong and should be penalized but why is it perceived as the Ultimate Crime? Why is this tool supposedly for detecting CP only and not stolen bikes for sale, bullying through SMS, etc which are also criminal offenses?
[+] notRobot|4 years ago|reply
> why is it perceived as the Ultimate Crime?

Because you can apparently justify any move, no matter how authoritarian, by saying "think of the kids"!

It's politicians and governments exploiting psychology to get away with problematic crap.

It's not the ultimate crime, it's the ultimate justification.

[+] novok|4 years ago|reply
Because it's a good political tool that leverages parental and other human instincts to protect children. It puts most people into a thought-terminating blind panic: you shut down thought, use it as cover for your true intentions, give token enforcement funding to the stated cause, and direct the majority of funding toward politically controlling your enemies. It's as old as politics itself.

It's been known for a while that this is a political technique. It's been one of the four horsemen of the infocalypse [0] since 1988, after all. Or see the "How would you like this wrapped?" comic by John Jonik from 2000 [1]. It's the next round of the crypto wars.

[0] https://en.wikipedia.org/wiki/Four_Horsemen_of_the_Infocalyp...

[1] https://www.reddit.com/r/PropagandaPosters/comments/5re9s1/h...

[+] smeeth|4 years ago|reply
Two thoughts.

1) Good, simple politics. Protecting kids from predators is about as cut and dry an issue as you will ever find. Harry Potter vs Voldemort might be a more complicated moral issue.

2) I suspect that a few very well connected activists in the Bay Area have made it their life's work to get CSAM tools on sites.

Ashton Kutcher and his organization Thorn [0] are probably the best example of this. Thorn is an interesting example because it has been VERY good at making its case in the non-tech media e.g. [1], [2], [3] and in front of congress [4]. It should be said, Thorn makes technology that helps track down child exploitation and has had some great results, which deserve plaudits.

[0] https://en.wikipedia.org/wiki/Thorn_(organization)

[1] https://www.npr.org/sections/goatsandsoda/2019/04/15/7126530...

[2] https://www.nytimes.com/2021/09/02/opinion/sway-kara-swisher...

[3] https://www.washingtonpost.com/news/reliable-source/wp/2017/...

[4] https://www.youtube.com/watch?v=HsgAq72bAoU

[+] KennyBlanken|4 years ago|reply
> What’s with the obsession with CP?

It's been the go-to outrage generator for federal law enforcement and spy agencies to attack strong device and end-to-end encryption by means of legislation that requires backdoors or outlaws encryption that is too strong.

To see why, scroll down to see the guy advocating for the death penalty for people involved in child porn production.

If only law enforcement showed equal vigor for addressing child abuse in religion, whether it's raping altar boys or using the mouth to clean blood off a baby that has been circumcised (often causing syphilis outbreaks in the process.)

It's almost like it's not actually about fighting child abuse, but about being able to snoop in your devices and communications.

[+] oh_sigh|4 years ago|reply
CP is fairly easy to recognize if you see it I'd imagine. I'm sure there are some instances where an adult just looks very young, but there is probably a lot of CP out there with no potential for that.

How exactly does one recognize a bike as stolen from a photograph?

[+] unsupp0rted|4 years ago|reply
Well it is a special category of human depravity. In prison the other prisoners don't go out of their way to beat and shank the bike thieves and cyber bullies, or even the run-of-the-mill murderers.
[+] Mikeb85|4 years ago|reply
> I agree that it’s morally wrong and should be penalized but why is it perceived as the Ultimate Crime?

Because children are trafficked and abused to create it SMH...

[+] buildbuildbuild|4 years ago|reply
How does someone truly test how this feature is being used without possessing illegal content? This is a nearly-impossible area to research. Frightening.

(edit: I'm of course referring to possessing anti-Putin memes) (sarcasm)

[+] perihelions|4 years ago|reply
Obviously, definitionally, it's impossible to verify that server-side logic isn't doing something evil. (Locally running homomorphic protocols count too, when the secret logic is imported from remote servers.)

This is one reason FOSS is actually important and actually relevant. Isn't it valid to want to know exactly what your personal computer is doing, to be able to trust your own possessions? Richard Stallman was *never* crazy; his understanding of these issues is so cynical as to be shrill and off-putting, but that's well-calibrated to the severity of what's at stake.

You joke about anti-Putin memes. Here's a thought for well-calibrated cynics: Apple solemnly swears its hashes are attested by at least two independent countries. Russia and Belarus are two independent countries.

[+] version_five|4 years ago|reply
By "test it," do you mean see if the police show up at your door? If you know how it works, you just need a list of hashes and a way to find a collision, which I believe exists.
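To illustrate why short hashes invite collisions: a toy Python sketch (this is a truncated cryptographic hash standing in for any short hash, not Apple's actual NeuralHash) that brute-forces an unrelated input landing in the same 16-bit hash bucket as a target.

```python
import hashlib

def short_hash(data: bytes, bits: int = 16) -> int:
    """Truncate SHA-256 to `bits` bits -- a stand-in for any short hash."""
    digest = hashlib.sha256(data).digest()
    return int.from_bytes(digest[:4], "big") >> (32 - bits)

target = short_hash(b"innocuous-image-bytes")

# Brute-force a colliding input: with only 2**16 buckets, a collision
# is expected after roughly 65k candidates.
for i in range(2**20):
    candidate = b"unrelated-%d" % i
    if short_hash(candidate) == target:
        print("collision after", i, "tries:", candidate)
        break
```

A real perceptual hash is longer and fuzzier than this toy, but the underlying point stands: the smaller the hash space (and the looser the match threshold), the cheaper it is to manufacture an input that matches a hash on someone's blocklist.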

Otherwise, you're really just highlighting the problem with all closed source software, you don't really have a way to check what it does so you have to trust the vendor.

[+] mike_d|4 years ago|reply
Research has to be done in partnership with the NCMEC, which in turn partners with the Department of Justice to run the database of known CSAM material.
[+] chockchocschoir|4 years ago|reply
You don't have to test it against anti-Putin memes to see if it would work for anti-Putin memes. The algorithm would be something like:

1. Have image

2. Get hash of image

3. Get another hash from another similar image

4. Compare hashes

The images themselves can be of anything, as long as you can see whether the matching works as expected; they don't have to contain anti-Putin memes.
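The steps above can be sketched with a toy perceptual hash. This is a simple "average hash" over a small grayscale grid (real systems like Apple's NeuralHash are far more complex, but the hash/compare structure is the same):

```python
# Toy perceptual "average hash": each pixel becomes bit 1 if it is
# brighter than the image's mean, 0 otherwise. Similar images produce
# hashes with a small Hamming distance.

def average_hash(pixels):
    """pixels: flat list of grayscale values (0-255)."""
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(h1, h2):
    """Number of differing bits between two hashes."""
    return sum(a != b for a, b in zip(h1, h2))

# Two "similar" 4x4 images (same image, uniformly brighter)...
img_a = [10, 20, 200, 210, 15, 25, 205, 215,
         12, 22, 202, 212, 11, 21, 201, 211]
img_b = [p + 5 for p in img_a]
# ...and an unrelated image with the same pixels rearranged.
img_c = [200, 10, 15, 205, 210, 20, 25, 215,
         12, 202, 212, 22, 201, 11, 21, 211]

print(hamming(average_hash(img_a), average_hash(img_b)))  # 0: match
print(hamming(average_hash(img_a), average_hash(img_c)))  # 8 of 16 bits differ: no match
```

A matcher then just thresholds the Hamming distance, so any pair of test images (memes or otherwise) lets you observe whether "similar" inputs collide and dissimilar ones don't.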