I have a newborn at home, and like every other parent, we take thousands of pictures and videos of our newest family member. We took pictures of the very first baby bath, so now I have pictures of a naked baby on my phone. Does that mean that pictures of my newborn will be uploaded to Apple for further analysis, potentially stored indefinitely, and shared with law enforcement?
Lots of people responding to this seem to not understand how perceptual hashing / PhotoDNA works. It's true that they're not cryptographic hashes, but the false positive rate is vanishingly small. Apple claims it's 1 in a trillion [1], but suppose that you don't believe them. Google and Facebook and Microsoft are all using PhotoDNA (or equivalent perceptual hashing schemes) right now. Have you heard of some massive issue with false positives?
The fact of the matter is that unless you possess a photo that exists in the NCMEC database, your photos simply will not be flagged to Apple. Photos of your own kids won't trigger it, nude photos of adults won't trigger it; only photos of already-known CSAM content will trigger it (and even then, Apple requires a specific threshold of matches before a report is triggered).

[1] "The threshold is selected to provide an extremely low (1 in 1 trillion) probability of incorrectly flagging a given account." Page 4 of https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...
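To build intuition for why a match threshold drives the account-level false-positive rate so low, here is a back-of-the-envelope sketch. The per-image false-positive rate and photo count below are hypothetical, and treating each image as an independent trial is a simplification; Apple has not published its internal numbers.

```python
import math

def p_account_flagged(n_photos, per_image_fp, threshold):
    """Probability that an account holding n_photos innocent photos
    accumulates at least `threshold` false matches, modeling each
    image as an independent Bernoulli trial (a simplification).
    Summed in log space so huge binomial coefficients don't overflow."""
    total = 0.0
    log_p = math.log(per_image_fp)
    log_q = math.log1p(-per_image_fp)
    for k in range(threshold, n_photos + 1):
        log_term = (math.lgamma(n_photos + 1) - math.lgamma(k + 1)
                    - math.lgamma(n_photos - k + 1)
                    + k * log_p + (n_photos - k) * log_q)
        total += math.exp(log_term)
        if log_term < -745:  # remaining terms underflow to zero
            break
    return total

# Hypothetical numbers: even a generous one-in-a-million per-image
# false-positive rate becomes vanishingly small at the account level
# once ten independent matches are required.
print(p_account_flagged(20_000, 1e-6, 10))
```

The point is structural, not the exact numbers: requiring several independent matches turns a small per-image error rate into an astronomically small per-account one.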
No. The CSAM (Child Sexual Abuse Material) scanning is comparing hashes of photos about to be uploaded to iCloud against a specific set of images at NCMEC (National Center for Missing and Exploited Children) which are specific to missing and exploited children. It is not machine learning models looking for nudes or similar. It is not a generalized screening. If enough matched images are found, the images are flagged for manual verification. If the manual verification confirms that the images match specific images in the NCMEC database, law enforcement is informed.
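The flow just described can be sketched in a few lines. This is an illustrative toy, not Apple's implementation: the real system uses a perceptual hash (NeuralHash) that survives resizing and re-encoding, whereas SHA-256 here is only a stand-in, and the actual reporting threshold is not public.

```python
import hashlib

REPORT_THRESHOLD = 3  # hypothetical; Apple has not published the real value

def fingerprint(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash; the real system (NeuralHash) is
    # designed to match re-encoded copies, which SHA-256 cannot do.
    return hashlib.sha256(image_bytes).hexdigest()

def matches_before_review(photos, known_db):
    # Compare each photo's hash against the fixed database of known images;
    # only past the threshold would human review (and any report) begin.
    hits = [p for p in photos if fingerprint(p) in known_db]
    return len(hits) >= REPORT_THRESHOLD

# A new photo of your own child is, by construction, not in the database:
known_db = {fingerprint(b"known-image-%d" % i) for i in range(5)}
family_photo = b"first-bath.jpg bytes"
print(matches_before_review([family_photo], known_db))  # → False
```

Note there is no classifier anywhere in this sketch: nothing inspects what a photo depicts, only whether its hash equals one already in the database.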
Be aware that almost all cloud providers screen photos. Facebook reported 20 million images in 2020, Google reported half a million. Dropbox, Box, and many, many others report images. See https://www.missingkids.org/content/dam/missingkids/gethelp/... to see a complete list of companies that screen and report images.
The other thing Apple announced which is completely separate from the CSAM photo scanning is additional parental controls for the Messages app. If a parent opts in for their under-13 children, a machine learning model will look for inappropriate material and warn the child prior to showing the image. The child is also told that their parent will be flagged if the child looks at it anyway. For 13-18 year olds whose parents opted in, the teen is warned first about the content. If the teen continues past the warning the image is shown and no further action is taken. Parents are not flagged for children 13 and over. As I said, this is a parental control for pre-adult kids. It requires opt-in from the parents and has no law enforcement implications.
There are legitimate things to be concerned about, but 99% of internet discussion on this topic is junk.
Yes, if they wind up part of a child porn investigation. Your cloud account gets hacked. Some perv gets your images. He is then arrested and his "collection" added to the hash database... including your family photos.
Context often matters more than the nature of the actual content. Police acquire thousands of images with little hope of ever knowing where they originated. If they are collected by pervs, and could be construed as illegal in the hands of pervs, the images become child porn and can be added to the databases.
If you don't choose to upload to iCloud, nothing is uploaded to Apple at all.
If you do choose icloud upload (most do), they were being uploaded already and stored and may be available to law enforcement.
If you do upload to icloud, NOW they will be screened for matches with "known" images in a database, and if you have more than a threshold number of hits, you may be reported. This will happen on device.
Apple will also scan photos in their cloud system as well, from what I can tell (though once on-device scanning is working, less should land in the cloud).
Note that it is HIGHLY likely that Google Photos / Facebook / Instagram and others will be, or already are, doing similar scanning and reporting. Millions of reports go in each year.
No, because it only catches registered CSAM. However, if you sent your pictures to relatives etc., and somehow someone who was into CSAM got one of your pictures and later got arrested, your photo in their collection could theoretically be registered in the official archives. Then you might have something in your collection that matches one of the hashes of a known CSAM image (maybe enough matches to have the police come talk to you).
on edit: later on of course this will make a great article in some place like the Atlantic with a stolid monochromatic picture of your family in the lead-in and we will all read about it on HN and talk about how this was an obvious problem with the whole system (if it gets posted at the right time and gets enough upvotes).
> The worst part is: how do I put my money where my mouth is? Am I going back to using Linux on the desktop (2022 will be the year of Linux on the desktop, remember), debugging wifi drivers and tirelessly trying to make resume-from-suspend work?
Oh come on. Don't make it sound like it's that bad. Wifi has been a solved problem for a long time now, and you can buy Lenovo, System76 or Tuxedo if you want to make sure 100% of things work as expected. Don't be that guy.
I don't know why, but Ubuntu still manages to make installing updates a 50/50 chance of breaking the NVIDIA GPU drivers, which then means I have to reinstall them in 800x600 where the "Software Updater" window doesn't fit on screen anymore.
Also, getting full USB3 support on Ubuntu is still a struggle. On Windows and Mac, the same USB camera "just works". On Linux, I need to learn how to download the kernel sources, checkout the correct branch, and recompile uvcvideo with different URB parameters, or else I get random disconnects.
And of course, "apt-get source" will produce the source code for the 4.x kernel that Ubuntu 18 had when I installed it, but they since upgraded it to 5.x so "apt source" is now utterly useless.
If I had to summarize my Linux experience:

"Pain only makes you stronger"
You can indeed tell how tired and old the complaint is by the nature of what it complains about.
In 2021 we are instead lumbered with inconsistent support for hidpi displays, lack of DTMF in linphone, and Evolution’s option to disable pc beep on new message being a plugin.
But that was just the last week. Next week will be better and the fight for freedom is indeed an eternal struggle.
Second this. I've been using Ubuntu MATE on an RPi4 for a couple of weeks now. All went fine. Last week, I suddenly thought: I should connect my printer as well, and expected to have a slightly harder time, just like setting up my printer on my last Ubuntu PC.
Click-click-done. I didn't have a hard time, not even with connecting my printer. I'm almost disappointed a bit, since there's no way I'm a cool computer guy if it's this easy.
> The worst part is: how do I put my money where my mouth is? ..., debugging wifi drivers and tirelessly trying to make resume-from-suspend work?
Coincidentally, this is actually a good idea. Apart from using supported hardware (that others have checked actually works), contributing fixes for hardware that's not officially supported yet and hasn't been tested would benefit everyone in the future!
I remember having to dig through GitHub to find a repository that had the network drivers for my off-brand Chinese/Polish netbook (I'm somewhat poor and/or frugal). They actually worked, and turned a system that would otherwise not have any network connectivity into my daily driver for note taking. Now, the fact that I couldn't automate this lookup process, and that there's nothing out there that lets you check for these drivers more easily (think something along the lines of https://appdb.winehq.org/ but for drivers) or maybe try multiple ones in a row, was disappointing, because things felt needlessly hard. However, actually contributing or using the work of others isn't that much of a problem.
And, since the whole ecosystem is pretty much open, there's nothing actually keeping one from at least trying to address these problems for their particular configuration, apart from needing to learn how to do so. In a sense, working on open source is exactly putting your money where your mouth is, even if it's just opportunity costs.
+1. Linux support is pretty good these days... unless you're trying to run it on a Surface Pro or something. I'm just worried that I won't be able to find any alternative to the iPhone. The Librem and Pine phones look promising... it's just not clear how long it will be till they're stable enough to use as daily drivers.
I recently spent a full day trying to debug wifi on my dual-boot desktop with "newish" hardware. The wifi would occasionally drop 100% of packets, for between 5 seconds and 10 minutes, and then go back to normal. I tried installing different drivers and following some askubuntu forum posts, but nothing seemed to work.
I'm not an expert on hardware stuff so I haven't got the knowledge to dig deep and find what caused it. But on Windows this stuff "just works". My impression is that Windows has "solved" wifi.
In the end I just bit the bullet and connected the ethernet cable...
It's not bad at all, in fact: it's great!
I just switched from an M1 mac and an Intel MacBook Pro to a Manjaro desktop with Gnome 4. In several ways, I found the Gnome 4 user experience to be better than macOS Big Sur. Even 3 finger swipe gestures with the Apple Trackpad work fine to switch between desktops. And my PS5 DualSense controller? Even the touchpad works, in a Wine/Windows application under Linux of all things!
With proper cooling, the machine is near-quiet on light loads like browsing. The background noise in my house is generally higher than the idle fan noise.
It's obviously noisier with higher loads, but that's what you get with a beefy graphics card. (the CPU cooler has a 24 db upper limit).
I also had no issues with Bluetooth or AX WiFi. Resume-after-suspend works solidly too. The only hiccup I had was that my graphics card is too new (Radeon 6700 XT), so I had to get a newer Manjaro ISO from GitHub rather than the main website.
You omitted resume-from-suspend, which is still a frigging problem. At least some distros finally support UEFI, so your USB key install doesn't freeze on the first screen.
But yeah, wifi and nvidia are solved - just a black screen and a single-user-mode startup the first time, to deactivate the open-source drivers for some obscure reason (I understand they prefer them, but why do they crash... might as well just install the nvidia ones directly).
A year and a half ago, we tried to convert a well-spec'd Dell laptop from Windows to Ubuntu for a new dev. We couldn't get the wifi to work. The dev now uses a MacBook.
Whoever controls the hash list controls your phone from now on. Period. End of sentence.
Apple has not disclosed who gets to add new hashes to the list of CSAM hashes or what the process is to add new hashes. Do different countries have different hash lists?
Because if the FBI or CIA or CCCP or KSA wants to arrest you, all they need to do is inject the hash of one of your photos into the “list” and you will be flagged.
Based on the nature of the hash, they can’t even tell you which photo is the one that triggered the hash. Instead, they get to arrest you, make an entire copy of your phone, etc.
It’s insidious. And it’s stupid. That Apple is agreeing to do this is disgusting.
And it doesn’t make sense. If I were a pedophile and I took a new CSAM photo, how long would it take for that specific photo to get on the list? Months? Years? As long as pedophiles know that their phones are being scanned, they won’t use iPhones for their photos. And then it will be only innocent people like me that get scanned for CSAM and potentially getting that used against me in the future.
If they really cared about CSAM, this feature is useless and stupid. All it does is make regular people vulnerable to Big Brother tactics which we know already exist.
First: Apple has disclosed who gets to curate the hash list. The answer is NCMEC and other child safety organizations. https://twitter.com/AlexMartin/status/1424703642913935374/ph... Apple states point-blank that they will refuse any demands to add non-CSAM content to the lists.
Second: why can't the FBI / CCCP inject a hash into the list? Here's a tweet thread gamifying that scenario: https://twitter.com/pwnallthethings/status/14248736290037022... The short answer is that at some point an Apple employee must visually review the flagged photo, and confirm that it does represent CSAM content. If it does not, then Apple is under no legal obligation to report it.
Third: You claim that abusers will simply opt not to use iPhones to distribute their CSAM content rendering the feature useless. This is in fact not how things have played out on other platforms like Google and Facebook that do already scan for CSAM. These organizations report on the order of millions of flagged images per year. [1] Clearly the abusers have simply not moved on to a different platform.
What on earth? To clear up some of your false statements:
- You need several hash matches to trigger a review
- The reviewer can of course see what triggered the review (the visual derivative)
- The reviewer would see that the matches are not CSAM, and instead of the report being sent on to the NCMEC it would instead start an investigation of why these innocuous images were matched in the first place
- If the CIA or FBI or CCP wanted to arrest you, there are much easier ways than this
Apple basically controls your phone anyway, and has done for years, as they can issue patches and OS updates.
Also, you can turn iCloud Photos off - I've never used the thing in spite of owning various Apple devices. I do use Google Photos and doubt they are much different in terms of checking for CSAM.
I can imagine this very well being extended not just to match photos, but also metadata within the photos. And then the list of things it matches against is extended from CP to other content society deems undesirable.
After a 12 hour flight - that was of course delayed - Liam was pretty exhausted, but was looking forward to getting to his hotel in the center of Munich. He got to the front of the queue, and handed his passport over to the customs officer. The officer scanned Liam's passport, took it off the reader, and after 20 seconds asked "You flew from Los Angeles today?". Liam replied "Yes...". The customs officer, with his firm German accent, said "I need to check something with my colleague, wait here.". Not that there was anywhere Liam would go.
The customs officer came back with someone else who was slightly older and clearly more senior. The senior officer said "Come with me please", and led Liam to a room at the side of the customs hall. The officer said "Sit down please", indicating to the chair in front of the desk. The room looked like any other office, with a computer on a desk and chairs either side. The officer sat behind the desk and started typing something on the computer. After a few minutes he said "You are wanted by Interpol".
The customs officer explained to Liam that he had been flagged as a photo he had taken 6 months before included a known terrorist, and so by association Liam had been flagged. Liam asked how they accessed his photos - he is tech-savvy and only takes encrypted backups onto his own devices. The customs officer explained that they didn't need to, as this flagging had been done entirely by his device. The customs officer gave the date of the photo, and Liam found it on his device. He had been on holiday with his girlfriend in Paris, and they had taken a selfie. There was someone clearly visible behind them, and the customs officer explained that the facial recognition had identified this person. Due to privacy laws he wasn't able to say (or even see themselves) who this person was, only that they were on the highest German terrorist watch list.
From the photo it looked quite obvious that this person was just a passer-by who glanced at the couple just at the moment they were taking a photo. The customs officer took Liam's fingerprints and asked Liam questions about his trip to Paris - typing the answers into the computer - and then the computer decided that Liam could be released. However the customs officer told Liam that he would be closely monitored while in Germany, and may receive 'check in' calls from police. He told Liam he must answer them, otherwise a team would be dispatched to intercept him. Liam was then allowed to go on his way. He was only delayed by 45 minutes, but it wasn't a great start to his holiday in Germany. Three years later, when he visited Germany again, the same thing happened; at least he knew what to expect this time...
(This is partially based on something that actually happened to me. Nearly a decade ago my passport was stolen, and every time I go to Germany I need to have a fun conversation with customs officers. Every other country I've visited - including the US - lets me through without even mentioning it.)
>The worst part is: how do I put my money where my mouth is? Am I going back to using Linux on the desktop (2022 will be the year of Linux on the desktop, remember)
People really need to retire this meme. On the desktop in particular, as a dev environment, Linux is completely fine at this point. I can understand people not wanting to run a custom phone OS, because that really is a ton of work, but for working software developers Fedora, Ubuntu, or any mainstream distro is at this point largely hassle-free.
While I don't expect any Linux phone to become "mainstream" any time soon, it would be good if we had at least one "polished" alternative available.
PinePhone is still in beta and according to its own creators "aimed solely at early adopters"[1], while Librem 5 is experiencing supply chain issues, with backorder shipping now scheduled to resume in October[2].
There is a version of the Librem 5 which is made in USA and it's in stock and shipping now, but unfortunately outside of my budget[3].
I was also considering getting something like Fairphone and installing an alternative OS but looking at compatibility charts there are some things that may not work with one OS or another.
So, right now I can't have a daily driver that is not iOS or Android, I will hold onto my very old smartphone and hope that things will change in the next year or so. I'm working from home for the foreseeable future so I can wait a bit.
I hate Ubuntu from the bottom of my heart for breaking stuff and changing stuff that used to "just work" all the time, but 99.999% of the time that means "background stuff" that "normal users" never mess around with, and for normal users a "USB key -> install -> next, next, next -> finish -> reboot" just works.
I’ve been using Ubuntu and now pop os on a Thinkpad for a couple years now, and I don’t miss Windows (which I used since…well, DOS 6) at all. Quite the opposite. As time goes on, seeing what’s happening with MacOS and Windows, I’m more and more happy that my computer is actually my computer.
They can retire the meme when any measurable fraction of mainstream users are using Linux without a proprietary software layer provided by a big tech company.
Imagine taking a photo, or having in your gallery a photo, that a dear leader doesn't want to spread. Ten minutes later you hear a knock at your door. That's what I'm most worried about: how is this not creating the infrastructure to ensnare political dissidents?
Unless Apple can demonstrate that the techniques they are using are intrinsically specific to CSAM and to CSAM only--the techniques do not work for any other kinds of photo or text--slippery slope arguments are perfectly valid and cannot be denied.
Apple is a private company and as such its actions amount to vigilantism.
To anyone upset or offended by the Linux/nerd paragraph: please chill, and please forgive my tone.
I am a nerd myself indeed, and what I wanted to convey by this not-as-funny-as-expected paragraph was that "going full nerd" is not a solution. There are ways to protect your privacy that will not be available to less tech-savvy people, and it's a problem. The HN crowd will use Thinkpads with Arch on them, and phones with Graphene or whatever, but most people won't.
Yours,
Absolute nerd and lover of desktop Linux since SuSE 6.0
I doubt Apple has not thought about the PR & policy consequences of such an iPhone backdoor. For me, it's even more sad to see Apple using the fight against CSAM, a noble cause, as a shield and a way to convince the masses that breaking its promise to protect privacy is OK. "What happens in your iPhone stays on your iPhone [no longer]". There is no court oversight, no laws, it's automated mass surveillance.
I used to always get the latest and greatest iphone but with the politics and everything that's going on why would I want to spend more than the absolute minimum on my cellphone? There are plenty of wholesome things to spend money on other than tech.
From A Concrete-Security Analysis of the Apple PSI Protocol:
> Taking action to limit CSAM is a laudable step. But its implementation needs some care. Naively done, it requires scanning the photos of all iCloud users. But our photos are personal, recording events, moments and people in our lives. Users expect and desire that these remain private from Apple. Reciprocally, the database of CSAM photos should not be made public or become known to the user.
> Apple has found a way to detect and report CSAM offenders while respecting these privacy constraints. When the number of user photos that are in the CSAM database exceeds the threshold, the system is able to detect and report this. Yet a user photo that is not in the CSAM database remains invisible to the system, and users do not learn the contents of the CSAM database.
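The threshold behavior the excerpt describes — the system learns nothing below the threshold, and can decrypt what it needs above it — is typically built from threshold secret sharing. As a toy illustration only (this is textbook Shamir sharing over a prime field, not Apple's actual PSI construction):

```python
import random

PRIME = 2**61 - 1  # a Mersenne prime; plenty large for a toy demo

def make_shares(secret, threshold, n):
    # Hide `secret` in the constant term of a random degree-(threshold-1)
    # polynomial; each share is one point on that polynomial.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
            for x in range(1, n + 1)]

def recover(shares):
    # Lagrange interpolation at x=0 over GF(PRIME).
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

shares = make_shares(secret=42, threshold=10, n=30)
print(recover(shares[:10]))   # any 10 shares recover the secret → 42
print(recover(shares[:9]))    # 9 shares yield a garbage value, almost surely
```

With fewer than `threshold` shares, every candidate secret remains equally consistent with what was seen, which is the information-theoretic guarantee the quoted passage relies on.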
So Apple is going to take care of positive matches with highly reliable and trained personnel? Just like their highly trained personnel who kept the App Store clean of shitty apps? :')
Apple implemented a backdoor that scans your photos on your device, then alerts Apple and the authorities if there is a match against an un-auditable list of reference photos.
Currently it's been activated for CSAM only and only scans photos backed up to iCloud.
That's the framing I prefer and which much better explains the issue with it.
Apple is not the police. Apple is not an extension of the United States government. There is simply no reason for Apple to enable local scanning for any content whatsoever.
My common sense is tingling, telling me that Apple's eventual move will be one of malicious compliance: finally implementing E2EE in a way that provides them with plausible deniability and users with a much-desired privacy enhancement.
I turned off iCloud photos tonight. F** Apple. If there is a collision, then it gets manually reviewed by a human...so now my private pictures are on display for someone to see who I've not given permission. Just Say No.
On the internet almost everyone wants to extract money from kids and their parents, and they try to hook them with different mechanisms. That is also true for Apple, although they appeal to the protective instincts of their guardians.
I get why a safe environment is appealing. Parents know that their kids get milked by virtual goods in games or social media and don't know how to protect them from that. I think states are indeed responsible to set sensible boundaries for the industry to protect minors.
But this cannot lead to subjecting the whole net to it. Age verification is also not possible, so a protected environment is the way to go. The latter is difficult to advertise to developers, because they also know about corporate ambitions to get their hands on market share.
Google isn't even the worst actor, more aggressive corps like Amazon are far more destructive in this field, but there isn't a single corp that is guilty here, so legislation also needs to protect free spaces. While seemingly in contradiction, this is also extremely important for digital education of future generations, even more so than questionable content in my opinion. Most here might have been subjected to that as kids. Was it that bad as generally assumed? This is a threat that should not be overblown. Parents feeling guilty neglecting their kids are extremely vulnerable to this line of thinking, even if they don't neglect their kids at all.
Many countries have rules against cartels, but there is a conflict of interest here. No country likes to split their most successful companies for nothing in an international market. So nobody does.
[1] "The threshold is selected to provide an extremely low (1 in 1 trillion) probability of incorrectly flagging a given account." Page 4 of https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...
[+] [-] carabiner|4 years ago|reply
[+] [-] atatatat|4 years ago|reply
[+] [-] farmerstan|4 years ago|reply
Apple has not disclosed who gets to add new hashes to the list of CSAM hashes or what the process is to add new hashes. Do different countries have different hash lists?
Because if the FBI or CIA or CCP or KSA wants to arrest you, all they need to do is inject the hash of one of your photos into the “list” and you will be flagged.
Based on the nature of the hash, they can’t even tell you which photo is the one that triggered the hash. Instead, they get to arrest you, make an entire copy of your phone, etc.
It’s insidious. And it’s stupid. That Apple is agreeing to do this is disgusting.
And it doesn’t make sense. If I were a pedophile and I took a new CSAM photo, how long would it take for that specific photo to get onto the list? Months? Years? As long as pedophiles know that their phones are being scanned, they won’t use iPhones for their photos. And then it will be only innocent people like me who get scanned for CSAM, potentially having that used against us in the future.
If they really cared about CSAM, this feature is useless and stupid. All it does is make regular people vulnerable to Big Brother tactics which we know already exist.
[+] [-] fortenforge|4 years ago|reply
First: Apple has disclosed who gets to curate the hash list. The answer is NCMEC and other child safety organizations. https://twitter.com/AlexMartin/status/1424703642913935374/ph...
Apple states point-blank that they will refuse any demands to add non-CSAM content to the lists.
Second: why can't the FBI / CCP inject a hash into the list? Here's a tweet thread gaming out that scenario: https://twitter.com/pwnallthethings/status/14248736290037022...
The short answer is that at some point an Apple employee must visually review the flagged photo, and confirm that it does represent CSAM content. If it does not, then Apple is under no legal obligation to report it.
Third: you claim that abusers will simply opt not to use iPhones to distribute their CSAM content, rendering the feature useless. That is not how things have played out on other platforms like Google and Facebook, which already scan for CSAM: these organizations report on the order of millions of flagged images per year. [1] Clearly the abusers have not simply moved on to a different platform.
[1] https://www.businessinsider.com/facebook-instagram-report-20...
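For intuition about how this kind of matching behaves, here is a toy "average hash" in Python. It is emphatically not PhotoDNA or Apple's NeuralHash - those are far more robust to crops, re-encodes, and edits - but it shows the principle the comments above rely on: visually similar images produce hashes at a small Hamming distance, unrelated images land far apart, and a "match" means the distance falls under a tight threshold.

```python
# Toy "average hash" over an 8x8 grayscale image: one bit per pixel,
# set if that pixel is brighter than the image's mean. Illustrative
# only; real perceptual hashes are far more sophisticated.
def average_hash(pixels):
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming(h1, h2):
    """Number of bit positions where two hashes disagree."""
    return sum(a != b for a, b in zip(h1, h2))

# A synthetic "image" and a uniformly brightened copy of it...
img = [[(x * y) % 256 for x in range(8)] for y in range(8)]
brighter = [[min(p + 10, 255) for p in row] for row in img]
# ...versus an unrelated gradient image.
other = [[(x + 7 * y) % 256 for x in range(8)] for y in range(8)]

# The near-duplicate matches (tiny Hamming distance); the unrelated
# image does not.
assert hamming(average_hash(img), average_hash(brighter)) <= 4
assert hamming(average_hash(img), average_hash(other)) > 10
```

Note that a cryptographic hash would map even the slightly brightened copy to a completely unrelated digest; tolerating small perturbations while keeping unrelated images far apart is exactly what makes a hash "perceptual".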
[+] [-] robertoandred|4 years ago|reply
- You need several hash matches to trigger a review
- The reviewer can of course see what triggered the review (the visual derivative)
- The reviewer would see that the matches are not CSAM, and instead of the report being sent on to the NCMEC it would instead start an investigation of why these innocuous images were matched in the first place
- If the CIA or FBI or CCP wanted to arrest you, there are much easier ways than this
[+] [-] tim333|4 years ago|reply
Apple basically controls your phone anyway, and has done for years, since it can issue patches and OS updates.
Also, you can turn iCloud Photos off - I've never used the thing in spite of owning various Apple devices. I do use Google Photos and doubt it is much different in terms of checking for CSAM.
[+] [-] fy20|4 years ago|reply
After a 12 hour flight - that was of course delayed - Liam was pretty exhausted, but was looking forward to getting to his hotel in the center of Munich. He got to the front of the queue, and handed his passport over to the customs officer. The officer scanned Liam's passport, took it off the reader, and after 20 seconds asked "You flew from Los Angeles today?". Liam replied "Yes...". The customs officer, with his firm German accent, said "I need to check something with my colleague, wait here.". Not that there was anywhere Liam would go.
The customs officer came back with someone else who was slightly older and clearly more senior. The senior officer said "Come with me please", and led Liam to a room at the side of the customs hall. The officer said "Sit down please", indicating to the chair in front of the desk. The room looked like any other office, with a computer on a desk and chairs either side. The officer sat behind the desk and started typing something on the computer. After a few minutes he said "You are wanted by Interpol".
The customs officer explained to Liam that he had been flagged because a photo he had taken 6 months before included a known terrorist, and so by association Liam had been flagged too. Liam asked how they had accessed his photos - he is tech-savvy and only takes encrypted backups onto his own devices. The customs officer explained that they didn't need to, as this flagging had been done entirely by his device. The customs officer gave the date of the photo, and Liam found it on his device. He had been on holiday with his girlfriend in Paris, and they had taken a selfie. There was someone clearly visible behind them, and the customs officer explained that facial recognition had identified this person. Due to privacy laws he wasn't able to say (or even see himself) who this person was, only that they were on the highest German terrorist watch list.
From the photo it looked quite obvious that this person was just a passer-by who glanced at the couple at the moment they were taking the photo. The customs officer took Liam's fingerprints and asked him questions about his trip to Paris - typing the answers into the computer - and then the computer decided that Liam could be released. However, the customs officer told Liam that he would be closely monitored while in Germany and might receive 'check in' calls from police. He told Liam he must answer them, otherwise a team would be dispatched to intercept him. Liam was then allowed to go on his way. He was only delayed by 45 minutes, but it wasn't a great start to his holiday in Germany. Three years later, when he visited Germany again, the same thing happened - at least he knew what to expect this time...
(This is partially based on something that actually happened to me. Nearly a decade ago my passport was stolen, and every time I go to Germany I need to have a fun conversation with customs officers. Every other country I've visited - including the US - lets me through without even mentioning it.)
[+] [-] Barrin92|4 years ago|reply
people really need to retire this meme. On the desktop in particular, as a dev environment, Linux is completely fine at this point. I can understand people not wanting to run a custom phone OS, because that really is a ton of work, but for working software developers Fedora, Ubuntu, or whatever mainstream distro is at this point largely hassle-free.
[+] [-] mastazi|4 years ago|reply
The PinePhone is still in beta and, according to its own creators, "aimed solely at early adopters"[1], while the Librem 5 is experiencing supply chain issues, with backorder shipping now scheduled to resume in October[2].
There is a version of the Librem 5 made in the USA that is in stock and shipping now, but it's unfortunately outside my budget[3].
I was also considering getting something like Fairphone and installing an alternative OS but looking at compatibility charts there are some things that may not work with one OS or another.
So, right now I can't have a daily driver that is not iOS or Android, I will hold onto my very old smartphone and hope that things will change in the next year or so. I'm working from home for the foreseeable future so I can wait a bit.
[1] https://pine64.com/product/pinephone-beta-edition-linux-smar...
[2] https://shop.puri.sm/shop/librem-5/
[3] https://shop.puri.sm/shop/librem-5-usa/
[+] [-] ekianjo|4 years ago|reply
Thanks for depicting people who care about privacy and act on their beliefs as "total nerds", that's an encouraging attitude.
[+] [-] tuatoru|4 years ago|reply
Apple is a private company and as such its actions amount to vigilantism.
[+] [-] arespredator|4 years ago|reply
To anyone upset or offended by the Linux/nerd paragraph: please chill, and please forgive my tone.
I am a nerd myself indeed, and what I wanted to convey by this not-as-funny-as-expected paragraph was that "going full nerd" is not a solution. There are ways to protect your privacy that will not be available to less tech-savvy people, and that's a problem. The HN crowd will use ThinkPads with Arch on them, and phones with Graphene or whatever, but most people won't.
Yours, Absolute nerd and lover of desktop Linux since SuSE 6.0
[+] [-] dev_tty01|4 years ago|reply
https://techcrunch.com/2021/08/10/interview-apples-head-of-p...
[+] [-] severak_cz|4 years ago|reply
This is the best explanation of the whole situation I have read.
[+] [-] FabHK|4 years ago|reply
> Taking action to limit CSAM is a laudable step. But its implementation needs some care. Naively done, it requires scanning the photos of all iCloud users. But our photos are personal, recording events, moments and people in our lives. Users expect and desire that these remain private from Apple. Reciprocally, the database of CSAM photos should not be made public or become known to the user. Apple has found a way to detect and report CSAM offenders while respecting these privacy constraints. When the number of user photos that are in the CSAM database exceeds the threshold, the system is able to detect and report this. Yet a user photo that is not in the CSAM database remains invisible to the system, and users do not learn the contents of the CSAM database.
https://www.apple.com/child-safety/pdf/Alternative_Security_...
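The threshold behaviour described in that quote can be illustrated with classic Shamir secret sharing. Apple's actual construction (threshold private set intersection with safety vouchers) is considerably more involved, but the core property is the same: with fewer than t matching photos the server can recover nothing, while t or more matches make the secret reconstructible. A toy sketch (requires Python 3.8+ for `pow(x, -1, p)`), not Apple's protocol:

```python
# Toy Shamir secret sharing over a prime field. Any t shares recover
# the secret; fewer than t reveal nothing about it (information-
# theoretically). Illustrative only - not Apple's actual protocol.
import random

P = 2**61 - 1  # prime modulus

def make_shares(secret, t, n):
    """Split `secret` into n shares, any t of which reconstruct it."""
    # Random polynomial of degree t-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * -xj % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

secret = 123456789  # stands in for a per-account decryption key
shares = make_shares(secret, t=3, n=10)  # e.g. one share per matched photo
assert reconstruct(shares[:3]) == secret  # threshold reached: recoverable
# Below the threshold, interpolation yields an unrelated value
# (except with negligible probability, about 1/P).
assert reconstruct(shares[:2]) != secret
```

In a design like the one the quote describes, each matched photo would contribute one share of a per-account key, so the ability to decrypt anything at all only materialises once the match count crosses the threshold.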
[+] [-] id5j1ynz|4 years ago|reply
Currently it's activated for CSAM only, and it only scans photos backed up to iCloud.
That's the framing I prefer, and the one that much better explains the issue with it.
[+] [-] raxxorrax|4 years ago|reply
I get why a safe environment is appealing. Parents know that their kids get milked by virtual goods in games or social media and don't know how to protect them from that. I think states are indeed responsible to set sensible boundaries for the industry to protect minors.
But this cannot lead to subjecting the whole net to it. Age verification is also not possible, so a protected environment is the way to go. The latter is difficult to advertise to developers, because they also know about corporate ambitions to grab market share.
Google isn't even the worst actor here; more aggressive corps like Amazon are far more destructive in this field. But no single corp is guilty on its own, so legislation also needs to protect free spaces. While seemingly a contradiction, this is also extremely important for the digital education of future generations - even more so, in my opinion, than shielding kids from questionable content. Most here were probably exposed to such content as kids. Was it as bad as generally assumed? This threat should not be overblown. Parents who feel guilty about neglecting their kids are extremely vulnerable to this line of thinking, even if they don't neglect their kids at all.
Many countries have rules against cartels, but there is a conflict of interest here: no country likes to break up its most successful companies for nothing in an international market. So nobody does.