This distinction of changing the de facto ownership of your device and data is the real inflection point. The surveillance technology itself is not really that novel; functionally, it applies established anti-virus techniques to data instead of code. Ask any AV company how their detection works, and the answer will include a variation on this.
This same tech can (and likely will) be used to find the owners of bitcoin and other cryptocurrency wallets, honeypot tokens, and community identities, and to provide profiling information to the company's political masters. The collisions in the hashing scheme mean that anything you want can be planted on people's devices to get them pulled into the legal system once it is flagged, where the process itself is the punishment. The whole scheme is too stupid to ever have been about reason; it's just pretexts and narrative, and this is as good a time as any to exit their ecosystem.
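The collision point is easy to demonstrate with a toy perceptual hash. The sketch below assumes nothing about Apple's actual NeuralHash; it uses a deliberately simple "average hash" stand-in to show the general property: any scheme that maps many images down to a short fingerprint must map some visibly different images to the same value.

```python
# Toy "perceptual hash" (average hash): 1 bit per pixel of an 8x8
# grayscale image, set if the pixel is brighter than the image mean.
# Illustrative only -- NOT Apple's NeuralHash.

def average_hash(pixels):
    """Return a 64-bit fingerprint for a flat list of 64 pixel values."""
    mean = sum(pixels) / len(pixels)
    return sum((1 << i) for i, p in enumerate(pixels) if p > mean)

# Two clearly different images (different pixel values throughout)...
img_a = [10] * 32 + [200] * 32   # dark top half, bright bottom half
img_b = [90] * 32 + [250] * 32   # brighter everywhere, same rough shape

# ...produce the identical fingerprint, i.e. a collision.
print(average_hash(img_a) == average_hash(img_b))  # True
```

The same pigeonhole argument applies to any fixed-size fingerprint; whether a particular scheme's collisions are easy to construct adversarially is a separate, empirical question.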
Apple really picked the wrong time to attempt this, as I do not see anyone who understands how evil this is ever forgiving them. The most charitable thing I can say is that they're probably doing it as part of a deal to avoid antitrust plays: Apple plays ball with the feds and its parties, and the storm magically passes them over. The good news is running OSX made me lazy, and getting back into running a linux or freebsd laptop again is going to be fun.
> The surveillance technology itself is not really that novel
What's novel is that the tech reports you to the authorities. Imagine your AV reporting you to the authorities for digital piracy; it's something the RIAA could only dream of back in the day. Now it's becoming a reality.
> I do not see anyone who understands how evil this is ever forgiving them.
I'm not so optimistic.
How many people among Apple's users actually understand how evil this is, and among those, how many do really, actually care? People seem fine enough with Facebook's data vacuuming, why would they protest against Apple's "non-intrusive" scheme? They "don't hate children" and, of course, "have nothing to hide".
The issue, as has been brought up in one form or another in the numerous threads on the subject, is that people like their comfort (using smartphones) and there really isn't that much of a choice.
> The good news is running OSX made me lazy, and getting back into running a linux or freebsd laptop again is going to be fun.
And therein lies the rub. Many people wouldn't find doing this fun. They'd much prefer being able to watch Netflix in ultra high-def and not having to futz around with Nvidia's drivers or what have you.
"get them pulled into the legal system once it is flagged, where the process itself is the punishment"
This is the real threat here. Anyone can have data flagged at any time, by accident or maliciously. Like how any video can be flagged for copyright infringement and the creator is 'punished by the process' regardless of guilt or innocence. A possible fix would be severe financial punishment for every false claim (let's say a million bucks per instance). Imagine how carefully the system would be designed if that were the case, versus the case where there is no punishment for false claims.
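A rough back-of-envelope shows how such a penalty would force accuracy. Every number below is an illustrative assumption, not a measured rate:

```python
# Hypothetical deterrence math for the per-false-claim penalty proposal.
PENALTY = 1_000_000               # dollars per false claim (the proposal above)
SCANS_PER_YEAR = 10_000_000_000   # assumed photos scanned per year

for fp_rate in (1e-6, 1e-9, 1e-12):
    liability = SCANS_PER_YEAR * fp_rate * PENALTY
    print(f"false-positive rate {fp_rate:g}: expected liability ${liability:,.0f}/yr")
```

Under these assumed volumes, even a one-in-a-billion false-positive rate carries a multi-million-dollar annual exposure, which is exactly the incentive the proposal is after.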
> This distinction of changing the de facto ownership of your device and data is the real inflection point.
So the ability to store child porn is what constitutes "de facto ownership" in your mind?
But why would they "use this tech to hunt down bitcoin owners"? They could just scan emails or photos directly. Doing it by way of neural hashes and vouchers seems like an absurdly complicated detour when they already own the OS and all the most commonly used apps.
>So what happens when, in a few years at the latest, a politician points that out, and—in order to protect the children—bills are passed in the legislature to prohibit this "Disable" bypass, effectively compelling Apple to scan photos that aren’t backed up to iCloud? What happens when a party in India demands they start scanning for memes associated with a separatist movement? What happens when the UK demands they scan for a library of terrorist imagery? How long do we have left before the iPhone in your pocket begins quietly filing reports about encountering “extremist” political material, or about your presence at a "civil disturbance"? Or simply about your iPhone's possession of a video clip that contains, or maybe-or-maybe-not contains, a blurry image of a passer-by who resembles, according to an algorithm, "a person of interest"?
What I don't get is what prevented these things from happening last month? Apple controls the hardware, the software, and the cloud services, so the point at which the scanning is done is mostly arbitrary from a process standpoint (I understand people believe there are huge differences philosophically). They could have already scanned our files because they already have full control over the entire ecosystem. If they can be corrupted by authoritarian governments, then shouldn't we assume they have already been corrupted? If so, why did we trust them with full control of the ecosystem?
In years past, take the San Bernardino shooter case for instance, Apple argued in court that creating backdoors or reversible encryption was insecure, subject to exploitation by malicious actors, and therefore "unreasonably burdensome". They also argued that compelling them to write backdoors violated the First Amendment.
It was most likely a winning strategy that the FBI actively avoided getting rulings on and found a workaround.
What Apple is creating here is an avenue for the FBI, NSA, or any other alphabet agency to obtain a FISA warrant and an NSL to mandate hits on anything. The argument that it has to be pre-iCloud-upload, or subject to manual review, or gated on some arbitrary threshold, is just the marketing to get the public to accept it.
All of that can easily be ordered to be bypassed. So it can become: scan, single hit for X, report.
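To make the point concrete, here is a hedged sketch (the function name, threshold, and flow are invented for illustration, not Apple's actual implementation) of why the threshold and review steps are policy, not protection:

```python
# Illustrative only: the advertised safeguards (match threshold, human
# review) are function parameters, not cryptographic guarantees.

def scan_and_report(photo_hashes, blocklist, threshold=30, review=True):
    """Return the action a scanning pipeline would take."""
    matches = [h for h in photo_hashes if h in blocklist]
    if len(matches) < threshold:
        return "nothing"
    return "human review" if review else "report to authority"

photos = ["aaaa", "bbbb", "cccc"]
blocklist = {"bbbb"}

# Advertised behaviour: a single match is below the threshold.
print(scan_and_report(photos, blocklist))                             # nothing
# After a compelled one-line config change: scan, single hit, report.
print(scan_and_report(photos, blocklist, threshold=1, review=False))  # report to authority
```

Nothing in the architecture prevents the second call; only policy does, and policy is what a warrant changes.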
I'll take the downvotes, but if anything, someone more conspiracy-minded could easily take this as a warrant canary. Given the backlash Apple has faced and ignored, it doesn't make much business sense for them not to back off unless they are:
A) betting on it being a vocal minority that resorts to action (which is entirely possible, especially given the alternatives and the technical hurdles to get to a suitable alternative), or
B) being pressured by governments now (also entirely possible given their history with the FBI and previous investigations).
It isn't a philosophical debate. It's about invading and controlling someone else's property. I can't shack up in your home and eat your food just because I feel like it. We're all doomed because digital natives have no concept of the boundary between something they own and something someone rents or lets them use for free in exchange for data mining.
I think “we don’t have the machinery to do that” is an effective argument in the real world when someone asks you to do something. I’m not sure if it matters legally (lawyers sometimes use vague phrases like “reasonable effort”), but it definitely affects how strongly people will pressure you to do things, and how likely you are to acquiesce to that pressure.
The scope of the change Apple would need to make to scan your photos arbitrarily just got a lot smaller. The number of engineers who would need to be “in the know” to implement this change got smaller. The belief from governments that Apple has the option of doing this got stronger. The belief among Apple’s own management team that they can do this got stronger.
Because that door hasn’t been opened yet. “Scan every photo on users devices” or “scan for non-CSAM” are much easier requests once they’ve already started scanning on-device.
This entire argument is a non sequitur and comes up like clockwork every time this issue is discussed. It's the metaphorical equivalent of saying "well someone could've snuck in through the open window. Let's just assume they did and leave the doors open as well".
How about instead we push back against Apple further shifting the Overton window on how acceptable it is for companies to run intrusive services on hardware we own?
> What I don't get is what prevented these things from happening last month? Apple controls the hardware, the software, and the cloud services...
Simple: Money.
Their response to any such demands would be (and has been) "we don't have the capability to do what you're asking".
No judge is going to burden them with spending their own dime to build a massive new feature like this and deploy it to every phone out there in order to comply with a demand arising from the prosecution of an individual case.
No government is going to pony up the money to reimburse them to do it (not even getting into the PR optics).
That leaves it happening only if 1) they decide to do it themselves, or 2) government(s) legislate they must.
So far #2 hasn't happened. Politicians had no frame of reference to point to and say "Your competitors are doing that, you should too".
But now that #1 occurred, it will normalize this nonsense and pave the way for #2.
I think that quite a few engineers are too focused on the technical aspects of it, and specifically on all those "barriers to misuse" that Apple claims to have in place. But it'll be much easier to remove the barriers once the system as a whole is in place.
There is a fairly large difference, the first being that it would massively damage Apple's brand if they started scanning people's phones without permission.
But now that they've built the system to scan things on-device, they can be compelled by a government to scan for other things, and Apple can throw up their hands and say they had no choice.
They did do it in emails since 2019: https://www.indiatoday.in/technology/news/story/apple-has-be...
> They could have already scanned our files because they already have full control over the entire ecosystem.
Apple barely submits any CSAM[0]:
> According to NCMEC, I submitted 608 reports to NCMEC in 2019, and 523 reports in 2020. In those same years, Apple submitted 205 and 265 reports (respectively). It isn't that Apple doesn't receive more picture than my service, or that they don't have more CP than I receive. Rather, it's that they don't seem to notice and therefore, don't report.
0: https://www.hackerfactor.com/blog/index.php?/archives/929-On...
That's rational, but the point he's making is that this system obliterates the only defense we have had or could have against such activity: end-to-end encryption. This approach owns the endpoint.
> What I don't get is what prevented these things from happening last month? Apple controls the hardware, the software, and the cloud services...
Yes, proprietary black-box hardware and software is poor from a user-privacy perspective. But if Apple had quietly begun on-device scanning of content, I'd imagine someone would eventually notice the suspicious activity and investigate.
With Apple's announcement, the scanning will just be something that Apple devices do. Nothing to worry about. And, no way for anyone to independently verify that the scope of the content being scanned has not been secretly increased.
As for iCloud, if your content is not encrypted on the device in a manner where only you hold the keys, any cloud storage is suspect for scanning and data mining. But on-device scanning is a back door for E2E encryption: even on-device encryption with keys only you control is thwarted.
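The "back door for E2E" claim can be sketched in a few lines. In this toy (the blocklist, the exact-match hash, and the stand-in cipher are all illustrative, not any real protocol), the scan runs on plaintext before encryption, so the strength of the cipher never matters:

```python
from hashlib import sha256

# Hypothetical on-device blocklist of fingerprints.
BLOCKLIST = {sha256(b"target-content").hexdigest()}

def e2e_send(plaintext: bytes, encrypt):
    """Client-side scanning: the scan sees plaintext before `encrypt` runs."""
    flagged = sha256(plaintext).hexdigest() in BLOCKLIST
    ciphertext = encrypt(plaintext)   # cipher strength is irrelevant to the scan
    return ciphertext, flagged

ciphertext, flagged = e2e_send(b"target-content", encrypt=lambda p: p[::-1])
print(flagged)  # True: flagged on-device, even though only ciphertext leaves
```

Swap in a perfect cipher for the lambda and the result is the same, which is the whole point: the endpoint, not the pipe, is what was compromised.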
This is a very well-written post. Ever since this program has been announced I have struggled with talking about the implications succinctly.
Online, I never know if an interlocutor is even arguing in good faith. Even in person, it's difficult to balance the need to be concise against explaining all the ways the claimed safeguards are meaningless, how the benefits don't really make sense, how this is markedly different from other infringements on privacy, and how the real problems aren't just theoretical, because similar invasions of privacy are already killing actual people around the world.
Anyway, I think the only practical way this could resolve well is if Apple saw a precipitous decline in its iCloud brand; then it could be argued that they had to abandon this plan for purely business reasons. A serious movement to abandon Apple services ($17.5B revenue in Q3 2021) might empower the people within Apple who opposed this reckless plan from the beginning.
Australia has already shown what the end-game is, with its Assistance and Access Act 2018 [1]. It's not illegal to have end-to-end encryption, but it is illegal to deny access to the ends of the encrypted pipe.
As an aside, Australia has just implemented the next step: the Surveillance Legislation Amendment (Identify and Disrupt) Bill 2021 [2], which makes it legal to hack your device to access the ends of the pipe. Useful if the ends of the pipe are not controlled by a malleable corporation.
[1] https://www.homeaffairs.gov.au/about-us/our-portfolios/natio...
[2] https://www.aph.gov.au/Parliamentary_Business/Bills_Legislat...
This actually sounds like the right way to go. Individual, warrant-based access is comparable to wiretapping in a way that Apple's dragnet approach is not.
The media drastically overreacted to that act, to the point where the Department of Home Affairs now has an entire page dedicated to addressing the false reporting [0].
The TL;DR is that the act doesn't allow the government to introduce mass surveillance. Section 317ZG [1] expressly forbids any law enforcement request from _having the effect_ of introducing any systemic vulnerability or weakness and _explicitly_ calls out new decryption capabilities as under that umbrella. Your claim that a company can't deny access to the ends of an e2e-encrypted pipe is false.
And yes, that new act exists. The government will be able to hack into your devices and take over your accounts _with a warrant_, just like they can break into your house or take money from your bank account _with a warrant_.
[0]: https://www.homeaffairs.gov.au/about-us/our-portfolios/natio...
[1]: http://classic.austlii.edu.au/au/legis/cth/consol_act/ta1997...
I have a modest suggestion in the spirit of Apple's move.
As we know, there are people in the world who are running meth labs or creating explosives for terrorists in their homes. In order to safeguard the public, we shall have a detachment of dogs which will sniff everyone's houses every once in a while. When they sense something bad they'll alert their handlers and there'll be a manual inspection before reporting to police.
There's no risk to privacy here - dogs being dogs can't tell their handlers what they sense. We can also show the training publicly so people can verify the iDogs are trained to only sense drugs or explosives. So it's all even more secure than Apple's iPhone scanning! What say you?
I understand you're making a reductio ad absurdum argument here, but this is actually very similar to what LEO often tries to do today (e.g. searches based on what is smelled / seen inside your car at a traffic stop) and actually iDog might be constitutional.
The constitutional standard for a warrant search is "probable cause", and for a warrantless search you generally also need exigent circumstances. Assuming that a judge is sufficiently satisfied with the iDog's nose, and the iDog was sniffing somewhere public like the sidewalk when it found the meth smell, you could likely establish both probable cause (iDog smells meth) and exigent circumstances (meth labs often blow up, meaning there's emergent danger that cannot risk waiting for a warrant).
That's not to excuse Apple, just to provide a fun backstory on the things law enforcement gets to do in this country.
Another one that was nearly deemed constitutional: in Kyllo v. United States, LEOs used thermal imaging to find that an Oregon man's house was radiating a high amount of heat indicative of intense grow lights, which they used as probable cause to search the home for an illegal pot-growing operation. This was only held unconstitutional by a 5-4 decision in the Supreme Court. If it had been found constitutional, you can imagine we'd have helicopters flying overhead thermal-imaging for pot operations today.
The article mentions the slippery slope and "what happens in a year or two when..." scenarios; it even calls it a cliff. But it doesn't expand on the timelines of concern.
As it currently stands, this concept would be sitting in plain sight waiting eternally for any lawmaker anywhere.
In your country, either side of the political spectrum - with a majority in lawmaking - can simply tap Apple on the shoulder and potentially turn ALL those devices against you.
Guns won't help when information technology is used against you, when you are separated from society in a manner where people dare not risk their own livelihood for fear of being similarly marked.
And if Apple goes ahead with this, this risk is sitting there for the rest of your life just waiting for [that one politician that represents everything you hate] to use it against you.
Maybe that politician hasn't been born yet. But they will come. Don't let this Pandora's Box sit waiting for them.
It's the same reason free speech is treated as something akin to sacred even for your worst enemies: those who start taking away the bad people's speech are themselves always one political actor away from having their own taken away.
For the privacy-minded Apple users among us (I mean, that's who they marketed to, yeah?), I'd recommend turning off automatic software updates... For as long as it makes sense to. I hope they reverse their decision, but I'm already looking for alternatives. I'm certainly not buying another Apple device, even though I'm about due.
They really lost a lot of fans with this, myself included.
In the essay the "i" in the headline is lower case, which is significant and chilling. It's a homunculus of Apple's new direction: the meaning changed from "me" to "panopticon".
I don't know... I have a really hard time getting too upset about this. I'm a big proponent of privacy and have always been a Snowden supporter. And while "protecting children" is a trope in politics, I think everyone with an iPhone knows they're giving up some privacy to own one. It's constantly tracking their location and sending other data to Apple.
This isn't a government agency. Apple has been incredibly thoughtful about privacy in the past, and I feel like they've earned the benefit of the doubt here.
I hope I'm not wrong, but I don't see how this is insane. They're just making sure the files you upload to them aren't illegal.
I stopped work on a memo app that was piggybacking on Apple's branding around privacy. I'm going to wait a year to see how this shakes out. Super disappointed.
IMHO this is really another continuation of the "you will own nothing and be happy" trend that has been around for a while, but companies have started to really push in the last few years. Slowly eroding ownership and normalising mass surveillance is their goal, so they can continue to extract more $$$ out of you.
When Apple sells yet again another record amount of iPhones next quarter which device should we move to?
Fundamentally this illustrates that software has become too inherently intrusive. What’s the solution, though, that could ever be mainstream?
The other issue is that software has become too complicated and too many (potentially) bad things are happening in the background. How can the layperson fight back?
I think iPhone is superior and would hate to leave it. Although I barely do much with my phone outside 2FA, browsing, and texting. A switch won't be too bad in that regard.
I do love my MacBooks, though. So while these privacy invasions make me angry, I'm not willing to drop Apple altogether. I've been meaning to keep most of my sensitive information on a usually-disconnected Linux machine anyway. I'll keep using my MacBook for development.
I see this Apple move as a warning.
I have lived part of my life under a communist regime.
For me my Apple addiction ends here. There is no "magic" left in their products, only "bait & switch" dark patterns.
No hardware or UX will lure me again to suppress my instincts.
This is the beginning of global politically and financially motivated race for public control.
Apple is just giving a spark to the fire. Imagine a future in which your beloved Face ID will be tied to everything, and your beloved iDevices, Teslas, or home appliances will be scanning and reporting, scanning and reporting. There is no middle ground in this for me. No benefits or conveniences are so important. FOSS and public oversight of software must be demanded by law.
I don’t understand why this outrage seems so US-centric. This is the same Apple that hands over all your iCloud data (photos and otherwise) to the CCP if you happen to live in China. And they’ve done this openly for the last several years.
What am I missing? Isn’t that a much much much much worse thing for Apple to do? Why are we only suddenly suspicious of Apple’s privacy claims with this matter?
> the fact that, in just a few weeks, Apple plans to erase the boundary dividing which devices work for you, and which devices work for them.
A very overdramatic sentence.
It is a bit scary to realise that people only now think this border is being crossed. It was crossed a very long time ago.
In the first years of Android, the owners were the product, not the phone. Privacy features in recent years might have improved this a little.
Google’s massive success across many services is built on how phones and their software collected data for them. User interfaces are just illusions for non-tech persons; they might give you a sense of control.
Now that Apple does not trust us with CSAM material, the end is near. There are arguments for both sides, and many are taking sides to just get attention.
However, you can only solve this problem with politics.
Privacy advocates need to be like Second Amendment activists. We need to use their playbook. They raise a big stink about anything, no matter how big or small, that could curtail their rights. No number of Sandy Hook events will result in meaningful changes in laws.
Pushing everyone to Linux will eventually lead to all hardware falling under some national-security law, where hardware may be imported only if it restricts which OSs can be installed and ships with a locked bootloader.
The free market has no impact here; the masses don't care. And privacy supporters are too logical to whip up any kind of movement.
Till privacy advocates come up with emotional reasons why privacy is absolutely necessary (like grandma is gonna die without it), this is a losing battle.
[+] [-] motohagiography|4 years ago|reply
This same tech can (and will likely) be used to find the owners of bitcoin and other cryptocurrency wallets, honeypot tokens, community identities, and to provide profiling information to the company's political masters. The collisions in the hashing scheme mean that you can insert anything you want onto peoples devices and get them pulled into the legal system once it is flagged, where the process itself is the punishment. The whole scheme is too stupid to ever have been about reason, it's just pretexts and narrative, and this is as good a time as any to exit their ecosystem.
Apple really picked the wrong time to attempt this, as I do not see anyone who understands how evil this is ever forgiving them. The most charitable thing I can say about it is that they're probably just doing it as part of a deal to avoid anti-trust plays, where Apple plays ball with the feds and its parties, and the storm just magically passes them over. The good news is running OSX made me lazy, and getting back into running a linux or freebsd laptop again is going to be fun.
[+] [-] armada651|4 years ago|reply
What's novel is that the tech reports you to the authorities. Imagine your AV reporting you to the authorities for digital piracy, it's something that RIAA could only dream of back in the day. Now it's becoming a reality.
[+] [-] vladvasiliu|4 years ago|reply
I'm not so optimistic.
How many people among Apple's users actually understand how evil this is, and among those, how many do really, actually care? People seem fine enough with Facebook's data vacuuming, why would they protest against Apple's "non-intrusive" scheme? They "don't hate children" and, of course, "have nothing to hide".
The issue, as has been brought up in one form or another in the numerous threads on the subject, is that people like their comfort (using smartphones) and there really isn't that much of a choice.
> The good news is running OSX made me lazy, and getting back into running a linux or freebsd laptop again is going to be fun.
And therein lies the rub. Many people wouldn't find doing this fun. They'd much prefer being able to watch Netflix in ultra high-def and not having to futz around with Nvidia's drivers or what have you.
[+] [-] Eddy_Viscosity2|4 years ago|reply
This is the real threat here. Anyone can have data flagged at any time, by accident or maliciously. Like how any video can be flagged for copyright infringement and the creator is 'punished by the process' regardless of guilt/innocence. A possible fix would be to have severe financial punishments for every false claim (lets say a million bucks per instance). Imagine how careful the system would be designed if that were the case, verses the case where there is no punishment for false claims.
[+] [-] hypothesis|4 years ago|reply
[+] [-] Grustaf|4 years ago|reply
So the ability to store child porn is what constitutes "de facto ownership" in your mind?
But why would they "use this tech to hunt down bitcoin owners"? They could just scan emails or photos directly. Doing it by way of neural hashes and vouchers seems like an absurdly complicated detour when they already own the OS and all the most commonly used apps.
[+] [-] mucholove|4 years ago|reply
[+] [-] ytetsuro|4 years ago|reply
This is good news.
[+] [-] slg|4 years ago|reply
What I don't get is what prevented these things from happening last month? Apple controls the hardware, the software, and the cloud services so the point at which the scanning is done is mostly arbitrary from a process standpoint (I understand people believe there are huge differences philosophically). They could have already scanned our files because they already have full control over the entire ecosystem. If they can be corrupted by authoritative governments, then shouldn't we assume that have already been corrupted? If so, why did we trust them with full control of the ecosystem?
[+] [-] croutonwagon|4 years ago|reply
It was most likely a winning strategy that the FBI actively avoided getting rulings on and found a workaround.
What apple is creating here is an avenue for the FBI/NSA/Alphabet agency to create a FISA warrant and NDL to mandate hits on anything. The argument its gotta be pre-icloud upload or subject to manual review or on some arbitrary threshold is something is just the marketing to get the public to accept it.
All of that can easily be ordered to be bypassed. So it can be a scan, single hit for x, report.
Ill take the downvotes, but if anything, someone more conspiracy minded could easily take this as a warrant canary. Given the backlash apple ahs faced and ignored, it doesnt make much good business sense for them not to back off unless they are
A) betting on it being a vocal minority to resorts to action (which is entirely possible, especially given the alternatives and technical hurdles to get to a suitable alternative)
B) Being pressured by governments now. (also entirely possible given their history with the FBI and previous investigations).
[1] https://www.rpc.senate.gov/policy-papers/apple-and-the-san-b...
[2] https://en.wikipedia.org/wiki/FBI%E2%80%93Apple_encryption_d...
[+] [-] kevin_thibedeau|4 years ago|reply
[+] [-] fshbbdssbbgdd|4 years ago|reply
The scope of the change Apple would need to make to scan your photos arbitrarily just got a lot smaller. The number of engineers who would need to be “in the know” to implement this change got smaller. The belief from governments that Apple has the option of doing this got stronger. The belief among Apple’s own management team that they can do this got stronger.
[+] [-] barsonme|4 years ago|reply
It’s just how life and politics work.
[+] [-] noptd|4 years ago|reply
How about instead we push back against Apple further shifting the Overton window on how acceptable it is for companies to run intrusive services on hardware we own?
[+] [-] rkagerer|4 years ago|reply
Simple: Money.
Their response to any such demands would be (and has been) "we don't have the capability to do what you're asking".
No judge is going to burden them to spend their own dime to build a massive new feature like this and deploy it to every phone out there to comply with a demand arising from a prosecutor of an individual case.
No government is going to pony up the money to reimburse them to do it (not even getting into the PR optics).
That leaves it happening only if 1) they decide to do it themselves, or 2) government(s) legislate they must.
So far #2 hasn't happened. Politicians had no basis of reference to point to and say "Your competitor(s)' doing that, you should too".
But now that #1 occurred, it will normalize this nonsense and pave the way for #2.
[+] [-] int_19h|4 years ago|reply
https://news.ycombinator.com/item?id=28239506
I think that quite a few engineers are too focused on the technical aspects of it, and specifically on all those "barriers to misuse" that Apple claims to have in place. But it'll be much easier to remove the barriers once the system as a whole is in place.
[+] [-] diebeforei485|4 years ago|reply
But now that they've built the system to scan things on-device, they can be compelled by a government to scan for other things, and Apple can shrug their hands and say they had no choice.
[+] [-] NicoJuicy|4 years ago|reply
They did do it in emails since 2019: https://www.indiatoday.in/technology/news/story/apple-has-be...
[+] [-] judge2020|4 years ago|reply
Apple barely submits any CSAM[0]:
> According to NCMEC, I submitted 608 reports to NCMEC in 2019, and 523 reports in 2020. In those same years, Apple submitted 205 and 265 reports (respectively). It isn't that Apple doesn't receive more picture than my service, or that they don't have more CP than I receive. Rather, it's that they don't seem to notice and therefore, don't report.
0: https://www.hackerfactor.com/blog/index.php?/archives/929-On...
[+] [-] polishdude20|4 years ago|reply
[+] [-] xunn0026|4 years ago|reply
[deleted]
[+] [-] swiley|4 years ago|reply
[+] [-] lvs|4 years ago|reply
[+] [-] sillystuff|4 years ago|reply
Yes, proprietary black-box hardware and software is poor from a user privacy perspective. But, If Apple began on-device scanning of content, I'd imagine eventually someone would notice the suspicious activity and investigate.
With Apple's announcement, the scanning will just be something that Apple devices do. Nothing to worry about. And, no way for anyone to independently verify that the scope of the content being scanned has not been secretly increased.
As for icloud, if your content is not encrypted on the device in a manner where only you have the keys, any cloud storage is suspect for scanning / data mining. But, on-device scanning is a back door for e2e encryption-- even on device encryption with keys only you control is thwarted.
[+] [-] rz2k|4 years ago|reply
Online, I never know if an interlocutor is even arguing in good faith, but even in person it's difficult to balance talking about all the ways that the claimed safeguards are meaningless, how the benefits don't really make sense, how this is markedly different from other infringements on privacy with the need to be concise and explain that the real problems aren't just theoretical because similar invasions of privacy are killing actual people around the world already.
Anyway, I think the only practical way that this could resolve well, is if Apple saw a precipitous decline in its iCloud brand, then it could be argued that they had to abandon this plan for purely business reasons. A serious movement to abandon Apple services ($17.5B revenue in 2021 q3), might empower the people within Apple who opposed this reckless plan from the beginning.
[+] [-] femto|4 years ago|reply
As an aside, Australia has just implemented the next step: the "Surveillance Legislation Amendment (Identify and Disrupt) Bill 2021" [2], which makes it legal for authorities to hack your device to access the ends of the pipe. Useful if the ends of the pipe are not controlled by a malleable corporation.
[1] https://www.homeaffairs.gov.au/about-us/our-portfolios/natio...
[2] https://www.aph.gov.au/Parliamentary_Business/Bills_Legislat...
[+] [-] torified|4 years ago|reply
Australia is a bad joke at this point. I thought the nanny state was bad, but we're firmly moving into Stasi territory now.
Even the government's obvious incompetence no longer looks like enough protection in the face of all these overreaches.
[1] https://mobile.twitter.com/efa_oz/status/1430674903548661767
[+] [-] Youden|4 years ago|reply
The media drastically overreacted to that act, to the point where the Department of Home Affairs now has an entire page dedicated to addressing the false reporting [0].
The TL;DR is that the act doesn't allow the government to introduce mass surveillance. Section 317ZG [1] expressly forbids any law enforcement request from _having the effect_ of introducing any systemic vulnerability or weakness and _explicitly_ calls out new decryption capabilities as under that umbrella. Your claim that a company can't deny access to the ends of an e2e-encrypted pipe is false.
And yes, that new act exists. The government will be able to hack into your devices and take over your accounts _with a warrant_, just like they can break into your house or take money from your bank account _with a warrant_.
[0]: https://www.homeaffairs.gov.au/about-us/our-portfolios/natio...
[1]: http://classic.austlii.edu.au/au/legis/cth/consol_act/ta1997...
[+] [-] diebeforei485|4 years ago|reply
Also the whole "Apple is planning to encrypt iCloud Photos end-to-end anyway" thing is just fanfiction. I'll believe it when they announce it.
[+] [-] yyyk|4 years ago|reply
As we know, there are people in the world running meth labs or making explosives for terrorists in their homes. In order to safeguard the public, we shall have a detachment of dogs that sniff everyone's houses every once in a while. When they sense something bad, they'll alert their handlers, and there'll be a manual inspection before reporting to the police.
There's no risk to privacy here: dogs, being dogs, can't tell their handlers what they sense. We can also show the training publicly, so people can verify the iDogs are trained to sense only drugs or explosives. So it's all even more secure than Apple's iPhone scanning! What say you?
[+] [-] helen___keller|4 years ago|reply
The constitutional standard for a warrant search is "probable cause", and for a warrantless search you generally also need exigent circumstances. Assuming that a judge is sufficiently satisfied with the iDog's nose, and the iDog was sniffing somewhere public like the sidewalk when it found the meth smell, you could likely establish both probable cause (iDog smells meth) and exigent circumstances (meth labs often blow up, meaning there's imminent danger that cannot wait for a warrant).
That's not to excuse Apple, just to provide a fun backstory on the things law enforcement gets to do in this country.
Another one that was nearly deemed constitutional: in Kyllo v. United States, LEOs used thermal imaging to find that an Oregon man's house was radiating an unusually high amount of heat, indicative of intense grow lights, which they used as probable cause to search the home for an illegal pot-growing operation. This was found unconstitutional only by a 5-4 decision in the Supreme Court. Had it been found constitutional, you can imagine we'd have helicopters flying overhead today, thermal-imaging for pot operations.
[+] [-] _carbyau_|4 years ago|reply
As it currently stands, this capability would sit in plain sight, waiting indefinitely for any lawmaker anywhere.
In your country, either side of the political spectrum - with a majority in lawmaking - can simply tap Apple on the shoulder and potentially turn ALL those devices against you.
Guns won't help when information technology is used against you: you are cut off from society in a way where people dare not risk their own livelihoods for fear of being similarly marked.
And if Apple goes ahead with this, this risk is sitting there for the rest of your life just waiting for [that one politician that represents everything you hate] to use it against you.
Maybe that politician hasn't been born yet. But they will come. Don't let this Pandora's Box sit waiting for them.
[+] [-] cryptoquick|4 years ago|reply
For the privacy-minded Apple users among us (I mean, that's who they marketed to, yeah?), I'd recommend turning off automatic software updates... For as long as it makes sense to. I hope they reverse their decision, but I'm already looking for alternatives. I'm certainly not buying another Apple device, even though I'm about due.
They really lost a lot of fans with this, myself included.
[+] [-] bhawks|4 years ago|reply
Since we're doing this on-device, we can just turn the camera on every few minutes and ask an AI if it sees something interesting.
If it sees that you're in trouble it can start streaming to the authorities.
We will finally be safe.
By the way, if your phone is off or left at home we will know you're in trouble and send assistance right away.
I wish I could say /s.
[+] [-] gkoberger|4 years ago|reply
This isn't a government agency. Apple has been incredibly thoughtful about privacy in the past, and I feel like they've earned the benefit of the doubt here.
I hope I'm not wrong, but I don't see how this is insane. They're just making sure the files you upload to them aren't illegal.
[+] [-] musesum|4 years ago|reply
[Edit]
Here is/was the privacy statement https://www.deepmuse.com/privacy - I'm kinda embarrassed.
[+] [-] Animats|4 years ago|reply
[1] https://archive.org/details/ThePrisoner01Arrival
[+] [-] endisneigh|4 years ago|reply
Fundamentally this illustrates that software has become too inherently intrusive. What's a solution, though, that could ever be mainstream?
The other issue is that software has become too complicated and too many (potentially) bad things are happening in the background. How can the layperson fight back?
[+] [-] mrits|4 years ago|reply
I do love my MacBooks though. So while these privacy invasions make me angry, I'm not willing to drop Apple altogether. I've been meaning to keep most of my sensitive information on a usually-disconnected Linux machine anyway. I'll keep using my MacBook for development.
[+] [-] nbzso|4 years ago|reply
For me, my Apple addiction ends here. There is no "magic" left in their products, only "bait & switch" dark patterns.
No hardware or UX will lure me again to suppress my instincts.
This is the beginning of a global, politically and financially motivated race for public control; Apple is just giving the fire its spark. Imagine a future in which Face ID is tied to everything, and your beloved iDevices, Teslas, and home appliances are scanning and reporting, scanning and reporting. There is no middle ground in this for me. No benefit or convenience is that important. FOSS and public oversight of software must be demanded by law.
I posted this earlier without getting any reactions: https://docplayer.net/1287799-Fourth-amendment-search-and-th...
[+] [-] whatgoodisaroad|4 years ago|reply
What am I missing? Isn't that a much, much worse thing for Apple to do? Why are we only now suddenly suspicious of Apple's privacy claims?
[+] [-] nicce|4 years ago|reply
A very overdramatic sentence. It is a bit scary to realize that people think this line is only now being crossed; it was crossed a long time ago. In Android's first years, the owner was the product, not the phone. Privacy features in recent years may have improved this a little.
Google's massive success across many services is built on how phones and their software collected data for it. User interfaces are just illusions for non-technical people; they might give you a sense of control.
Now that Apple does not trust us regarding CSAM, the end is near. There are arguments on both sides, and many are taking sides just to get attention.
However, you can only solve this problem with politics.
[+] [-] 091234mnbvcxz|4 years ago|reply
Privacy advocates need to be like Second Amendment activists; we need to use their playbook. They raise a big stink about anything, no matter how big or small, that could curtail their rights. No number of Sandy Hook events will result in meaningful changes to gun laws.
Pushing everyone to Linux will eventually lead to all hardware falling under some national-security law: hardware may be imported only if it restricts which OSes can be installed, and bootloaders will be locked.
The free market has no impact here; the masses don't care. And privacy supporters are too logical to whip up any kind of movement.
Until privacy advocates come up with emotional reasons why privacy is absolutely necessary (like: grandma is gonna die without it), this is a losing battle.