
Bugs in our pockets: the risks of client-side scanning

204 points | azalemeth | 4 years ago | arxiv.org | reply

131 comments

[+] amatecha|4 years ago|reply
Completely agree with the final sentences in their conclusion/recommendations:

"In a world where our personal information lies in bits carried on powerful communication and storage devices in our pockets, both technology and laws must be designed to protect our privacy and security, not intrude upon it. Robust protection requires technology and law to complement each other. Client-side scanning would gravely undermine this, making us all less safe and less secure."

[+] Fnoord|4 years ago|reply
Does this apply to something like AV, too? I grew up in a world where AV and anti-malware only worked offline / client-side. What about spam filters and AV on mail servers? I often hear a commercial for Crowdstrike on the Darknet Diaries podcast, which is apparently some kind of combination of a SIEM and ML (though that might be marketing). Would that be the panacea according to this paper? Or is this specifically about a client-server model where the server hosts the data? Because the solution to that is rather simple: 1) a FOSS client, 2) public-key cryptography where the private key never enters the server. Sure, a cloud provider might disallow that, but that's why the cloud is just someone else's server. Use your own instead.
[+] EastOfTruth|4 years ago|reply
I just disabled Google Play Services on my Android phone to increase privacy... then I started getting spammed with about 10 notifications every 10 seconds (not a joke) telling me that those 10 apps would not work properly without Google Play Services enabled, even though they worked fine. Google and/or LG allowed me to disable 7 of these apps, but the others could not be uninstalled or disabled through the GUI, so I had to use ADB to remove them. One of those apps, believe it or not, was the LG phone's clock, and another was the calculator.

After I removed all the apps that were complaining about missing Google Play Services and installed alternatives for the ones I needed, like the calculator, everything worked fine. (Thanks to F-Droid for helping me find viable alternatives.)
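For anyone wanting to do the same: system apps that can't be disabled in the GUI can usually be removed for the current user over ADB. The package name below is illustrative, not from my device; list your own packages first and check what each one is before removing it.

```shell
# List installed packages to find the offenders (requires USB debugging enabled)
adb shell pm list packages | grep -i lg

# Remove a package for user 0 only; -k keeps its data in case you revert.
# com.lge.clock is a hypothetical package name -- check your own device.
adb shell pm uninstall -k --user 0 com.lge.clock

# Or just disable it instead of uninstalling:
adb shell pm disable-user --user 0 com.lge.clock

# Undo later if something breaks:
adb shell cmd package install-existing com.lge.clock
```

Note this only removes the app for the current user; the APK stays in the system partition, which is why it can be restored without reflashing.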

[+] krageon|4 years ago|reply
Don't disable Google Play Services; run something like microG instead (the easiest way is probably https://lineage.microg.org/ ). Apps are written to expect you to be spied upon, and to keep them from pushing you to "fix" that, you must lie to them. Moving entirely to F-Droid is not an option for most people, as they use their phones to communicate, and doing so involves a certain amount of dirtying yourself.
[+] mdp2021|4 years ago|reply
Which Android version? Some of us have no Google Play Services and no relevant notifications on older devices.
[+] dsign|4 years ago|reply
Can we coin the word spy-tech and go on with our day?

I read a post here about the time when radioactive toys were popular, back in the early 1900s. Radioactivity fell so far out of favor that these days we prefer to die of global warming rather than use nuclear power.

In a similar vein, a world where most people go back to pen and paper, and to watching public TV on bootlegged devices, so that they can (try to) escape continuous surveillance by electronic devices is conceivable. Give it enough time and grisly precedents, and companies like Apple and Samsung will only be able to sell new devices to corporate customers.

[+] snvzz|4 years ago|reply
Quite the roster of names behind the article.
[+] znyboy|4 years ago|reply
The first name that stood out to me was Ron Rivest, of RSA, MD5, and public-key cryptography fame.
[+] new_realist|4 years ago|reply
Eventually we’ll see cryptographic attestation of open source binaries on our phones. Until then, all popular phones will run closed source software, and it will be necessary to trust the vendor. Even then, the vendor may also be the chip maker. Apple would do well to look at sourcing an independent chip vendor for their on-device enclaves. That would give them a trust advantage over Android phones.
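At its core, the verification step such attestation would rely on is just recomputing a digest of the installed binary and comparing it to a digest published from a reproducible open-source build. A toy sketch (in a real system the published digest would be signed and the check would run inside trusted hardware; the binary bytes here are made up):

```python
import hashlib
import hmac

def digest(binary: bytes) -> str:
    """Digest of a binary, as a reproducible build would publish it."""
    return hashlib.sha256(binary).hexdigest()

def attest(installed: bytes, published_digest: str) -> bool:
    """Compare the installed binary's digest to the published one.

    hmac.compare_digest avoids timing side channels on the comparison."""
    return hmac.compare_digest(digest(installed), published_digest)

build_output = b"example open source build"   # hypothetical binary contents
published = digest(build_output)              # from the audited, reproducible build

print(attest(build_output, published))            # True: binary matches the build
print(attest(b"vendor modified build", published))  # False: tampered binary
```

The hard parts attestation research actually deals with, reproducible builds, a trusted root for the published digests, and hardware that can't lie about what it measured, are exactly what this sketch leaves out.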
[+] mulmen|4 years ago|reply
http://users.ece.cmu.edu/~ganger/712.fall02/papers/p761-thom...

I don't believe this is a technology problem. This is a legislative problem. We (well, some of us) have the Bill of Rights and the justice system for a reason. We need to codify who owns our data (we do) and what third parties can do with it (nothing unless we say they can, individually).

LEO can't do a damn thing unless they have probable cause. This dragnet BS needs to stop.

We are absolutely empowered to change the rules to suit us.

[+] fastball|4 years ago|reply
> Eventually

This is far from a given.

[+] mschuster91|4 years ago|reply
> Apple would do well to look at sourcing an independent chip vendor for their on-device enclaves. That would give them a trust advantage over Android phones.

They already design all of the critical chips (SoC and crypto/enclave) themselves; TSMC only manufactures them. And if there is one company capable of verifying that the chips TSMC produces actually match the designs, it's Apple.

[+] tremon|4 years ago|reply
What makes you think cryptographic attestation will favor open-source binaries? Don't you think it's much more likely that cryptographic attestation will tilt the field even further in favor of closed-source binaries, given who holds the keys to everyone's kingdom?
[+] DeathArrow|4 years ago|reply
I believe law enforcement and spy agencies might be complaining about cryptography in bad faith. Doing so makes targets rely more on cryptography instead of also using alternative security measures. See: https://en.wikipedia.org/wiki/Dual_EC_DRBG
[+] martincmartin|4 years ago|reply
I wonder if the title refers to the song "Drugs in my pocket" by The Monks, 1979. This song was actually only popular in a few cities in the world, I think Toronto was a big one. I wonder if the author grew up there...
[+] lunatuna|4 years ago|reply
I'm in the libertarian lion's den . . . and I only read the abstract.

What I see from a historical standpoint (pre-cloud, pre-mobile-phone/computer, pre-personal-encryption, etc.) is that anything stored, be it something on paper, something in your house, or a safety deposit box, was available to law enforcement, with controls via the courts or another mechanism. It was available when there was a legal matter. Is there disagreement that legal matters should allow for full disclosure, whether criminal or civil?

Or is the problem that law enforcement and other state institutions, through legislative channels and the courts, are getting just too much access without legal justification?

Notional idea: could you have a key vault, where only with a court order are the keys released and your devices opened up? Even if it's not implementable, would that work for most people?
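Mechanically, the simplest version of such a vault is a 2-of-2 secret split: neither the vault's share nor the court-released share alone reveals anything about the key. A toy sketch (this illustrates the arithmetic only; all the hard problems are in who holds the shares and who can compel release):

```python
import secrets

def split_key(key: bytes) -> tuple[bytes, bytes]:
    """Split a device key into two shares; both are needed to recover it."""
    share_a = secrets.token_bytes(len(key))               # held by, say, the vault
    share_b = bytes(k ^ a for k, a in zip(key, share_a))  # released on court order
    return share_a, share_b

def recover_key(share_a: bytes, share_b: bytes) -> bytes:
    """XOR the shares back together to recover the original key."""
    return bytes(a ^ b for a, b in zip(share_a, share_b))

device_key = secrets.token_bytes(32)
vault_share, court_share = split_key(device_key)

assert recover_key(vault_share, court_share) == device_key
assert vault_share != device_key   # one share alone is indistinguishable from noise
```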

I do get the government mass surveillance aspect and think that needs way more scrutiny and people should be vicious in their defense of themselves and society. But it also smells like we lost that battle as private companies are doing pretty well at surveilling individuals and communities.

[+] michaelmrose|4 years ago|reply
Let's rewind to look at your historical perspective properly this time.

In theory, yes, a court could get access to your papers. But for the average person, person-to-person interaction produced no permanent record, and the limited record created by correspondence wasn't nearly as permanent, because keeping it required one to take the time to file it and to manage very finite space. Nothing anything like your present digital life would ever have come to exist to be requested.

As important is the cost of such a request. Compare the risk and expense of physically spying on a large population, or of tapping their phones and paying someone to listen to a million boring conversations, to the cost of bugging everyone's communications at once with a box installed for that purpose.

Consider how feasible it would have been to obtain that much privacy-invading, mostly useless intelligence: historically, it wouldn't have been hard, it would have been impossible. They could have flown to Mars more easily than done that much physical surveillance, unless every third citizen was spying on the other two.

Historically, I suspect such requests were rare outside of settling business matters, where receipts and contracts would be common, or wills and trusts.

If you want to justify some sort of key escrow system you shall not find justification in returning to some hypothetical state of nature that never was.

Neither mass use of unbreakable encryption nor mass surveillance enabled by court order, nor your suggestion bear any resemblance to historical reality.

Furthermore, if we can't trust the government not to break its own laws, what would keep it from simply ordering your keys confiscated and ordering whoever holds them not to tell you?

[+] mcoliver|4 years ago|reply
The difference is that our computing devices have become an extension of our very being. Our thoughts, actions, and interactions are recorded at a granular level never before possible in history; we often don't know what is actually being recorded; and that data can be stored cheaply, transferred instantly, replicated, and searched for mere pennies.

To expand on your comment, I can go in my file cabinet and flip through papers I have chosen to keep with data that I chose to write down, and then drop it in the trash can. It's physical and tangible and it doesn't have a record of who I called, where I went for breakfast, or what I ordered.

Our founding fathers were very clear on their intent with regards to how they viewed governments imposing on the private lives of citizens.

[+] krageon|4 years ago|reply
> Is there disagreement that legal matters should allow for full disclosure whether criminal or civil?

Yes, I don't agree with this.

> Is the problem [...] too much access without legal justification?

That is also a problem, which isn't new (it existed in the time you described in the opening of your post as well).

> Only with court order the keys are released and your devices get opened up?

This doesn't work (and we know it doesn't; it has already failed many times because of systemic issues with the idea), so as a hypothetical it only serves to distract from the fundamental truth that it cannot be a solution.

> I do get the government mass surveillance aspect [...] But [...] we lost that battle

With that mindset, you have indeed lost :)

[+] amarshall|4 years ago|reply
I would say your historical reference is flawed. One could just as well have written an encrypted text on a sheet of paper that would be non-trivial to decrypt (e.g., Z-340). The difference, I think, is that it used to be far more effort to create the encrypted text than it is today.
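For a sense of that effort gap: even a classical cipher far weaker than Z-340, like a Vigenère, takes real work by hand, while a few lines of code do it instantly. A toy sketch:

```python
from itertools import cycle

ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def vigenere(text: str, key: str, decrypt: bool = False) -> str:
    """Classic Vigenere cipher over A-Z; other characters pass through
    unchanged and do not consume key letters."""
    sign = -1 if decrypt else 1
    keystream = cycle(key.upper())
    out = []
    for ch in text.upper():
        if ch in ALPHABET:
            shift = ALPHABET.index(next(keystream))
            out.append(ALPHABET[(ALPHABET.index(ch) + sign * shift) % 26])
        else:
            out.append(ch)
    return "".join(out)

ciphertext = vigenere("ATTACK AT DAWN", "LEMON")
print(ciphertext)                                   # LXFOPV EF RNHR
print(vigenere(ciphertext, "LEMON", decrypt=True))  # ATTACK AT DAWN
```

Doing the same by hand means looking up a tableau letter by letter, which is exactly why mass encryption of everyday records was never a historical reality.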
[+] beebeepka|4 years ago|reply
Wait till they start putting noses on those things. In fact, I am surprised no one has done it. It will be super convenient... for the people collecting said data
[+] mro_name|4 years ago|reply
IMO that boils down to "who owns what you use". And "there is no free lunch".

Applies both to services and devices.

[+] mulmen|4 years ago|reply
I paid Apple for my iPhone and for storage. I see no reason they should go out of their way to spy on me.
[+] krageon|4 years ago|reply
The free lunch argument is not satisfactory because no matter the price of what you buy, the lunch is never paid for.
[+] flerchin|4 years ago|reply
It's not their device to scan.
[+] aidenn0|4 years ago|reply
While I don't like client-side scanning, that's overly reductive.

"Client-side scanning" (both in general and in the recent Apple kerfuffle) is talking about a network client that will be talking to servers owned by "them." If they wish to enforce rules over what is stored on their servers, the only two choices are to disallow E2EE or to perform client-side scanning.

Really, client-side scanning is only up for debate when E2EE is used. The JavaScript that checks the validity of forms before you submit them is a form of client-side scanning, but most of the time[1] nobody cares, because it's data that you intend to send to the server anyway.

1: Inadvertent pastes into fields that phone home for e.g. autocomplete can reveal otherwise private information, hence "most of the time".
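Structurally, the contested kind of client-side scanning amounts to something like this toy exact-hash version (real proposals, including Apple's, used perceptual hashes, threshold secret sharing, and server-side verification, all of which this deliberately omits; the blocklist entry is made up):

```python
import hashlib

# Hypothetical blocklist of digests the provider pushes to the device.
BLOCKLIST = {hashlib.sha256(b"known-bad-content").hexdigest()}

def scan_before_encrypt(plaintext: bytes) -> bool:
    """Runs on the client, on the plaintext, *before* E2EE is applied.

    This placement is the whole debate: the server never sees the data,
    but the client reports on it anyway."""
    return hashlib.sha256(plaintext).hexdigest() in BLOCKLIST

print(scan_before_encrypt(b"known-bad-content"))  # True: flagged before upload
print(scan_before_encrypt(b"holiday photos"))     # False: uploaded normally
```

The form-validation analogy breaks exactly here: validation checks data you meant to send in the clear, while this checks data you believed only you (and your recipient) could ever read.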

[+] jt_thurs_82|4 years ago|reply
According to the TOS and their enforced end to end control of binaries and user actions, it is. Oops.
[+] new_realist|4 years ago|reply
It’s their cloud service. If you want to upload to iCloud, you must agree to use their client, and their client implements CSS. If you don’t want to use their client, don’t use their service.
[+] konfin|4 years ago|reply
Given how the average person, and even the majority of people in tech, have been acting over the last 6 years, I'm at the point where I don't care. I can protect myself; everyone else is their own responsibility.

The more privacy we remove via tech, the less we lose via law, which I now think is the much worse outcome.

[+] ls15|4 years ago|reply
> I can protect myself, everyone else is their own responsibility.

How does that work if everyone expects you to communicate with them via WhatsApp and their Gmail, or, even if you don't, they will happily back up all communication with you in the cloud?

[+] kf6nux|4 years ago|reply
> The more we remove privacy by tech the less we lose it by law

On its face, that seems like a false dichotomy. Can you expand?

Generally, I see the erosion of our right against unreasonable search and seizure to be something that hurts everyone (regardless of an individual's ability to make fewer searchable spaces).

[+] bradknowles|4 years ago|reply
So long as you are secure in your own systems, you don’t care if they come for the Jews? What about when they come for the gays?

How long until you do care? Will there be anyone left to care when they come for you?

[+] new_realist|4 years ago|reply
“It’s not perfect yet, so let’s not do it” is not much of an argument, especially from cryptography wonks with no experience in public policy or law enforcement.
[+] zdw|4 years ago|reply
Schneier at least has substantial experience and analysis on public policy - his "Beyond Fear" book published over a decade ago is one example.
[+] otoh|4 years ago|reply
On the other hand, CSS might -- eventually, anyway -- offer the best compromise for facilitating reliable, responsible lawful access to mass consumer information technology.

Develop CSS in a manner that minimizes the noted risks. Such mechanisms are a fundamental compromise, philosophically. I am skeptical that those on opposing ends of the privacy debate will find sufficient common ground to achieve responsible implementations.

Deeper concerns about the misprioritization of security in consumer infotech design leave no meaningful basis for realizing a suitable compromise on CSS tech, anyway.

[+] akersten|4 years ago|reply
> Develop CSS

No, do not. There is no reasonable privacy preserving manner in which you can do so. Spyware is fundamentally incompatible with privacy. I don't care how many god damn whitepapers they write about their novel perceptual hash cohort-based homomorphic 0-trust TPM scanner. It's still a rat.

[+] AbrahamParangi|4 years ago|reply
Looking forward to the day that people publicly argue for client-side scanning on NeuralinkOS for thought-crime.