top | item 45935648

least | 3 months ago

> Because you say that, we will lose what little figments of privacy and freedoms we have left.

I understand that you seem to think that adding systems like this will placate governments around the world but that is not the case. We have already conceded far more than we ever should have to government surveillance for a false sense of security.

> You can have a system that flags illicit content with some confidence level and have a human review that content. You can make sure any model or heuristic used is publicly logged and audited. You can anonymously flag that content to reviewers, and when it is deemed actually illicit by a human, the hash or some other signature of the content can be published globally to reveal the devices and owners of those devices. You can presume innocence (such as a parent taking a pic of their kids bathing) and question suspects discreetly without an arrest. You can require cops to build multiple sufficient points of independently corroborated evidence before arresting people.

What about this is privacy preserving?

> However, your response of "Yes." is materially false; lawmakers will catch on to that and discredit anything the privacy community has been advocating. Even simple heuristics that aren't using ML models can have a higher "true positive" rate of identifying criminal activity than eyewitness testimony, which is used to convict people of serious crimes. And I suspect you meant security, not privacy. Because as I mentioned, for privacy, humans can review before a decision is made to search for the confirmed content across devices.

It's not "materially false." Bringing a human into the picture doesn't do anything to preserve privacy. If, like in your example, a parent's family photos with their children flag the system, you have already violated the person's privacy without just cause, regardless of whether the people reviewing it can identify the person or not.

You cannot have a system that is scanning everyone's stuff indiscriminately and have it not be a violation of privacy. There is a reason why there is a process where law enforcement must get permission from the courts to search and/or surveil suspects - it is supposed to be a protection against abuse.

notepad0x90 | 3 months ago

> I understand that you seem to think that adding systems like this will placate governments around the world but that is not the case. We have already conceded far more than we ever should have to government surveillance for a false sense of security.

You have an ideological approach instead of a practical one. It isn't governments that are demanding it. I am demanding it of our government, I and the majority. I don't want freedoms paid for by such intolerable and abhorrent levels of ongoing injustice. It isn't a false sense of security, for the victims it is very real. Most criminals are not sophisticated. Crime prevention is always about making it difficult to do crime, not waving a magic wand and making crime go away. I'm not saying let's give up freedoms, but if your stance is there is no other way, then freedoms have to go away. But my stance is that the technology is there; it's just slippery slope fallacy thinking that's preventing it from getting implemented.

> What about this is privacy preserving?

Persons aren't identified before a human reviews and confirms that the material is illicit.

You have to identify yourself to the government to drive and place a license plate connected to you at all times on your car. You have to ID yourself in most countries to get a mobile phone SIM card or open a bank account. Dragnet surveillance is what I agree is unacceptable except as a last resort; it isn't dragnet if algorithms flag it first, and it isn't privacy-invading if false hits are never associated with individuals.

> you have already violated the person's privacy without just cause, regardless of whether the people reviewing it can identify the person or not.

There is just cause: the material was flagged as illicit. In legal terms, it is called probable cause. If a cop hears what sounds like a gunshot in your home, he doesn't need a warrant; he can break in immediately and investigate because it counts as exigent circumstances. The algorithms flagging content are the gunshots in this case. You could be naked in your house and it would be a violation of privacy, but acceptable by law. If you said that after review they should get a warrant from a judge, I'm all for it.

It is materially false, because the scanning can be done without sending a single byte off the device. The privacy intrusion happens not at the time of scanning, but at the time of verification. To continue my example, the cop could have heard you playing with firecrackers, you didn't do anything wrong but your door is now broken and you were probably naked too, which means privacy violated. This is acceptable by society already.
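The on-device flow I'm describing can be sketched in a few lines. This is a toy illustration with made-up signatures: real deployments (like Apple's) used perceptual hashing plus private set intersection rather than a plain cryptographic hash, so every name and value below is hypothetical.

```python
import hashlib

# Hypothetical set of signatures of known illicit content, shipped to the
# device. (Real systems use perceptual hashes so near-duplicates still
# match; SHA-256 is used here only to keep the sketch self-contained.)
KNOWN_BAD_SIGNATURES = {
    hashlib.sha256(b"example-known-bad-bytes").hexdigest(),
}

def scan_on_device(file_bytes: bytes) -> bool:
    """Return True if the file matches a known signature.

    Nothing leaves the device in this step; only the boolean flag
    would later be reported for anonymous human review.
    """
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_BAD_SIGNATURES

# Scanning a library locally: only the match result exists as output.
results = [scan_on_device(f) for f in (b"vacation-photo",
                                       b"example-known-bad-bytes")]
```

The point being illustrated: the comparison happens locally against shipped signatures, and only the flag, never the content, would be surfaced for review.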

The false positive rates for cops seeing/hearing things and for eyewitness testimony are very high, in case you're not aware. By comparison, Apple's CSAM scanner's was very low.

> There is a reason why there is a process where law enforcement must get permission from the courts to search and/or surveil suspects

As stated above, so long as the scanning is happening strictly on-device, you're not being surveilled. When there is a hit, humans can review the probable cause, and a judge can issue a warrant for your arrest or a search warrant to access your device.

Another solution might be to scan only at transmission time of the content, not at capture and storage (still not good enough, but this is the sort of conversation we need, not the plugging of ears).

Let's take a step back. Another solution might be to restrict all content publishing on the internet to people who positively identify themselves.

least | 3 months ago

> You have an ideological approach instead of a practical one.

It's both. We can save a whole lot of time and money not wasting resources on security theater and reallocate it towards efforts that actually make society better and safer.

> It isn't governments that are demanding it. I am demanding it of our government, I and the majority.

> I don't want freedoms paid for by such intolerable and abhorrent levels of ongoing injustice. It isn't a false sense of security, for the victims it is very real.

No, it still is a very false sense of security. Intercepting illicit material online doesn't actually stop the crime from being committed nor does it dissuade people from distributing it.

> Most criminals are not sophisticated. Crime prevention is always about making it difficult to do crime, not waving a magic wand and making crime go away.

Sure, but the 'criminals' that are distributing illicit material online are already going to lengths, sometimes very technical, to distribute it anonymously.

> I'm not saying let's give up freedoms, but if your stance is there is no other way, then freedoms have to go away.

You are saying let's give up freedoms. Let's drop any notion that you care about freedom, because you do not. I'm not saying that it's an invalid worldview; your reasoning for wanting to eradicate those freedoms is rational and well-intentioned, but you are begging for authoritarianism nonetheless.

I don't think there's any sort of agreement to be had here. Fundamentally I cannot agree with the notion that everyone must concede their personal liberties and privacy in order to capture a few more stupid criminals.

> But my stance is that the technology is there; it's just slippery slope fallacy thinking that's preventing it from getting implemented.

No, it actually is just a slippery slope. There is no fallaciousness in the logic here, because we've already witnessed the erosion of our rights for this purpose over and over again, and they continue to push for even more degradation of those rights.

> Persons aren't identified before a human reviews and confirms that the material is illicit.

This is already a violation of privacy. Share all of your personal photos with hacker news if you disagree. We don't know who you are, after all, so it's not a violation of your privacy, right?

> There is just cause: the material was flagged as illicit. In legal terms, it is called probable cause. If a cop hears what sounds like a gunshot in your home, he doesn't need a warrant; he can break in immediately and investigate because it counts as exigent circumstances. The algorithms flagging content are the gunshots in this case. You could be naked in your house and it would be a violation of privacy, but acceptable by law. If you said that after review they should get a warrant from a judge, I'm all for it.

In legal terms, probable cause is what you need to make an arrest or before obtaining a search warrant. The "gunshot" exception isn't probable cause. It's an emergency exception that allows for a warrantless search because there is an independent, externally observable signal of imminent harm i.e. an emergency situation.

The algorithms are not the 'gunshot' here. It is not searching in response to some sort of external signal like a gunshot or hearing someone screaming or even seeing someone getting attacked. It is the search itself - it only produces a flag marking someone as suspicious because it has already examined someone's private files. The "probable cause" was produced by conducting the search. That is backwards.

It is equivalent, in your analogy, to a cop opening every front door in the neighborhood to look inside and then saying they now have probable cause because they saw something suspicious. The search already happened.

> It is materially false, because the scanning can be done without sending a single byte off the device. The privacy intrusion happens not at the time of scanning, but at the time of verification.

You do not need to transmit information for it to be a violation of privacy. If a cop opens your filing cabinet, looks through your folders, and leaves everything exactly where he found it, he's still already intruded by examining your private material.

The suspicion of criminal activity must precede the search. Simply possessing digital files isn't a basis for individual suspicion - you are treating everyone as a suspect that deserves no protection.

> To continue my example, the cop could have heard you playing with firecrackers, you didn't do anything wrong but your door is now broken and you were probably naked too, which means privacy violated. This is acceptable by society already.

Society accepts warrantless entry only when there is an actual emergency. The reason a gunshot or firecrackers can justify it is that they are external signals - they do not require the police officer to enter the home in order to detect them.

Society does not accept random entries just to look for problems.

And just to get ahead of it, a machine performing the search doesn’t change anything. A search is defined by what’s being examined, not who (or what) is doing the examining. If the government sent a robot into your home that didn’t know your name and only alerted authorities if it found something illegal, it would still be a search. The fact that it’s automated doesn’t make it any less of an intrusion.