Some tips to maximise user privacy while deploying this tool:
1) The code, for now, runs locally. This is good. To avoid the possibility of the code being tampered with at a later date (for example, it could be modified to send copies of the image to a server), download the webpage and use the saved copy, not the live copy.
2) Do not use the blur functionality. For maximum privacy, this should be removed from the app entirely. There are _a lot_ of forensic methods to reverse blur techniques.
3) Be wary of other things in the photograph that might identify someone: reflections, shadows, and so on.
4) Really a subset of 2 and 3, but be aware that blocking out faces is often not sufficient to anonymise the subject of the photo. Identifying marks like tattoos, or even something as basic as the shoes they are wearing, can be used to identify the target.
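Following tip 2, an opaque box is preferable to blurring because it destroys the pixel data rather than transforming it. A minimal sketch, assuming the image is already loaded as a NumPy array (the region coordinates are made up for illustration):

```python
import numpy as np

def redact(img: np.ndarray, x: int, y: int, w: int, h: int) -> np.ndarray:
    """Overwrite a region with solid black. Unlike blurring, the
    original pixel values are destroyed, not merely transformed."""
    out = img.copy()
    out[y:y + h, x:x + w] = 0  # zero out every channel in the region
    return out

# Example: redact a 40x40 region of a random stand-in "photo"
photo = np.random.randint(0, 256, (100, 100, 3), dtype=np.uint8)
clean = redact(photo, x=30, y=20, w=40, h=40)
```

Since every pixel in the box is set to the same constant, no deconvolution can recover what was there — though per tips 3 and 4, the rest of the frame may still identify the subject.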
I recently found myself in a position where I had to blur a ton of faces across multiple pictures (about 100/day).
It's really tedious to do manually, and this is where something like OpenCV shines.
We found a repo [1] with Python code that automatically detects and blurs faces. This script was one of many, but it had very high accuracy: over 90%.
I’m reminded of a Reddit thread a while back about the US government paying a large sum to create an “unblur” function for Photoshop. Someone in the comments was able to rotate and flip a photo and use the Photoshop blur tool to effectively undo a blur for free.
Perhaps it’s better to remove the section of the photo with a person’s face instead? Or draw a shape over their face and flatten the image? It seems to me that as long as the pixels are there, the identifying data is there for anyone willing to spend the time and effort to find it.
There is also the PixLab API, which lets you automatically apply a blur filter to each detected face, or to any other target regions you want, using only two REST API endpoints.
For removing metadata, exiftool is handy [0].
$ exiftool -all= foo.jpg
Even better: save the image first as .bmp or another format that doesn’t support metadata, then reload it, convert it to JPEG, and run exiftool on that image.
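The round-trip above can be sketched with Pillow (filenames are illustrative; note the JPEG re-encode is lossy, and running `exiftool -all=` afterwards as shown above is still worthwhile):

```python
import os
import tempfile
from PIL import Image

def strip_via_bmp(src: str, dst: str) -> None:
    """Strip metadata by round-tripping through BMP, a format with no
    metadata container, then re-encoding as JPEG."""
    with tempfile.TemporaryDirectory() as tmp:
        bmp = os.path.join(tmp, "intermediate.bmp")
        Image.open(src).convert("RGB").save(bmp)  # EXIF etc. dropped here
        Image.open(bmp).save(dst, "JPEG")
```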
While I can't make useful comments on protests or strong anonymity, wrt photo metadata, I can say that I scrub metadata from photos that leave my possession, as a matter of course, using 'exiftool'.
Here is how you read the existing metadata:
exiftool -a -u -g1 IMG_0708.JPG | more
... and here is how you scrub it:
exiftool -all= IMG_0708.JPG
(you could read it again, after scrubbing, to demonstrate it is gone ...)
ImageMagick's `convert` also supports stripping such metadata, the flag is -strip.
I nearly always scale and compress photos that leave my possession, usually with convert, so adding -strip is a nice streamlined way of doing it all at once.
The protests were sparked by the lack of accountability of the police resulting in police brutality. The violent people among the protesters are subject to the same incentives. The more they expect to be held accountable, the more likely they will refrain from violence.
Anonymizing photos of the violent ones is therefore likely to support their actions by making accountability less likely. To scrub ethically, limit it to the non-violent protestors. To support non-violence, better to help identify the violent people -- police or civilian -- the opposite of anonymizing them.
Given that this is a protest about cops getting away with brutality even when there's clear evidence I think "gather evidence against both sides equally" is unlikely to be convincing argument to protesters.
To add though: it's not easy to tell who are the non-violent and who are the violent protestors. Sometimes violent protestors hang out in the crowd and only strike at opportune moments. If they're blurred out like everyone else during the times they are not violent then it becomes harder to hold them accountable.
> The more they expect to be held accountable, the more likely they will refrain from violence.
The thing is, people are already being held accountable for their skin tone. The likelihood of changing your behavior for fear of being identified at a protest, when you have lived your entire life in an environment of constant oppression, is marginal, especially during catharsis; otherwise you wouldn't see, for instance, people burning police cars in front of a camera.
Keep in mind also that many (most?) of these "violent protestors" are simply reacting against violent cops on a power trip. I can't say I wouldn't react violently against a cop intentionally running me and others over with their SUV, but I can say that I would be thankful if my face was anonymized no matter how I reacted.
When the US inevitably does turn into an authoritarian dictatorship, I think people would be happy if there weren't copious amounts of proof that they were on the streets protesting.
I should add: In all of the streams and pictures I've seen, all (most?) the looters or violent people were wearing masks, ensuring their anonymity. If anybody's being protected by measures like this, it's your average peaceful protestor.
I agree, and I'd go a step further and say that if you destroy evidence of someone burning my city down or looting, you're an accomplice to that crime.
"Crawford was found shot to death Thursday night in his car, just like activist Darren Seals in 2016 and protester DeAndre Joshua the night of the Ferguson verdict in 2014. The latter two had gunshot wounds to the head and their cars were lit on fire. Crawford, it is believed by police, shot himself in the back seat of his car either in an attempted suicide or by accident."
I agree entirely. Can we not blur out the faces of people who are looting (stealing), destroying property, defacing city and national sites and violently attacking people and store owners? This is super fucked.
Many organizers of protests in Ferguson, peaceful or otherwise, have since been found murdered in ways that suggest they were literally hunted down and killed for their involvement. Multiple have been found shot through the head in cars set on fire to destroy all evidence. Even if they broke the law, that does not merit being executed in the street. (https://www.nbcnews.com/news/us-news/puzzling-number-men-tie...)
In a situation where police feel justified to kill extra-judicially over a possibly fake 20 dollar bill, what hope do we have that protesters won't be targeted in unfair ways? Or worse, that organizers won't be hunted down like animals and murdered, as in Ferguson? It would be unethical not to do everything in your power to protect those in this position.
Secondly, how do you plan to identify violent vs. non-violent protesters from a static image? How would you find their identities afterwards? There is overwhelming evidence suggesting these methods are at best ineffective and at worst racist, and in either case will lead to innocent people being charged.
The more that protestors refrain from violent property destruction, the less likely it becomes that the three cops who were accessories in the murder of George Floyd get charges brought against them. Burning down one precinct got one cop charged (albeit with a 3rd degree); I would really love to see the other three charged, even if it requires some anonymous protestors to light up three more precincts. I'd gladly trade in police destroying citizen lives for citizens destroying police property.
Really weird that nobody in the thread is pointing out that this is basically a website that says "give me your photos, specifically from protests, which have details that you want to keep private".
It doesn't matter that it theoretically all happens in the browser. You can serve different versions to different IPs, etc. Every heuristic in me would be screaming "don't use that" if I had a need for such a tool.
The timing of the release of this tool seems a bit inappropriate, given the state of rioting in a few US cities now. It's going to be incredibly draining on law enforcement in the US for a few years to identify and prosecute criminals involved in riots. Most victims who have lost their homes, their businesses, and even their loved ones will most likely never see the criminals brought to justice, given the scale of the violence.
It could be useful to protect people from retaliation under an authoritarian government, such as in Hong Kong. I dislike the idea of a government using mass automatic identification, which could be used by authoritarians for terrible ends. I also dislike the opposite: using automatic anonymizing to protect criminals during riots. We're probably going to keep seeing an arms race here, with good and bad actors on all sides.
> “Like snowflakes, no two smartphones are the same. Each device, regardless of the manufacturer or make, can be identified through a pattern of microscopic imaging flaws that are present in every picture they take,” says Kui Ren, lead author of a new study describing the smartphone-identifying technology. “It’s kind of like matching bullets to a gun, only we’re matching photos to a smartphone camera.”
The protests are being live-streamed on Facebook, Twitch, YouTube, etc. So while this is interesting, it is ultimately useless. The data is already out there.
> How resilient is blurring against deconvolution?
This depends a lot on the implementation details. If you blur an image using arbitrary-precision real numbers, then blurring is invertible. If you add a bit of random noise, or quantize your pixels into a finite-precision data type, then it becomes essentially one-way, and you cannot recover the original image.
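A small NumPy demonstration of this point. The 3-tap kernel is chosen so its Fourier transform has no zeros, making exact deconvolution well-defined; everything here is illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.random((32, 32))  # stand-in image with values in [0, 1)

# Separable circular blur kernel whose per-axis DFT is
# 0.6 + 0.4*cos(2*pi*k/n) >= 0.2 > 0, so the blur is invertible.
n = 32
k1 = np.zeros(n)
k1[[0, 1, -1]] = [0.6, 0.2, 0.2]
K = np.outer(np.fft.fft(k1), np.fft.fft(k1))  # 2-D transfer function

blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * K))

# Deconvolving the full-precision result recovers the image exactly...
exact = np.real(np.fft.ifft2(np.fft.fft2(blurred) / K))

# ...but quantizing the blurred image to 8 bits first does not:
quantized = np.round(blurred * 255) / 255
approx = np.real(np.fft.ifft2(np.fft.fft2(quantized) / K))

err_exact = np.abs(exact - img).max()  # roundoff-level error
err_quant = np.abs(approx - img).max()  # quantization noise, amplified by 1/K
```

So in practice a blur baked into an 8-bit image is not cleanly invertible — but as other comments here note, an attacker doesn't need exact inversion if they can match the blurred region against a pool of candidates.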
Instead of blurring you should add a significant amount of extraneous information (random noise) and then mosaic (downsample).
If you’d like to have a smooth looking censored image you can then blur the mosaic result to have a smooth transition between the censored and original image.
If you simply blur or simply downsample there’s a significant ability to recover data or iterate over data to recover likely inputs. Other posts have discussed deconvolution, but think of a downsample as a hash - you can build a rainbow table of inputs, easily for numbers, with more difficulty for faces. If you have a limited pool of “suspects” this technique can work well. Just as with hashing, you should add a salt to the image before downsampling or blurring to make recovery of the original input more difficult. In this case the “salt” is random noise.
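A sketch of the noise-then-mosaic approach described above, assuming a grayscale float image in [0, 1]; the block size and noise level are illustrative:

```python
import numpy as np

def censor(img: np.ndarray, block: int = 16, noise: float = 0.5,
           seed: int = 0) -> np.ndarray:
    """'Salt' the image with random noise, then mosaic (downsample) by
    averaging each block x block tile, as suggested above."""
    rng = np.random.default_rng(seed)
    h, w = img.shape
    # 1) Add extraneous random noise -- the "salt" -- so the original
    #    pixels are not the only input to the downsample.
    salted = img + rng.normal(0.0, noise, img.shape)
    # 2) Mosaic: paint each tile with its own mean value.
    out = np.empty_like(img)
    for y in range(0, h, block):
        for x in range(0, w, block):
            out[y:y + block, x:x + block] = salted[y:y + block,
                                                   x:x + block].mean()
    return np.clip(out, 0.0, 1.0)

img = np.random.default_rng(1).random((64, 64))
censored = censor(img, block=16, noise=0.5, seed=2)
```

The optional final Gaussian blur over the mosaic (for a smooth transition into the uncensored area) is purely cosmetic and can be applied after this step.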
Timely article, but what about violent criminal activity during protests? Peaceful protests are wonderful and have been very effective throughout history. The protests we've seen for the past few days are not helping anything. Yes, people are angry at the criminal behavior of the police officer/murderer, but manifesting that anger by destroying property, looting, and injuring and threatening others is only going to justify the use of more police violence.
This is great for some situations. However, my personal policy is to upload my protest content to Google Photos ASAP (if appropriate). This makes sure your content is off your device if your phone gets confiscated, AND it provides a (admittedly thin) layer of authenticity/validation for the content.
Assuming this tool was intended for those who want to share photos on social media: Facebook and Instagram already strip metadata from the photos you share, otherwise anybody could scrape the photo and extract the metadata. I'm not sure about Twitter, though, but I think it's the same.
Regarding metadata: considering Windows already lets you strip metadata from images directly from the Properties menu, is there anything this tool or other metadata-stripping tools do that goes beyond what Windows offers? Is it risky to rely on the built-in tool?
Any examples? You can't reverse it if the data is gone.
Removing exif data is a great idea.
[1] github.com/telesoho/faceblur
Edit: Apparently it was Interpol, not the US government. I can't find the Reddit thread but here's a NYT article with the photo: https://thelede.blogs.nytimes.com/2007/10/08/interpol-untwir...
Python: https://gist.github.com/symisc/6ecdea76ba0d33d73ea7f23cade0d...
PHP: https://gist.github.com/symisc/d54808915093e5375fdcb841e4365...
Docs: https://pixlab.io/cmdls
[0] https://www.linux-magazine.com/Online/Blogs/Productivity-Sau...
https://www.theroot.com/ferguson-activists-are-dying-and-it-...
"Crawford was found shot to death Thursday night in his car, just like activist Darren Seals in 2016 and protester DeAndre Joshua the night of the Ferguson verdict in 2014. The latter two had gunshot wounds to the head and their cars were lit on fire. Crawford, it is believed by police, shot himself in the back seat of his car either in an attempted suicide or by accident."
[+] [-] whatshisface|5 years ago|reply
[+] [-] djsumdog|5 years ago|reply
[+] [-] adge|5 years ago|reply
In a situation where police feel justified to kill extra-judicially over a possibly fake 20 dollar bill, what hope do we have that protesters won't be targeted in unfair ways? Or worse, that organizers won't be hunted down like animals and murdered like in Furguson? It would be unethical to not do everything in your power to protect those in this position.
secondly how do you plan to identify violent vs non-violent protesters from a static image? How would you find their identity afterwards? There is overwhelming evidence to suggest these methods are at best ineffective and at worst racist, and in either case will lead to innocent people being charged.
https://www.newscientist.com/article/2109887-police-mass-fac...
https://www.futurity.org/smartphones-cameras-prnu-1634712-2/
All I was really concerned about was getting rid of metadata tied to my phone.
There is also mat2, a metadata removal tool: https://0xacab.org/jvoisin/mat2/ https://0xacab.org/jvoisin/mat2-web/