top | item 11650035

Facebook sued for storing biometric data mined from photographs

213 points | huntermeyer | 10 years ago | cnet.com

129 comments

[+] mosquito242|10 years ago|reply
I had the strangest interaction with the Messenger app a few months ago.

I was spending time with friends, and I took a few pictures of all of us (didn't send them through either FB or Messenger). A few hours later, messenger popped up a notification telling me something along the lines of "Hey, I see you took pictures today of <friend>. Want to send them to her?"

Made me feel incredibly creeped out that FB would take my photos and (presumably upload and) analyze them even when I hadn't given them to FB.

[+] onewaystreet|10 years ago|reply
It's called Photo Magic:

"By recognizing your Facebook friends in the photos you take (just like when tagging or sharing photos on Facebook), Messenger can create a group thread for you to share the photos with those friends in just two taps."

https://newsroom.fb.com/news/2015/12/messenger-adds-new-feat...

http://www.theverge.com/2015/11/9/9696760/facebook-messenger...

You can opt-out by turning off tagging suggestions: https://www.facebook.com/settings?tab=timeline&section=sugge...

[+] notliketherest|10 years ago|reply
People also used to think putting your real name on the internet was creepy. Little by little, the erosion of privacy has brought us to where we are today: standing naked in front of a corporate monster profiting off our data.
[+] fiatjaf|10 years ago|reply
Try getting pictures downloaded from other sources directly to your phone and see if Facebook prompts you the same.

Try getting pictures of other people and see if Facebook suggests them to be added as your friends.

[+] thiht|10 years ago|reply
Wow I never heard of that before, that's insane...

The worst thing is you can't "boycott" the app: even if you don't use it, if your friends do, pictures of you WILL be analyzed. Even if you explicitly refused their ToS...

[+] Propen|10 years ago|reply
Are these kinds of things possible with WhatsApp? (On iOS specifically, and now that it even uses e2e.) I don't know if I should trust them with Facebook behind the wheel...
[+] cpach|10 years ago|reply
That’s quite disturbing. Was it on iOS or Android?
[+] awqrre|10 years ago|reply
Creeped out that FB used the permissions you gave them? I'm guilty too; those permissions are too broad...
[+] patrickbolle|10 years ago|reply
Were they just stored on your phone?
[+] AckSyn|10 years ago|reply
Deny the application permission to access your camera, photos, and contacts. That's a start.
[+] chinathrow|10 years ago|reply
If that is how it went down, that alone is a base for another lawsuit.
[+] TazeTSchnitzel|10 years ago|reply
Facial recognition is really scary. This was recently demonstrated for Russia's Facebook, VKontakte, when a service appeared that let you look up people on that site by photo. So people started looking up the profiles of random subway passengers, outing sex workers to their families and friends, etc.
[+] sametmax|10 years ago|reply
That's not the scary part.

The scary part is that people are feeding the Facebook DB with pictures, and much more data besides. They did it, they are doing it, and they will keep it up, despite everything we've told them.

The scary part is that nobody cares.

[+] rvense|10 years ago|reply
I really don't like having my picture taken anymore. You don't know where it's going to end up.

I also do wonder how much infrastructure would need to be added in a country that already has a lot of video surveillance (like the UK) to implement a "find this person" feature, where you could just feed it a photo and it would go looking at all camera feeds.

[+] proksoup|10 years ago|reply
Scary, but inevitable, no?

As technology improves, it's hard to imagine regulations keeping pace well enough to prevent this sort of thing... And even if they did, the technology would then only be used by criminals and governments (while I may call that criminal, governments would likely exempt themselves from any such regulation), and I'm not sure that situation (criminals and governments being the only users of the technology) is better than the technology being available to all.

Is it?

[+] Esau|10 years ago|reply
"Facebook hoped to get the case thrown out on the grounds that its user agreement states any disputes should be governed solely by California law".

What about those people whose biometric data is being stored but who don't have accounts on Facebook?

[+] amelius|10 years ago|reply
Let me just guess ... probably the people who uploaded that data are liable for it.
[+] gcr|10 years ago|reply
To be clear: the article claims that Facebook can recognize anyone just by seeing them. They haven't demonstrated this ability.

It's far easier to judge who's in your picture among your 20 closest friends than it is to find that face among all two billion Facebook users.

State of the art recognition systems still have a few orders of magnitude of accuracy improvements to go before they can solve that problem.

See the IARPA Janus project for some recent government- and academia-sponsored work on this front: http://www.nist.gov/itl/iad/ig/facechallenges.cfm The task is to recognize terrorists in airport surveillance pictures and the like.
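A back-of-the-envelope sketch of the gallery-size argument above. It assumes each comparison independently false-matches with some fixed probability, which real systems only approximate, and the rate used here is purely hypothetical:

```python
# Probability of at least one false match when searching a gallery of n
# faces, assuming each comparison independently false-matches with
# probability p (an idealization; real matchers are not independent).
def false_match_prob(p: float, n: int) -> float:
    return 1 - (1 - p) ** n

p = 1e-6  # hypothetical per-comparison false-match rate
print(false_match_prob(p, 20))             # tiny: 20 friends, reliable
print(false_match_prob(p, 2_000_000_000))  # ~1.0: all of Facebook, useless
```

Under these assumptions the same matcher that almost never errs among 20 friends produces a false hit essentially every time at two-billion-user scale, which is the "orders of magnitude" gap the comment describes.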

[+] Chronic51|10 years ago|reply
Facial recognition alone can't identify an arbitrary person in the world. However, combined with one or two GPS points (photo geotags), you can get pretty damn close.
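The face-plus-geotag intersection described here can be sketched as filtering a matcher's look-alike candidates by distance from the photo's GPS point. All names, coordinates, and the 5 km radius below are made up for illustration:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two (lat, lon) points, in kilometres.
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical data: the face matcher returns several look-alikes, but
# only one of them was anywhere near the photo's geotag.
face_candidates = {"alice", "bob", "carol"}
last_seen = {"alice": (40.7, -74.0), "bob": (51.5, -0.1), "carol": (35.7, 139.7)}
photo_geotag = (51.5, -0.12)

nearby = {who for who in face_candidates
          if haversine_km(*last_seen[who], *photo_geotag) < 5}
print(nearby)  # only "bob" is within 5 km of the geotag
```

One coarse location point collapses a whole candidate set to a single identity, which is why geotags are so much more identifying than they look.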
[+] gcr|10 years ago|reply
If you don't want your face to be recognized, you can sometimes prevent it from being detected in the first place.

Our research group recently looked into how to hide from Facebook's face detector. If you're uploading photos, adding white bars over the eyes of the subjects in the photo is the best way to prevent Facebook from finding the face. However, if you're out and about in the real world, even scarves and masks aren't enough -- Facebook sometimes finds these occluded faces too. There aren't very many easy answers. https://arxiv.org/abs/1602.04504
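The white-bar countermeasure is mechanically just painting a rectangle over the eye region before upload. A minimal sketch, using a plain 2D list as a stand-in grayscale image and assuming the eye coordinates are already known (a real tool would operate on an actual image file via an imaging library):

```python
def white_out(pixels, top, bottom, left, right):
    """Set a rectangular band of a grayscale image (list of rows,
    values 0-255) to white. Bounds are half-open, as in Python slicing."""
    for row in pixels[top:bottom]:
        for x in range(left, right):
            row[x] = 255
    return pixels

# Tiny 6x6 "image"; suppose the eyes were located in rows 2-3, cols 1-4.
img = [[0] * 6 for _ in range(6)]
white_out(img, top=2, bottom=4, left=1, right=5)
```

The point of the paper's finding is that destroying the eye region removes the features the detector depends on most, so the face is never found and hence never matched.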

[+] sudojudo|10 years ago|reply
Years ago, I read about a hat that some Hollywood stars had found useful for fighting paparazzi. Mind you, I don't recall the source, and it may have been tabloid nonsense.

The hat had a circle of IR LEDs (or something of the sort) emitting 360 degrees of invisible-to-the-naked-eye light. Photos of anyone wearing the hat had a similar effect to using a flash in a mirror: the subject was completely washed out by light.

Again, I'm not sure if this was bogus or not, but it seems the demand for such a product is only going to become greater. Hell, I'd buy one.

Does anyone with expertise in this area know if this type of device is technically possible? I'm voting no, otherwise we'd be seeing these devices everywhere, and there would be laws against wearing them.

[+] maaku|10 years ago|reply
That only works until people start doing it. Then it retroactively stops working.

The only way to prevent your face from being recognized is by not having your face in a photo to begin with.

[+] superobserver|10 years ago|reply
Not only does Facebook store biometric data; if you happen to use Messenger to share photos, those photos remain stored on its servers even after you've deleted the associated conversation. Facebook's disregard for privacy and content control for its users is the biggest problem with its platform by far. And with its latest profit reports, it's bound to get worse.
[+] vvanders|10 years ago|reply
Yup, people laughed when photographers started moving off FB to other platforms with a better stance on photo rights.

Won't use FB for a lot of what's already been mentioned in the thread.

[+] AndrewKemendo|10 years ago|reply
I think technologists need to help the public understand that if they are going to keep integrating technology ever more fully into their lives, they will be moving closer to a "privacy-free" future.

The trade-off for tailored services and "smart" systems is giving up privacy to machine systems.

The technology community pines for futuristic technology like amazing machine personal assistants, forgetting that human personal assistants (see: professional executive secretaries) know basically everything about the person they are assisting.

[+] Normal_gaussian|10 years ago|reply
In the case of the human personal assistant the governing terms originate with the person being assisted, and so the data and the ways it can be used remain in control of the owner.

Tech companies are inverting this control, which has very dangerous implications for those they assist.

There are ways to keep control with the user, but these require a level of architecture inversion that consumers haven't fought for yet.

[+] KhalilK|10 years ago|reply
I attended a talk by Stallman last year and he implored anyone who took a picture of him not to post it on Facebook. Now I can see why.
[+] cpeterso|10 years ago|reply
Staying off Facebook is not enough. Facebook could easily write a web crawler to find off-site photos of people tagged on Facebook. They could even create shadow accounts to track tagged people who don't have Facebook accounts. Even people not tagged could have shadow accounts seeded from Wikipedia or news photos.
[+] tajen|10 years ago|reply
The second piece of information I see having potential is the EXIF data in the pics, plus dead pixels and other identifying camera artifacts (flare, blur, watermark), artificially introduced or not.

Not only does it prove who was with whom at the same time, it also links profiles. You can create a whole new identity, but if you didn't throw away all your phones and cameras, then Facebook can link the profiles. Or if you sell something on Craigslist and upload to Snapchat with the same camera: it's tied to your identity.
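The profile-linking idea reduces to comparing camera fingerprints across uploads. A sketch on hypothetical, already-parsed metadata (real EXIF extraction would need an imaging library; the field names mirror standard EXIF tags, but the accounts and values are invented):

```python
# Two uploads made under different account names, with hypothetical
# pre-parsed EXIF metadata attached.
upload_a = {"account": "jane_doe", "Make": "Canon", "Model": "EOS 70D",
            "BodySerialNumber": "123456789"}
upload_b = {"account": "anon_seller", "Make": "Canon", "Model": "EOS 70D",
            "BodySerialNumber": "123456789"}

FINGERPRINT_TAGS = ("Make", "Model", "BodySerialNumber")

def same_camera(a, b):
    # Uploads whose fingerprint tags all agree were very likely taken
    # with the same physical camera, which links the two accounts.
    return all(a.get(t) == b.get(t) for t in FINGERPRINT_TAGS)

print(same_camera(upload_a, upload_b))  # the profiles are linkable
```

Sensor-level artifacts (dead pixels, lens flare patterns) survive even when the EXIF tags are stripped, which is why reusing a camera across identities is so hard to hide.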

[+] cmelbye|10 years ago|reply
Was it just to make a point? Anyone can Google "Richard Stallman" to retrieve more than enough training data to train an image classifier.
[+] AndrewKemendo|10 years ago|reply
There are more than enough pictures/videos of Stallman to make a really clean faceprint.
[+] FreedomToCreate|10 years ago|reply
Facebook was great when it was a way to connect with close friends and people in your immediate community (e.g. university) and see what was happening around you. Now that it's trying to evolve into a platform that targets you so it can make an insane amount of ad revenue, it has become less and less worth it, especially with respect to our privacy.
[+] Mendenhall|10 years ago|reply
I feel that was always the plan and people just didn't see it coming.
[+] visarga|10 years ago|reply
Maybe photos should carry a "robots.txt"-like permissions tag explicitly banning social networks from using them, plus a hard-to-remove cryptographic watermark embedded in the picture so they aren't tempted to delete the tag. Cameras (and apps) should auto-tag files if the user so wishes.
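The proposed tag could work like robots.txt's per-agent rules. A sketch of what a well-behaved crawler's check might look like; the tag name ("X-Photo-Robots") and policy syntax are invented for illustration:

```python
# Hypothetical per-photo consent tag, checked by a cooperating crawler
# before indexing. Syntax: comma-separated "agent: allow|deny" rules,
# with "*" as the default agent, mirroring robots.txt conventions.
def may_index(metadata: dict, crawler: str) -> bool:
    policy = metadata.get("X-Photo-Robots", "")
    rules = {}
    for part in policy.split(","):
        if ":" in part:
            agent, verdict = part.split(":", 1)
            rules[agent.strip()] = verdict.strip()
    return rules.get(crawler, rules.get("*", "allow")) != "deny"

photo = {"X-Photo-Robots": "facebook: deny, *: allow"}
print(may_index(photo, "facebook"))  # False
print(may_index(photo, "archive"))   # True
```

Like robots.txt, this only restrains cooperating parties; the cryptographic watermark in the comment above is what would make non-compliance detectable.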
[+] powera|10 years ago|reply
Humans have evolved for millions of years with the expectation that other people will recognize our faces. I'm sure we'll come up with a better solution than "file lawsuits to hopefully make computers recognizing our faces illegal".
[+] crisisactor|10 years ago|reply
To be honest, most of these are surface-level traits of an individual. There are deeper traits that get much more personal, and they have even been touched on in popular culture recently, like in the new James Bond movie (think gait recognition). But even gait, although highly individual, is still scratching the surface. I was thinking of 'trimming the bloom filter' to such a degree that we can recognize a person not only on sight, but by cue words and individual vocabularies alone.

It is no secret that our world is divided by language alone, so as analysts we can attach certain words to certain behaviors, and this has been proven countless times to expose a person. If I speak English I probably respond the same way to 'pizza': pizza means food, and therefore pizza elicits a pleasure response. But 'bomb' and other words must elicit a different response, then?

Marketers copped onto this early on and frequently use talismanic phrases to elicit positive responses to products, so why not Facebook, and any other tech harvester of data such as Google, et al.? Last time I checked it is not a crime to elicit responses using personal, individualized key-phrases.

[+] matt4077|10 years ago|reply
Which reminds me of the researcher at Berlin's Humboldt University who ended up in jail because he was the only one at the time who used the word 'gentrification'.

Him, and this left-wing group setting cars on fire.

[+] breatheoften|10 years ago|reply
Photo access permissions aren't granular enough on iOS -- they should add a third access state that grants an app permission to write new photos to the photo album but prevents it from scraping the photo library... Or maybe that should be the behavior of the current permission grant...
[+] huevosabio|10 years ago|reply
Assuming that the concentration and mining of huge personal datasets is inevitable, can we design systems (other than law) that prevent this data from being misused?
[+] Mendenhall|10 years ago|reply
One of the exact reasons I never used the data-mining center known as Facebook: creepy. The things you can do with such data are legion.
[+] Jill_the_Pill|10 years ago|reply
Can you obfuscate your "faceprint" by tagging a few other people as you?
[+] sudojudo|10 years ago|reply
You'd have to be pretty vigilant, and it would require others to do the same. There'd have to be an equal or greater number of photos tagged as the other person. Who is this other person, anyhow? Where's their account? All the data says it's Jill_the_Pill, so she must use pseudonyms. Flag her account for further investigation.

It's best to just stay off of Facebook if you care at all about privacy. At least, that's the underlying message that I get from this, and every other story I read about the site.