European Union Calls for Five Year Strict Ban on Facial Recognition Technology

208 points | Anon84 | 6 years ago | techgrabyte.com | reply

120 comments

[+] dang|6 years ago|reply
The EU hasn't "called for" anything. A discussion paper was leaked.

We ban sites that ridiculously overplay stories like this. Please submit from more reliable sources.

[+] ddevault|6 years ago|reply
I love coming to these threads to watch the crowd who makes their living from invading the public's privacy attempt to rationalize their worldview, find loopholes, etc. If your job is mass surveillance, it has always been unethical and the law is catching up to you. The purpose of these kinds of laws isn't to bring your business in line - it's to put you out of business. We're coming for you.
[+] ixtli|6 years ago|reply
The purpose of a law designed in the interest of individual privacy, yes, but at least in America companies get a huge amount of say on what gets passed because they vote with their profits. The reality is that most of the laws are bad because they're considerate of private profit accumulation when, in reality, that profit directly harms the privacy of individuals. We continue to pretend as a society that these two desires can be reconciled.
[+] throwaway713|6 years ago|reply
I have no problem if a company performs facial recognition on me as long as I give them permission and it is used for specific purposes. Why should the government interfere with my ability to decide how images of my face are used?
[+] Koremat6666|6 years ago|reply
These sorts of decisions should not be made by large centralized agencies like the EU. I would rather see these decisions left to the individual countries, or even better to local bodies.

Maybe London wants to take a different approach than Finland. That should be fine. Let each country figure out what works for it.

[+] dirtyid|6 years ago|reply
We existed fine until now without pervasive facial recognition, but I find it difficult to conceive of a future where this technology won't participate in some part of civic life. China is plowing ahead with smart-city and rapidly developing surveillance-state standards, and the US is not far behind. Europe is just ceding 5 years for illiberal forces to set standards, normalize behaviour, and cultivate acceptance, when it should be providing an alternate model - which can't be nothing, because these businesses will go on with or without her.
[+] no1youknowz|6 years ago|reply
Here is what's happening in the UK.

> https://www.youtube.com/watch?v=0oJqJkfTdAg

All I can hope for is that some activist group takes the police to court and the legislative branch reacts by imposing rules on spying on its citizens.

Some of the things I can think of off the top of my head:

1) Citizens can legally opt out by putting on face masks. Especially when it's cold.

2) Video / Images are stored outside of government bodies, akin to a black box. Warrants must be required to review footage.

3) Video / Images / Data are deleted after 1 year.

4) No data on citizens' facial features, body structure, or gait is transferred into a national database.

Honestly though, given where the UK is going, I firmly believe that in 20 years all citizens' physical metadata will be tracked and stored in a black box somewhere, and later leaked online.

1984 isn't just a book. It's a handbook by all accounts.

[+] ixtli|6 years ago|reply
By many metrics the UK is and has been the most surveilled country in the world for decades.
[+] wott|6 years ago|reply
> 1) Citizens can legally opt out by putting on face masks.

You don't have (yet) some beautiful law like we have in France?

https://beta.legifrance.gouv.fr/loda/texte_lc/JORFTEXT000022...

> Nul ne peut, dans l'espace public, porter une tenue destinée à dissimuler son visage.

Nobody may, in public space, wear an outfit intended to hide his face.

Clear and simple, I guess. That was directed towards radical Muslims but the definition encompasses everyone. And last year we got this extra one:

https://www.legifrance.gouv.fr/affichCodeArticle.do?idArticl...

It is about hiding all (or part) of your face, within (or near...) a demonstration, in which troubles arose (or might have arisen...). The first law was an infraction; this one is a misdemeanour with a much harsher sentence.

[+] vintermann|6 years ago|reply
The algorithms are public, the data is public... even today, you can often feed a portrait image to Yandex's image search (the best public one out there) and get back pictures of the exact same person. It seems to have a special case for faces.

And it's by no means certain that you and I can't do better. I think the reason Yandex's image search is better than Google's is that Yandex entered the game later and thus could incorporate better methods from the start. There have been important advances in extreme classification even in 2019.

I think the best we can hope for is that this power of identification isn't exclusive to governments and police, but can be used by us as well. So that there aren't more Bob Lamberts than necessary.
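And the matching step itself is no secret: it boils down to nearest-neighbour search over face embeddings. A minimal sketch of that search - the vectors below are made up purely for illustration; a real system would get them from a trained face-embedding model:

```python
import numpy as np

# Toy "gallery" of embeddings. In a real pipeline these vectors would
# come from a face-embedding network; here they are invented by hand.
gallery = {
    "person_a": np.array([0.9, 0.1, 0.2]),
    "person_b": np.array([0.1, 0.8, 0.3]),
    "person_c": np.array([0.2, 0.2, 0.9]),
}

def cosine(u, v):
    # Cosine similarity between two embedding vectors.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def best_match(query, gallery, threshold=0.8):
    # Nearest neighbour by cosine similarity; below the threshold,
    # report no match rather than a weak one.
    name, score = max(((n, cosine(query, v)) for n, v in gallery.items()),
                      key=lambda x: x[1])
    return (name, score) if score >= threshold else (None, score)

query = np.array([0.85, 0.15, 0.25])  # embedding of the probe image
name, score = best_match(query, gallery)
print(name)  # person_a
```

Everything hard lives in the embedding model; the search itself is a few lines, which is part of why this capability won't stay exclusive.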

[+] zelon88|6 years ago|reply
Here's an idea... Take down the cameras!
[+] nomel|6 years ago|reply
For 1, body shape/bone estimations and gait can't be easily masked.
[+] denzil_correa|6 years ago|reply
I can't read the link as it's down, but here's an article that does a good job of explaining the leaked EU report [0].

> As for facial recognition, the Commission document highlights provisions from the EU’s General Data Protection Regulation, which give citizens “the right not to be subject of a decision based solely on automated processing, including profiling.”

It's more nuanced than "a ban on facial recognition". The leaked white paper is also available [1]

[0] https://www.euractiv.com/section/digital/news/leak-commissio...

[1] https://www.euractiv.com/wp-content/uploads/sites/2/2020/01/...

[+] asdfasgasdgasdg|6 years ago|reply
So let's say you're the police, you want to keep an eye out for bad guys, and you have a bunch of CCTVs. According to the rule you quoted, it would be sufficient to surface the face match to someone who would manually check that it is the right person before going ahead and arresting them? That person could even be the police themselves? I bet people think this is banning a lot more behavior than it is.
[+] prophesi|6 years ago|reply
https://www.schneier.com/essays/archives/2020/01/were_bannin...

These bans have good intentions, but they won't solve the actual root of the issues, and they do more harm than good.

[+] Seenso|6 years ago|reply
> Regulating this system means addressing all three steps of the process. A ban on facial recognition won't make any difference if, in response, surveillance systems switch to identifying people by smartphone MAC addresses. The problem is that we are being identified without our knowledge or consent, and society needs rules about when that is permissible.

It sounds like the ban needs to be broader, extending to any technology used to automatically identify individuals without their consent using surveillance sensors.

[+] tastroder|6 years ago|reply
Schneier doesn't seem to make the point in that article that they do harm. He also specifically closes by endorsing bans on facial recognition, with a warning against focusing on facial recognition too much - which is of course correct. The specific critique about smartphone tracking as a fallback wouldn't be valid in the geopolitical context of this thread, which is specifically Europe; in the US it of course still applies.
[+] tjoff|6 years ago|reply
I don't see how it isn't a fantastic start.

It signals intent.

[+] AdrianB1|6 years ago|reply
Romania is part of the EU. Officially there is no facial recognition system in use, but there are a few that are not official and those will not be affected by the ban because "they don't exist".

A relative was the lead policeman in a case where person A accused person B of a very serious crime. Person B appeared on a camera in the area; police identified him instantly using the national database of ID cards (it contains photos of every person over the age of 14) and arrested him. Luckily for him, he was covered by a different camera for the whole time he was accused of the crime, but that was discovered only after he spent a day under arrest. The person who made the false accusation simply walked.

There is a database of pictures of everyone who gets an ID card (which is mandatory; not having one with you at all times gets you a fine). Another for passports, though those are optional. Another for driver's licenses (having a driver's license with you does not save you from a fine if you don't also have the ID). These are databases everyone knows about, and police use them every day; no authorization is required.

[+] bko|6 years ago|reply
I don't understand how you can ban a machine from doing something that a human can do.

Consider the following scenarios:

- I can pay for a person to watch archival video and take notes on paper as to who comes in and out of frame.

- I can have software that helps a person crop faces of people coming in and the person can tag and catalog the people coming in and out of frame.

- I can have software that identifies human looking things and things that look like faces and a person can tag those

- I can have software that identifies human looking things and things and recommends a similar face. A human confirms.

- I can have software do everything.

At what point does it become facial recognition? The end result is the same regardless of which step you ban. So is any ban just meant to make the cost artificially high? I think you could outsource it anyway via Mechanical Turk or something similar if there is real value in facial recognition.
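To make that spectrum concrete, the middle scenarios are just a human-confirmation gate around an automated matcher; a toy sketch (all names hypothetical, with a dictionary lookup standing in for real matching):

```python
# Toy model of the middle scenarios above: software proposes a match,
# and a human either confirms or rejects it before anything happens.

def machine_suggest(face_crop, database):
    # Stand-in for an automated matcher; a real one would compare
    # embeddings rather than do an exact lookup.
    return database.get(face_crop)

def identify(face_crop, database, human_confirms):
    suggestion = machine_suggest(face_crop, database)
    if suggestion is None:
        return None
    # The only difference between "assistive" and "fully automated"
    # is whether this callback is a person or `lambda *_: True`.
    return suggestion if human_confirms(face_crop, suggestion) else None

db = {"crop_001": "Alice"}
print(identify("crop_001", db, human_confirms=lambda crop, who: True))  # Alice
```

The pipeline is identical at every rung of the ladder; only the confirmation callback changes, which is exactly why drawing a legal line between the steps is hard.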

I don't think banning technology is the answer.

[+] cmendel|6 years ago|reply
The difference is scale. With ML face recognition we can identify people in real time; without some safeguards we very rapidly reach a point where we can track anyone anywhere. Imagine if every camera in the city of London could identify you: forget metadata, we would be able to actively track individuals taking public transit across a city and identify every individual they interact with.
[+] _jal|6 years ago|reply
> I don't understand how you can ban a machine from doing something that a human can do.

We do it all the time.

- Phone autodialers are legal, but particular uses are fairly heavily regulated (in theory, at least).

- Heavy machinery use is regulated in a variety of ways.

- LEOs in the US recently were told GPS bugs (which "just" do what humans can do - taking notes on where a human goes) require warrants.

- Explosives dig much faster than humans. These are heavily regulated.

- Most radios sold in the US will refuse to tune certain bands for legal reasons, even though it is trivial to modify some of them to do so.

Etc.

As far as your slippery-slope argument goes, it simply doesn't matter. The point is to reduce the harm done by a given tech, not to achieve some sort of abstract purity of thought.

[+] WaxProlix|6 years ago|reply
None of these steps involves looking up a face in a seventeen-million-page binder of faces with names and associating names, government IDs, traffic violations, credit scores, etc. with the person.

I don't think people are worried about computer vision identifying Human vs Not Human as much as they are about the (not humanly possible) pinpointing and following of every move.

[+] Seenso|6 years ago|reply
> I don't understand how you can ban a machine from doing something that a human can do.

Quantity has a quality all its own. Something that is acceptable at a small scale may have unacceptable consequences when done at much larger scales.

[+] function_seven|6 years ago|reply
Same reason you might want to place more restrictions on machine guns than on muskets.

The reason one would ban a machine from doing what a human can do is that it's stupendously faster, and enables all sorts of dystopian effects that manual face-tagging doesn't.

[+] reaperducer|6 years ago|reply
The difference is that machine-driven facial recognition can be collated with other data to create unwanted results for society.

Rando Startup can start scanning faces in public. Then it starts matching those faces with actual identities. Then it adds in the cell phone location data to determine which ones visit synagogues regularly. So it adds a little notation to the data table about each of these people. Perhaps a yellow star will do nicely.

If you know anything about recent European history, you will understand why this is a bad thing.

[+] feanaro|6 years ago|reply
The point is to prevent a reality where some people can efficiently track the whereabouts of all (or nearly all) people living in a city. You could of course try to replicate this using humans, but the cost would indeed be much higher, the efficiency much lower, and even that process itself might be banned if it led to the same result.
[+] mattlondon|6 years ago|reply
Part of the problem with facial recognition is it is potentially inaccurate, especially if you are not a white male.

So you run the risk of having large numbers of innocent people being incorrectly tagged as persona non grata - be that with the police, in-store detectives, hotels, potential employers etc.

Having a computer say "this person is bad/unwelcome/a shoplifter/bad credit/a sex offender/etc" is powerful, and difficult for laymen to counteract. Computer says no - sorry, it's policy, nothing I can do.

And you can do this at huge scale for pennies and in the blink of an eye with a computer. You can't do that with a human manually/semi-manually doing it.

[+] lmkg|6 years ago|reply
Well, making the cost artificially high can be a point in itself. Sometimes the concern about a technology is the ubiquity and scaling.

But aside from that, to address your questions:

1. If there's a human in the loop, the human can be accountable. Explainability and accountability of algorithmic decision-making is an emerging concern, especially to the extent that algorithms may encode bias in their training data.

2. Real-time systems. Being real-time qualitatively changes the impact this can have on your day-to-day life. Taking the human out of the loop makes many applications feasible that otherwise wouldn't be. Like doors.

[+] TallGuyShort|6 years ago|reply
Other commenters have made the point that scale / quantity matters. Another example: Formula 1 has limits on computational performance. Humans can design aerodynamics. Machines can do it better, faster, more exhaustively. The latter is quantitatively limited.
[+] qihqi|6 years ago|reply
Scale does matter. Consider your gut-feel attitude toward a thief who has stolen $1 vs a thief who has stolen millions.
[+] belinder|6 years ago|reply
> with exceptions for research and security projects

Isn't the whole point of facial recognition security projects?

[+] netgusto|6 years ago|reply
Well no, tracking for targeted sales is a big use case I guess.
[+] Zenst|6 years ago|reply
A prudent approach; it gives time to define more finely how, when, and for what it is used.

After all, a fair few laws seem to be a loophole away from enabling more crime than they prevent. So a step back like this tips the balance towards the consumer/people over government/business.

But as with many things, there will always be exceptions, and those who take exception to them - which is fair, as that is how democracy works: equal voice. Often it has taught us that while today there may be one small child at the back questioning things, there may be more tomorrow and the next day. All questions need answers, and this is a good start in enabling that. Will the public engage and have their say heard, and what balance will play out? We will know over the years, and it will be interesting to see where things stand 5 years from now, once the ban has ended or been extended.

[+] Nasrudith|6 years ago|reply
Really, the "public places" part of the definition betrays a fundamental informational illiteracy: it doesn't acknowledge the separation of the data from where it is gathered.

Just a few silly demonstrative edge cases:

- If I run facial recognition software on a public camera feed, on a computer in private, would it be legal?

- If I were in public in Europe, running facial recognition software on my laptop to process, say, my own personal family photo album, would I be violating the law?

- Would running it on news footage of a public street be legal? What about if it was an interview at a venue you would get thrown out of if you tried to enter?

[+] skizm|6 years ago|reply
As a private citizen, would I be allowed to set up a camera on my front porch, point it at the sidewalk, and store all the data it takes in, including counts of how many times each "unique" person walks by, etc.? Maybe even link it to my Facebook, or a public directory, and attach all their public info to a profile, keeping all the data on my servers? Does this change if I am a business?
[+] huonpine|6 years ago|reply
I wonder if in the future we will look back and see intelligence agencies in the same light as religious institutions from history - both using their "all knowing" information to influence power.
[+] jimmaswell|6 years ago|reply

[deleted]

[+] sstephant|6 years ago|reply
Technological progress? Might be. Social regression, police state? Sure, 100%. Don't count on me to be part of this. I'll give the finger to every company that offers me a part in this wonderful endeavour.
[+] tastroder|6 years ago|reply
I'm really curious what technological progress this argument is supposed to refer to. You could build a great mass-surveillance system using cluster and image-processing tech we had 15 years ago; what's changed is mostly throwing more hardware and money at it. Research seems to be specifically excluded. The EU job market won't go under because a few legitimate use cases might have to adapt, while blatant privacy violations can be put on hold until more complex legislation is worked out.
[+] narrator|6 years ago|reply
GDPR is great for coverups of official corruption because corrupt officials can reliably destroy the evidence in the cloud if they get tipped off that an investigation is starting.

Bans on facial recognition means prosecutors of corrupt officials have to rely more on eyewitness testimony and eyewitnesses can be intimidated and have "accidents".

[+] franczesko|6 years ago|reply
Any disruptive technology should be preemptively locked down before public use. The times of "move fast and break things" are gone for good.
[+] nashashmi|6 years ago|reply
How is it legally possible to ban computation over a set of data? Better yet, how do you prevent indie developers from running the same computation on their personal machines?

If you ban the tech, you prevent research on it as well. If you ban only the use, you cannot catch violators.