And if you think GDPR is a toothless joke, let's take a look at the defined fine structure.
It is pretty simple, only 3 levels (strikes for the fellow Americans):
Strike 1 - Stern warning letter
Strike 2 - 2% of your TOTAL GLOBAL REVENUE
Strike 3 - 4% of your TOTAL GLOBAL REVENUE (or 20mil EUR, whichever is higher)
And now you know why GDPR is a board level topic. Keep in mind that the EU/US Safe Harbor agreement got axed due to a lawsuit by a single student from Vienna against Facebook. So all you need is a single pissed off German customer whose request for their data report you ignored and you're fucked.
For startups - GDPR is like Y2K at the time, a GOLDMINE. So much opportunity to sell solutions, from real to snake oil. GDPR compliance is already and will continue to trigger a massive wave of investment.
I have national sales responsibilities for one of the majors. Think IBM/Microsoft/Oracle/etc leading a sales team of 74 reps.
You'd be surprised at how FEW sales we've generated from GDPR. We've been providing free GDPR assessments for the past 1.5 years to over 200 accounts as a lead-gen opportunity, and very few sales have resulted.
It all boils down to this: companies simply don't believe the fines will be enforced, given just how expensive they are.
And since GDPR doesn't go into effect until May 2018, companies are just waiting and seeing what happens.
It's really hard to sell GDPR because it's essentially an insurance policy. Why spend $5m on software and another $5m in services ($10m combined) if your maximum fine is only $20m? Do you as a company have a 50% chance of getting fined? If not, then roll the dice and don't buy a solution.
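The trade-off in this comment can be sketched as a back-of-the-envelope expected-value calculation (the $10m/$20m figures are the commenter's hypotheticals, and the names in the code are invented for illustration):

```python
# Hypothetical figures from the comment above: $10m total compliance
# cost vs. a $20m maximum fine. Under a naive expected-value rule,
# compliance only "pays off" when the expected fine exceeds its cost.
compliance_cost = 10_000_000   # software + services, combined
max_fine = 20_000_000          # worst-case penalty

def should_buy(p_fined: float) -> bool:
    """Buy the compliance solution iff E[fine] > compliance cost."""
    return p_fined * max_fine > compliance_cost

# Break-even probability of being fined: cost / fine,
# which is exactly the 50% figure in the comment.
break_even = compliance_cost / max_fine
print(break_even)          # 0.5
print(should_buy(0.3))     # False -> "roll the dice"
print(should_buy(0.6))     # True
```

This ignores reputational damage, litigation costs, and repeat violations, which is arguably why the insurance framing undersells the risk.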
I wish more penalties were like this. This sounds great. Now these companies will finally have real incentive to comply. Hell, the percentages should be higher. That's the only way to enforce regulations. Otherwise, they'll just pay a puny fine, American style, and not do shit.
The reality is that the 2% or 4% figure depends on which part of the GDPR you have breached and isn't a tiered approach. Breaches of core requirements (for example valid processing grounds) will attract a 4% fine straightaway technically speaking.
Art 83 covers the different triggers.
Having said that there are various schools of thought around levels of potential fines, including from different data protection authorities in the EU.
That the ICO in the UK has yet to levy a maximum fine despite egregious violations is at least one factor suggesting fines will not increase dramatically.
Also there are not only increased fines to consider but an increased focus on compensation to data subjects in the event of a violation of their rights (together with an evolving case law to support that in the UK at least).
I'll eat my shoe if a regulator ever gets even 2% total global revenue out of any of the top 100 software companies based on this.
In reality a bunch of small shops are going to go bankrupt because they don't have a "GDPR implementation" position filled and they didn't do some report properly.
> For startups - GDPR is like Y2K at the time, a GOLDMINE.
Unless you're a start-up that handles personal data, in which case it's another bureaucratic overhead that also carries a risk of draconian penalties if you make a mistake, even if you have perfectly sensible reasons for working with that data and you're not doing anything at all surprising or dubious with it.
Of course, the EU has form for this, given its similar approach to both consumer protection and VAT rules in recent years. It does seem to have an unhelpful habit of imposing regulations at big business scale to deal with big business scale problems, but not considering that both of these may be wildly disproportionate for smaller businesses.
> Keep in mind that the EU/US Safe Harbor agreement got axed due to a lawsuit of a single student from Vienna against Facebook.
Almost every US Supreme Court decision comes from a single person challenging something. Are you suggesting that under a certain size, one shouldn't be allowed to sue in court?
> For startups - GDPR is like Y2K at the time, a GOLDMINE. So much opportunity to sell solutions, from real to snake oil. GDPR compliance is already and will continue to trigger a massive wave of investment.
I'll start by saying that I have found myself leaning in favor of this law -- I've made a much longer comment about it and won't rehash it, but I wanted to make sure my statements that followed weren't taken as a blanket anti-GDPR but rather a devil's advocate response.
I take issue with the quoted statement because it ignores the downside for startups[0]. Companies like Google, Facebook et al. have the money and time to hire teams of lawyers to find a way to work around these regulations in a manner that maximises their ability to continue tracking while minimising their risk of getting smacked by the hand of the law. Getting hauled off to court won't bankrupt them, and they have the legal teams to probably win regularly enough. Even barring that, they have the finances to adjust their business to be fully compliant (in whatever degree business adjustments require) without going bankrupt.
Joe's Advertising Supported Free Service does not. Joe's not going to start his own ad network and start mining personal data for it -- it's way too expensive to try to compete with Google/Facebook (and it was already way too expensive to do it, before). If he does, he's the one who's going to get hit with the second and third strike; probably from that "single pissed off German customer".
Investing in firms that touch this space will be met with far more skepticism. I wouldn't be surprised if any company that simply asks for a user ID and password faces a little scrutiny from investors, at least until the regulatory atmosphere is understood (I doubt it'll be that extreme for terribly long, but one bad court ruling/fine laid out where it wasn't expected could change that). The cost of establishing many, many kinds of companies will now increase because the risks are high enough that you can't go to market without having your legal bases covered on this one. That money has now been shifted to a business that -- potentially -- is selling snake oil (and startups are going to be more likely to do business with that snake-oil salesman, since they'll probably also be the least expensive).
Then there's the "unintended consequences". Here's a crazy hypothetical, but a lesser variation of it is plausible if this were a US law: Some individual exercises his free-speech rights and chucks something up on the Internet that has a bunch of horrible things on it, say, like 'a guide on how to slaughter and prepare kittens for healthy and inexpensive dinners'. Some kid reads it and kills/eats his neighbor's cat. The guy didn't do anything illegal, really, but his web host knocks him off the web and people are calling for blood. He happens to use an ad-network, but doesn't, himself, collect personal information. However, this ad-network does, and at one point was nailed under this law. He uses the ad-network, so some overzealous prosecutor figures out a way to bring it in front of a judge that he's responsible for what this third-party did and should also be prosecuted. At the height of outrage, a jury isn't hard to find to connect the dots[1].
[0] And hey, that's fine, you're an internet commenting individual just like me -- we don't have to present both sides of the story -- that's the replier's job.
[1] Yeah, I took that a little far, but I think back to when the Columbine massacre happened and everyone believed it was FPS video games that warped those evil children's "precious little minds". It took all of two seconds to call for banning violent video games (constitution be damned), many idiotic and ultimately overturned laws were passed, and if there had been some way to haul the developers who wrote the game off to jail (I think they were blaming DOOM at the time), it would have been possible within those first few weeks.
> Strike 3 - 4% of your TOTAL GLOBAL REVENUE (or 20mil EUR, whichever is higher)
Why would they impose a 20M floor instead of just sticking to 4% of revenue regardless...
edit: I'm not sure what the downvoting is about.. the 20M figure still means a company turning over 0-500M can be fined up to 20M.. not so bad the closer you get to 500M, but not so great if you're a small company, especially as the regulation is so open to the "law of unintended consequences" right now and only larger companies will have the funds/manpower to navigate it.
While I see some of the concerns about the _technicality_ of the law as completely legitimate, it still bothers me that so many people reject the whole spirit of this law, and cannot put the negative of "tax on startups" against the much greater good of personal privacy.
I've just started a business myself, and this regulation affects my company too. It makes development costlier; it'll take from the precious little time we have to spend on compliance paperwork rather than work on our core business. In the short run, it does hurt our chances of success.
Yet, none of the trouble is even comparable to what's to be gained here. And it bothers me (though doesn't surprise me) that some people don't see that.
It also bothers me that such vocal opposition barely comes up when the discussion is just about bigger companies such as Google and Facebook. How can we expect "un-evilness" from bigger companies when we're barely willing to do anything in that regard ourselves?
I wonder how this will interact with accounting standards. Ledgers have historically been immutable. If I buy something from you and then demand deletion of my data, do you need to revise all previous financial statements to make it appear as if the transaction never occurred?
The spirit of the law is nonsensical. It makes all commercial activity illegal, to the extent that all businesses keep records of their sales, inventory, etc. which reflect the activities of their customers, employees, and suppliers.
Possibly because personal privacy as a "much greater good" is open to debate. People place wildly differing values on that property.
Google's current ecosystem of data-sharing means that Assistant can make educated context guesses on what I mean when I talk to it based on my browser history and map navigation history. If the new privacy constraints damage that passive interconnection, that's not a net good for me.
I encourage a little more thought before cheering this on as a win. While GDPR isn't as ridiculous as the Cookie Law, it still shows that the EU/EC don't understand the technology they are trying to regulate, and it comes at a huge cost to tech companies.
Take the right to be forgotten. First of all, it should be common sense that no one has the right to force legitimate news articles to disappear because they don't like the content, but that is what the EU has ruled should happen.
I get the desire to have a company forget about you, and remove all the personal information they have. It makes sense from a personal standpoint. But how do you do it technically?
If you follow GDPR strictly you would need to be able to purge the data from your backups. Now most backups are considered immutable, so you aren't going to do that, meaning you need a way to ensure that "forgotten" users never get restored.
But how do you even delete the live data? Does the tech company you work for have the ability to delete all traces of a user from their system, cleanly severing all relationships with other objects in your system? Do you have the ability to retrieve everything you know about a specific user and provide it to them? You will need to write the code to do this.
There is a good chance your little startup that isn't cash flow positive will have to spend $1 million of its VC money on becoming GDPR compliant.
Do you sell a SaaS service to businesses, and those businesses send you their customers' data? Then you are the processor and they are the controller. Cool, less for you to do, sort of. Except that controller must agree to every sub-processor you use. Want to switch from AWS to GCP? You can only do it if all your customers agree. Want to try out a new metrics or logging service? If it will have any PII you can't do it without customer (controller) permission.
You will basically need to hire full-time compliance officers to deal with this. The big tech companies already have compliance officers, but GDPR is so massively invasive to businesses that even small companies now need compliance officers.
> First of all, it should be common sense that no one has the right to force legitimate news articles to disappear because they don't like the content, but that is what the EU has ruled should happen.
No, it is about deleting personal data attached to your user account, not "news articles". This thing intends to make the "delete my account" button to actually, you know, "delete my account", instead of fake-deleting it by setting a "deleted" flag and telling me that everything is gone now while still keeping gigabytes of data associated with me in your database.
> [...] meaning you need a way to ensure that "forgotten" users never get restored.
If this is considered to be a hard problem, then I assume storing a list of deleted users in a separate place and immediately purging those users from the backup after a restore must be some kind of rocket science.
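The scheme being described here (a separate "tombstone" list of erased users, re-applied right after any restore) can be sketched in a few lines; all the names and record shapes below are invented for the example:

```python
# Sketch of the "tombstone list" approach: keep the IDs of users who
# invoked erasure in a store that lives outside the backups, and
# re-apply the purge immediately after any restore so that forgotten
# users never come back from an old snapshot.
tombstones = {"user-123", "user-456"}  # persisted separately from backups

def purge_after_restore(restored_records: list[dict]) -> list[dict]:
    """Drop any restored record belonging to an erased user."""
    return [r for r in restored_records if r["user_id"] not in tombstones]

backup = [
    {"user_id": "user-123", "email": "a@example.com"},  # erased user
    {"user_id": "user-789", "email": "b@example.com"},
]
print(purge_after_restore(backup))  # only user-789's record survives
```

The backups themselves stay immutable; only the restored working set is filtered, which is the usual compromise regulators are said to accept.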
> There is a good chance your little startup that isn't cash flow positive will have to spend $1 million of its VC money on becoming GDPR compliant.
I wouldn't call it "to become GDPR compliant", I would call it "to build a sound database structure". Because if you are unable to purge all data associated with one of your users' accounts from your system without destroying the integrity of the rest of your data, then you obviously have a half-baked system on your hands that lacks a core feature - actually deleting accounts. And you surely should spend some of your money to refactor this crap into a long-term viable solution while you are still small and agile enough to do that, because it's only going to be way more expensive later on...
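As a deliberately simplified illustration of what a "sound database structure" means here: foreign keys with cascading deletes let one statement remove everything tied to an account. The schema and table names are invented for the example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs per-connection

conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")
conn.execute("""
    CREATE TABLE orders (
        id INTEGER PRIMARY KEY,
        user_id INTEGER NOT NULL REFERENCES users(id) ON DELETE CASCADE,
        item TEXT
    )""")

conn.execute("INSERT INTO users VALUES (1, 'a@example.com')")
conn.execute("INSERT INTO orders VALUES (10, 1, 'widget')")

# Deleting the account takes all dependent rows with it.
conn.execute("DELETE FROM users WHERE id = 1")
print(conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0])  # 0
```

In a real system you would also have to decide which rows get deleted and which get de-identified (invoices, for instance, usually must be retained), but the point stands: if deletion is designed in from the start, it is a single statement rather than a migration project.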
Doesn't it just affect companies which rely heavily on lack of privacy for monetisation? I think that's sort of the point - that your business should not rely on tracking individuals and selling that information without their consent to gov/private bodies. It's obviously a huge change, since so many big tech players rely on this to make profits. But the internet will be a much nicer place for everyone else if right to privacy is protected.
You need to research the right to be forgotten as it applies to news articles, it doesn't work in the way you describe.
And as far as this stuff being difficult to do, sure, but isn't it worth doing? Why shouldn't a customer have a say in which cloud provider hosts their data? Why shouldn't we be able to make sure no data is kept about us after we stop using a service? Like anything novel in software, it only seems hard because we haven't done it; in a ground-up design it's not that hard to add GDPR compliance, and a few years down the line this stuff will be business as usual.
You don't need a compliance officer, but you do need a security officer, and their job now also involves data lineage, not just data security. You already should have that person if you're building a SaaS solution.
> Take the right to be forgotten. First of all, it should be common sense that no one has the right to force legitimate news articles to disappear because they don't like the content, but that is what the EU has ruled should happen.
That is not correct. The right to privacy is not an absolute right. It has to be balanced against other rights, such as the right to free press. In a normal news article case, free press would prevail.
> There is a good chance your little startup that isn't cash flow positive will have to spend $1 million of its VC money on becoming GDPR compliant.
I advise a lot of small customers to implement manual procedures to retrieve or delete data in case a request for it might be done. And to set up a basic privacy and security policy which they should have had already. This doesn't cost much.
> Except that controller must agree to every sub-processor you use.
This can be a generic agreement where the processor notifies the controller.
> Want to switch from AWS to GCP? You can only do it if all your customers agree.
Not true, you do however need to be able to tell customers what companies receive their data. Which can be quite a challenge with sub-sub-subcontractors.
> Want to try out a new metrics or logging service? If it will have any PII you can't do it without customer (controller) permission.
Not true if the processing agreement contains a clause that instructs processor to perform metrics or logging. Customer consent is often not needed unless it has big impact on their privacy. Consent is only one of the legal grounds.
> You will basically need to hire full-time compliance officers to deal with this. The big tech companies already have compliance officers, but GDPR is so massively invasive to businesses that even small companies now need compliance officers.
If this were true I'd be a lot busier. It would be wise for companies to assign responsibility for privacy and security, but it doesn't always need to be a full-time job for someone with a legal background.
> You will basically need to hire full-time compliance officers to deal with this. The big tech companies already have compliance officers, but GDPR is so massively invasive to businesses that even small companies now need compliance officers.
Or they could, I don't know, just not collect that data in the first place.
> First of all, it should be common sense that no one has the right to force legitimate news articles to disappear because they don't like the content
I think a lot of EU people (I'm not one) would disagree with you here. The notions of privacy and of the goals of the criminal-justice system in several parts of Europe are radically different from the notions your "common sense" position is based on, which means that what seems "common sense" to them seems ludicrous to you, and vice-versa.
This has nothing to do with the size of your company/startup and it has nothing to do with regulatory compliance. It is pretty simple at its core: if your company/startup gets breached and PII data is leaked as a result, then you are liable for the penalty according to the general rules. I don't think anybody will argue this is a bad thing. If anything, it will help companies to be a little bit more careful with what sort of data they collect, because frankly, at the moment almost every company is perhaps guilty of collecting far too much personal data under the assumption that one day it may become useful. If you collect PII data then you are liable for damages if you happen to mishandle it.
So here is how to avoid the GDPR penalties.
1. Get compliant - it is pretty much ISO27001 and it will cost you money
2. Don't collect excessive PII data and if you do, store it securely - after all it is a very basic ask
3. Avoid collecting PII data at all cost - think of it as another form of PCI
> The critical question for both businesses is whether users will click “yes”, when asked to consent.
Yes, users will click yes on basically anything. Facebook could put up a message that says "In order to proceed, click yes to give us half the money in your checking account" and the majority of Facebook users will still click through. Look at EU cookie warnings. Did any of those warnings noticeably impact anybody's traffic after the first week?
No, they won't. When the EU imposed new consumer protection rules not so long ago, it resulted in having to put some scary-looking legalese directly on your sales funnel pages if you were supplying digital content, even if said legalese was of no practical value to anyone including your customer. That alone was enough to hurt conversions, even if you didn't require something like a token checkbox to be ticked before continuing. The GDPR compliance requirements are potentially on an entirely different scale.
Clicking yes might not even be necessary: I recently went to a lawyer-oriented event (IANAL) that discussed the GDPR, and it had a cheerful talk about "Alternatives to Consent".
The talk listed all the possible ways the law allows you to store/manipulate user data without requiring explicit consent... There are a shocking number and iirc they apply basically whenever you have a direct consumer relationship with some company.
This is not even the fault of users. As a user, I want to get to the webpage I'm opening. I'll do whatever it takes to quickly reach there. I think "what's the worst they could have written". After all I haven't paid them anything. Expecting free users of Facebook to analyze a prompt on screen and legally consent to it is utter stupidity. Most people believe that clicking a checkbox isn't even legally binding.
In other words I know that clicking on a Facebook dialog box saying "you agree to give us 50% of your income" is meaningless and so I will click on it and use the website.
There is a big difference between the cookie warning and GDPR. You could not opt out of cookie storage: you either used the service and accepted that they store cookies, or you closed the tab and left.
GDPR prevents companies from discontinuing service for users who wish not to be tracked.
I see this as yet another tax on (European) startups who have to invest even more resources into regulatory compliance.
This prohibition of freely using all available data will create great arbitrage opportunity for the shadow economy, and will have a net negative effect on innovation.
I think prohibition has very bad side effects, and that MORE transparency is the way forward in politics, economy, and also society. This includes allowing businesses to use all the data they can get their hands on. People can produce infinitely more data than any google can realistically process.
I cannot understand why people who are otherwise for transparency and against prohibition are celebrating this as a big win against FB/AMZ/GOOG, as those players can easily shell out another $10M here and there to be compliant with this regulatory monster.
> Nor can they deny access to their services to users who refuse to opt-in to tracking.[1]
Taken literally this means it's illegal to provide a service in exchange for tracking. Can someone elaborate on whether this is true and what else it applies to or what else other business models are made outright illegal?
The author believes that users have little incentive to allow Google to provide personalized Google Search results.
I don't think any technical-oriented people in this thread would agree that they have "little incentive" to allow Google Search personalization. When I turn off Google Search personalization, I get inferior search results that are less likely to be what I was searching for.
If you don't want your results personalized, there is an option in the search results to turn personalization off.
The problem I have with this law is that Google will need to default to non-personalized results and then prompt users if they want personalization. Google probably doesn't want to increase UI friction, so they will most likely just disable personalization and not prompt to enable. This will result in less-engaged users and inferior search results for the average EU citizen.
I'm going to read thoroughly through the terms and conditions and get a case going in European Courts when this comes into play, because you know for a FACT Google and Facebook will put in some vague term to let them collect data for "future" improvement of the service. Watch and see.
> “A purpose that is vague or general, such as for instance ‘Improving users’ experience’, ‘marketing purposes’, or ‘future research’ will – without further detail – usually not meet the criteria of being ‘specific’”
We [technology dept at a non-computer business in the UK] got the lecture about this at work. Turns out geeks are fans of this approach!
I've been using "GDPR hazard" as a useful way to kill bad ideas at work. "Sure you can do that! We just need you to confirm that your business unit accepts responsibility for this user-identifying data and ... oh, we can delete it? I'll do that now then."
We have lots of user-identified data, going back years. I can't see it as a bad thing for us to behave properly with regard to it, and to be required to do so.
What problems? So you can't collect all the data on your costumers and do with it as you see fit? And when it gets leaked just say: "Oops, sorry"? If for some startups taking private data seriously is a "problem", I want to see them burn in fire.
The more I think about it, the more I feel that if you're a startup, thinking about this from the get-go won't hurt you and won't really cost you.
What I mean by that is, it's easier to build your db and backups to comply with these laws before you have anything set in stone, than after you have any meaningful amount of personal data. Like, if you organise your backups and db to happily be able to handle removal of requested data before you accrue too much technical debt/inertia then you're going to be ahead of anyone who has to retrofit, which in many ways actually puts you at an advantage.
Also, I for one won't be mourning the loss of the business model that parasitically lives off exploiting user data.
So, how long until this one also gets neutered when European governments realize they can't even bring their own websites into compliance with the new law? ;)
> "Nor can they deny access to their services to users who refuse to opt-in to tracking.[1]"
> "[1] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1. See Recital 42’s reference to “without detriment”, Recital 43’s discussion of “freely given” consent, and Article 7(2) prohibition of conditionality. See also the UK Information Commissioner’s Office’s draft guidance on consent, 31 March 2017, p. 21, which clearly prohibits so-called “tracking walls”."
What, in this regulation, prevents the company from denying users (who opt out) access to a service they provide free of charge or a downgraded experience? And how would a court measure the level of service?
Would one way "around" this be, "Welcome to ABC Service, it costs X€/month to use, or if you allow us to use your data to sell to our advertisers we will waive this fee."?
How might one make sure their accounts get classified as falling under the scope of the GDPR? Would it be sufficient to set your location to an EU country?
I have been thinking that the ad bubble is a problem for a while now. Ads are basically to encourage consumption, and the US economy has been debt based for decades now.
Throwaway since I don't want to involve my employer.
I actually work for a platform that is squarely in the GDPR crosshairs (digital marketing). There are a lot of things where our lawyers' perspective is different from what most people say here (I didn't talk directly to lawyers, but I presume product managers did).
- You don't have to comply in 2018, you have to show that you started seriously working on a solution, even if you're not fully prepared.
- You don't have to have automated processes for everything (e.g. delete from backups), it's actually perfectly reasonable to say "we'll process your request" and do it manually (ref: startups spending inordinate amounts of effort for GDPR compliance).
- Opt-in is not as "game changer" as suggested here, my understanding is that you can do implicit consent (notify the user about what you do, give them a link to take action; crucially, that link might even be the link to your privacy policy which contains the link to the opt-out interface... if I got this right - and I think that I did - this may not amount to much more than a slightly modified "this site uses cookies" thingy).
- Delete requests may be handled by "de-identification" (don't delete the data, delete the association with you).
- Related to that, while I don't have a definitive answer, I strongly suspect that GDPR only applies to information that can be positively associated with you (e.g. authenticated activity). I'm not obliged to show you anonymous browser activity/information that I've probabilistically associated with you, for the simple reason that I might be wrong and I might disclose sensitive information (think about a girlfriend looking up "what does Amazon know about me" and finding out that "she is interested in an engagement ring" because you anonymously browsed from her computer, thus spoiling your surprise even though you were careful to delete your browser history / browse anonymously. Yes, incognito mode doesn't necessarily help you - we make efforts to identify incognito sessions server-side and de-link them from the probabilistic marketing profiles, because we don't want to negatively surprise the customers; but I suspect not all players are that careful).
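The "de-identification" option mentioned in the bullets above can be as simple as stripping the direct identifiers from a record while keeping its non-identifying analytics fields under an unlinkable key. A minimal sketch, with field names invented for the example:

```python
import uuid

# Fields treated as direct identifiers in this toy example.
IDENTIFIERS = ("user_id", "email", "name", "ip_address")

def de_identify(record: dict) -> dict:
    """Drop direct identifiers but keep non-identifying fields,
    replacing the user key with a fresh random token so the row
    can no longer be associated with the person."""
    kept = {k: v for k, v in record.items() if k not in IDENTIFIERS}
    kept["subject"] = uuid.uuid4().hex  # unlinkable replacement key
    return kept

row = {"user_id": "u-42", "email": "x@example.com",
       "country": "DE", "purchases": 3}
clean = de_identify(row)
print(sorted(clean))  # ['country', 'purchases', 'subject']
```

Whether the leftover fields are genuinely anonymous is the hard part in practice: quasi-identifiers (rare country/purchase combinations, timestamps) can still re-identify people, which is why regulators distinguish pseudonymisation from anonymisation.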
Overall... despite what many people think, I think big players are actually fairly careful/sensitive about your privacy (well, if we exclude Facebook here :D ). It's the startups that would concern me more... they have very little incentive to guard your data well, because there are so many OTHER reasons why they might fail, that "privacy disaster" is very low on their list of concerns.
Are all companies beholden to this, or only those with legal entities in Europe?
For instance, can a Chinese company with ZERO legal presence in the EU completely ignore these requirements? The internet has no real borders, after all.
[+] [-] shadowtree|8 years ago|reply
It is pretty simple, only 3 levels (strikes for the fellow Americans):
Strike 1 - Stern warning letter
Strike 2 - 2% of your TOTAL GLOBAL REVENUE
Strike 3 - 4% of your TOTAL GLOBAL REVENUE (or 20mil EUR, whichever is higher)
And now you know why GDPR is a board level topic. Keep in mind that the EU/US Safe Harbor agreement got axed due to a lawsuit of a single student from Vienna against Facebook. So all you need is a single pissed off German customer you ignored when asking for their data report card and you're fucked.
For startups - GDPR is like Y2K at the time, a GOLDMINE. So much opportunity to sell solutions, from real to snake oil. GDPR compliance is already and will continue to trigger a massive wave of investment.
Enjoy :)
[+] [-] throwaway_9123|8 years ago|reply
I have national sales responsibilities for one of the majors. Think IBM/Microsoft/Oracle/etc leading a sales team of 74 reps.
You'd be surprised at how LITTLE sales we've generated from GDPR. We've been providing free GDPR assessments for the past 1.5 years for over 200 accounts as lead gen opportunity and very little sales have resulted.
It all boils down to companies simply don't believe the fines will be enforced given just how expensive the fines are.
And since GDPR doesn't go into affect until May 2018, companies are just waiting and seeing what happens.
It's really hard to sell GDPR because it's essentially an insurance policy. Why spend $5m on software and another $5m in services ($10m combined) if your total fine is only $20m. Do you as a company have a 50% chance of getting fined? If not, then roll the dice and not buy a solution.
[+] [-] mnm1|8 years ago|reply
[+] [-] grabeh|8 years ago|reply
Art 83 covers the different triggers.
Having said that there are various schools of thought around levels of potential fines, including from different data protection authorities in the EU.
The fact that the ICO in the UK has yet to levy a maximum fine despite egregious violations is at least one factor suggesting fines will not increase dramatically.
Also, there are not only increased fines to consider but an increased focus on compensation to data subjects in the event of a violation of their rights (together with an evolving case law to support that, in the UK at least).
[+] [-] AndrewKemendo|8 years ago|reply
In reality a bunch of small shops are going to go bankrupt because they don't have a "GDPR implementation" position filled and they didn't do some report properly.
[+] [-] Silhouette|8 years ago|reply
Unless you're a start-up that handles personal data, in which case it's another bureaucratic overhead that also carries a risk of draconian penalties if you make a mistake, even if you have perfectly sensible reasons for working with that data and you're not doing anything at all surprising or dubious with it.
Of course, the EU has form for this, given its similar approach to both consumer protection and VAT rules in recent years. It does seem to have an unhelpful habit of imposing regulations at big business scale to deal with big business scale problems, but not considering that both of these may be wildly disproportionate for smaller businesses.
[+] [-] spyspy|8 years ago|reply
[+] [-] cm2187|8 years ago|reply
Almost every US Supreme Court decision comes from a single person challenging something. Are you suggesting that under a certain size, one shouldn't be allowed to sue in court?
[+] [-] pascalxus|8 years ago|reply
I predict, lawyers are going to make a lot of money on this.
[+] [-] mdip|8 years ago|reply
I take issue with the quoted statement because it ignores the downside for startups[0]. Companies like Google, Facebook, et al., have the money and time to hire teams of lawyers to find a way to work around these regulations in a manner that maximises their ability to continue tracking while minimising their risk of getting smacked by the hand of the law. Getting hauled off to court won't bankrupt them, and they have the legal teams to probably win regularly enough. Even barring that, they have the finances to adjust their business to be fully compliant (to whatever degree business adjustments require) without going bankrupt.
Joe's Advertising Supported Free Service does not. Joe's not going to start his own ad network and start mining personal data for it -- it's way too expensive to try to compete with Google/Facebook (and it was already way too expensive to do it, before). If he does, he's the one who's going to get hit with the second and third strike; probably from that "single pissed off German customer".
Investing in firms that touch this space will be met with far more skepticism. I wouldn't be surprised if any company that simply asks for a user ID and password faces a little scrutiny from investors, at least until the regulatory atmosphere is understood (I doubt it'll be that extreme for terribly long, but one bad court ruling/fine laid out where it wasn't expected could change that). The cost of establishing many, many kinds of companies will now increase because the risks are high enough that you can't go to market without having your legal bases covered on this one. That money has now been shifted to a business that -- potentially -- is selling snake oil (and startups are going to be more likely to do business with that snake-oil salesman, since they'll probably also be the least expensive).
Then there's the "unintended consequences". Here's a crazy hypothetical, but a lesser variation of it is plausible if this were a US law: Some individual exercises his free-speech rights and chucks something up on the Internet that has a bunch of horrible things on it, say, like 'a guide on how to slaughter and prepare kittens for healthy and inexpensive dinners'. Some kid reads it and kills/eats his neighbor's cat. The guy didn't do anything illegal, really, but his web host knocks him off the web and people are calling for blood. He happens to use an ad-network, but doesn't, himself, collect personal information. However, this ad-network does, and at one point was nailed under this law. He uses the ad-network, so some overzealous prosecutor figures out a way to bring it in front of a judge that he's responsible for what this third-party did and should also be prosecuted. At the height of outrage, a jury isn't hard to find to connect the dots[1].
[0] And hey, that's fine, you're an internet commenting individual just like me -- we don't have to present both sides of the story -- that's the replier's job.
[1] Yeah, I took that a little far, but I think back to when "The Columbine Massacre" happened and everyone believed it was FPS video games that warped those evil children's "precious little minds". It took all of two seconds to call for banning violent video games (constitution be damned), many idiotic and ultimately overturned laws were passed, and if there was some way to haul the developers who wrote the game off to jail (I think they were blaming DOOM at the time), it would have been possible within those first few weeks.
[+] [-] 77pt77|8 years ago|reply
20mil EUR even for zero revenue?!
[+] [-] taysic|8 years ago|reply
[+] [-] xd|8 years ago|reply
Why would they impose a 20M limit and not stick to 4% of revenue regardless of it...
edit: I'm not sure what the down voting is about.. it's still a limit, a limit that means a company turning over 0-500M will pay up to a 20M fine.. not so bad the closer you get to 500M but not so great if you're a small company, especially as the regulation is so open to the "law of unintended consequences" right now and only larger companies will have the funds / man power to navigate it.
[+] [-] AriaMinaei|8 years ago|reply
I've just started a business myself, and this regulation affects my company too. It makes development costlier; it'll take from the precious little time we have to spend on compliance paperwork rather than work on our core business. In the short run, it does hurt our chances of success.
Yet, none of the trouble is even comparable to what's to be gained here. And it bothers me (though doesn't surprise me) that some people don't see that.
It also bothers me that such vocal opposition barely comes up when the discussion is just about bigger companies such as Google and Facebook. How can we expect "un-evilness" from bigger companies when we're barely willing to do anything in that regard ourselves?
[+] [-] closeparen|8 years ago|reply
The spirit of the law is nonsensical. It makes all commercial activity illegal, to the extent that all businesses keep records of their sales, inventory, etc. which reflect the activities of their customers, employees, and suppliers.
[+] [-] fixermark|8 years ago|reply
Google's current ecosystem of data-sharing means that Assistant can make educated context guesses on what I mean when I talk to it based on my browser history and map navigation history. If the new privacy constraints damage that passive interconnection, that's not a net good for me.
[+] [-] antoncohen|8 years ago|reply
Take the right to be forgotten. First of all, it should be common sense that no one has the right to force legitimate news articles to disappear because they don't like the content, but that is what the EU has ruled should happen.
I get the desire to have a company forget about you, and remove all the personal information they have. It makes sense from a personal standpoint. But how do you do it technically?
If you follow GDPR strictly you would need to be able to purge the data from your backups. Now most backups are considered immutable, so you aren't going to do that, meaning you need a way to ensure that "forgotten" users never get restored.
But how do you even delete the live data? Does the tech company you work for have the ability to delete all traces of a user from their system, cleanly severing all relationships with other objects in your system? Do you have the ability to retrieve everything you know about a specific user and provide it to them? You will need to write the code to do this.
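Even the retrieval half of that is nontrivial. A minimal sketch of a subject-access export, assuming a single SQLite database and a (hypothetical) convention that personal rows carry a `user_id` column -- real systems span many datastores and won't be this tidy:

```python
import sqlite3

def export_user_data(conn: sqlite3.Connection, user_id: int) -> dict:
    """Collect every row belonging to one user, across all tables that
    have a user_id column. Table/column names here are illustrative."""
    report = {}
    tables = [r[0] for r in conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table'")]
    for table in tables:
        # PRAGMA table_info rows are (cid, name, type, notnull, dflt, pk)
        cols = [c[1] for c in conn.execute(f"PRAGMA table_info({table})")]
        if "user_id" in cols:
            rows = conn.execute(
                f"SELECT * FROM {table} WHERE user_id = ?", (user_id,)
            ).fetchall()
            if rows:
                report[table] = rows
    return report
```

This only illustrates the shape of the problem: the hard part in practice is knowing every place (logs, analytics, third parties) where user data actually lives.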
There is a good chance your little startup that isn't cash flow positive will have to spend $1 million of its VC money on becoming GDPR compliant.
Do you sell a SaaS service to businesses, and those businesses send you their customers' data? Then you are the processor and they are the controller. Cool, less for you to do, sort of. Except that the controller must agree to every sub-processor you use. Want to switch from AWS to GCP? You can only do it if all your customers agree. Want to try out a new metrics or logging service? If it will have any PII you can't do it without customer (controller) permission.
You will basically need to hire full-time compliance officers to deal with this. The big tech companies already have compliance officers, but GDPR is so massively invasive to businesses that even small companies now need compliance officers.
[+] [-] Slartie|8 years ago|reply
No, it is about deleting personal data attached to your user account, not "news articles". This thing intends to make the "delete my account" button actually, you know, "delete my account", instead of fake-deleting it by setting a "deleted" flag and telling me that everything is gone now while still keeping gigabytes of data associated with me in your database.
> [...] meaning you need a way to ensure that "forgotten" users never get restored.
If this is considered a hard problem, then I assume storing some list of deleted users in a separate place and immediately purging those users from the backup after a restore must be some kind of rocket science.
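For the record, the non-rocket-science version of that idea: keep a durable tombstone list of erased user ids, and re-apply it immediately after any backup restore so "forgotten" users never come back. A minimal sketch with hypothetical table names:

```python
import sqlite3

def purge_tombstoned_users(conn: sqlite3.Connection) -> int:
    """Delete any users that appear in the erased_users tombstone table.
    Run this right after restoring a backup. Returns rows deleted."""
    conn.execute("""CREATE TABLE IF NOT EXISTS erased_users
                    (user_id INTEGER PRIMARY KEY)""")
    cur = conn.execute("""DELETE FROM users
                          WHERE id IN (SELECT user_id FROM erased_users)""")
    conn.commit()
    return cur.rowcount
```

The tombstone table has to live outside the backup set being restored (or be merged from all copies), otherwise restoring an old backup also restores an old, incomplete tombstone list.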
> There is a good chance your little startup that isn't cash flow positive will have to spend $1 million of its VC money on becoming GDPR compliant.
I wouldn't call it "to become GDPR compliant", I would call it "to build a sound database structure". Because if you are unable to purge all data associated with one of your users' accounts from your system without destroying the integrity of the rest of your data, then you obviously have a half-baked system on your hands that lacks a core feature - the ability to actually delete accounts. And you surely should spend some of your money to refactor this crap into a long-term viable solution while you are still small and agile enough to do that. Because it's only going to be way more expensive later on...
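The "sound database structure" point is mostly a matter of declaring the relationships up front: if related rows use ON DELETE CASCADE, deleting the account row removes everything tied to it in one statement. A minimal sketch with illustrative table names (SQLite via Python's sqlite3):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite needs this per-connection

conn.executescript("""
    CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT);
    CREATE TABLE orders (
        id INTEGER PRIMARY KEY,
        user_id INTEGER NOT NULL
            REFERENCES users(id) ON DELETE CASCADE
    );
""")
conn.execute("INSERT INTO users VALUES (1, 'a@example.com')")
conn.execute("INSERT INTO orders VALUES (10, 1)")

# One delete on the parent row takes the dependent rows with it.
conn.execute("DELETE FROM users WHERE id = 1")
remaining = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(remaining)  # 0: the order went with the account
```

Retrofitting this onto a schema with years of implicit, undeclared relationships is the expensive part the comment is warning about.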
[+] [-] nvarsj|8 years ago|reply
Doesn't it just affect companies which rely heavily on lack of privacy for monetisation? I think that's sort of the point - that your business should not rely on tracking individuals and selling that information without their consent to gov/private bodies. It's obviously a huge change, since so many big tech players rely on this to make profits. But the internet will be a much nicer place for everyone else if right to privacy is protected.
[+] [-] Joeri|8 years ago|reply
And as far as this stuff being difficult to do, sure, but isn't it worth doing? Why shouldn't a customer have a say in which cloud provider hosts their data? Why shouldn't we be able to make sure no data is kept about us after we stop using a service? Like with anything novel in software, it only seems hard because we haven't done it yet, but in a ground-up design it's not that hard to add GDPR compliance, and a few years down the line this stuff will be business as usual.
You don't need a compliance officer, but you do need a security officer, and their job now also involves data lineage, not just data security. You already should have that person if you're building a SaaS solution.
[+] [-] A_No_Name_Mouse|8 years ago|reply
That is not correct. The right to privacy is not an absolute right. It has to be balanced against other rights, such as the right to free press. In a normal news article case, free press would prevail.
> There is a good chance your little startup that isn't cash flow positive will have to spend $1 million of its VC money on becoming GDPR compliant.
I advise a lot of small customers to implement manual procedures to retrieve or delete data in case such a request is made. And to set up a basic privacy and security policy, which they should have had already. This doesn't cost much.
> Except that controller must agree to every sub-processor you use.
This can be a generic agreement where the processor notifies the controller.
> Want to switch from AWS to GCP? You can only do it if all your customers agree.
Not true, you do however need to be able to tell customers what companies receive their data. Which can be quite a challenge with sub-sub-subcontractors.
> Want to try out a new metrics or logging service? If it will have any PII you can't do it without customer (controller) permission.
Not true if the processing agreement contains a clause that instructs processor to perform metrics or logging. Customer consent is often not needed unless it has big impact on their privacy. Consent is only one of the legal grounds.
> You will basically need to hire full-time compliance officers to deal with this. The big tech companies already have compliance officers, but GDPR is so massively invasive to businesses that even small companies now need compliance officers.
If this were true I'd be a lot busier. It would be wise if companies assigned the responsibility for privacy and security, but it doesn't always need to be a full-time job with a legal background.
[+] [-] bjl|8 years ago|reply
Or they could, I don't know, just not collect that data in the first place.
[+] [-] ubernostrum|8 years ago|reply
I think a lot of EU people (I'm not one) would disagree with you here. The notions of privacy and of the goals of the criminal-justice system in several parts of Europe are radically different from the notions your "common sense" position is based on, which means that what seems "common sense" to them seems ludicrous to you, and vice-versa.
[+] [-] _pdp_|8 years ago|reply
So here is how to avoid the GDPR penalties.
1. Get compliant - it is pretty much ISO 27001 and it will cost you money
2. Don't collect excessive PII data, and if you do, store it securely - after all it is a very basic ask
3. Avoid collecting PII data at all costs - think of it as another form of PCI
Frankly, there is no need to panic.
[+] [-] CobrastanJorji|8 years ago|reply
Yes, users will click yes on basically anything. Facebook could put up a message that says "In order to proceed, click yes to give us half the money in your checking account" and the majority of Facebook users will still click through. Look at EU cookie warnings. Did any of those warnings noticeably impact anybody's traffic after the first week?
[+] [-] mattmanser|8 years ago|reply
[+] [-] Silhouette|8 years ago|reply
No, they won't. When the EU imposed new consumer protection rules not so long ago, it resulted in having to put some scary-looking legalese directly on your sales funnel pages if you were supplying digital content, even if said legalese was of no practical value to anyone including your customer. That alone was enough to hurt conversions, even if you didn't require something like a token checkbox to be ticked before continuing. The GDPR compliance requirements are potentially on an entirely different scale.
[+] [-] laksjd|8 years ago|reply
The talk listed all the possible ways the law allows you to store/manipulate user data without requiring explicit consent... There are a shocking number and iirc they apply basically whenever you have a direct consumer relationship with some company.
[+] [-] dingo_bat|8 years ago|reply
In other words I know that clicking on a Facebook dialog box saying "you agree to give us 50% of your income" is meaningless and so I will click on it and use the website.
[+] [-] hohenheim|8 years ago|reply
GDPR prevents companies from discontinuing service for users who wish not to be tracked.
[+] [-] bflesch|8 years ago|reply
This prohibition of freely using all available data will create a great arbitrage opportunity for the shadow economy, and will have a net negative effect on innovation.
I think prohibition has very bad side effects, and that MORE transparency is the way forward in politics, economy, and also society. This includes allowing businesses to use all the data they can get their hands on. People can produce infinitely more data than any google can realistically process.
I cannot understand why people who are otherwise for transparency and against prohibition are celebrating this as a big win against FB/AMZ/GOOG, as those players can easily shell out another $10M here and there to be compliant with this regulatory monster.
[+] [-] danarmak|8 years ago|reply
Taken literally this means it's illegal to provide a service in exchange for tracking. Can someone elaborate on whether this is true, what else it applies to, and which other business models are made outright illegal?
[+] [-] Sephr|8 years ago|reply
I don't think any technical-oriented people in this thread would agree that they have "little incentive" to allow Google Search personalization. When I turn off Google Search personalization, I get inferior search results that are less likely to be what I was searching for.
If you don't want your results personalized, there is an option in the search results to turn personalization off.
The problem I have with this law is that Google will need to default to non-personalized results and then prompt users if they want personalization. Google probably doesn't want to increase UI friction, so they will most likely just disable personalization and not prompt to enable. This will result in less-engaged users and inferior search results for the average EU citizen.
[+] [-] AJRF|8 years ago|reply
[+] [-] phatbyte|8 years ago|reply
> “A purpose that is vague or general, such as for instance ‘Improving users’ experience’, ‘marketing purposes’, or ‘future research’ will – without further detail – usually not meet the criteria of being ‘specific’”
[+] [-] kbart|8 years ago|reply
[+] [-] raimue|8 years ago|reply
[+] [-] hedora|8 years ago|reply
[+] [-] davidgerard|8 years ago|reply
I've been using "GDPR hazard" as a useful way to kill bad ideas at work. "Sure you can do that! We just need you to confirm that your business unit accepts responsibility for this user-identifying data and ... oh, we can delete it? I'll do that now then."
We have lots of user-identified data, going back years. I can't see it as a bad thing for us to behave properly with regard to it, and to be required to do so.
[+] [-] 0x27081990|8 years ago|reply
[+] [-] kbart|8 years ago|reply
[+] [-] FridgeSeal|8 years ago|reply
What I mean by that is, it's easier to build your db and backups to comply with these laws before you have anything set in stone, than after you have any meaningful amount of personal data. Like, if you organise your backups and db to happily be able to handle removal of requested data before you accrue too much technical debt/inertia then you're going to be ahead of anyone who has to retrofit, which in many ways actually puts you at an advantage.
Also, I for one won't be mourning the loss of the business model that parasitically lives off exploiting user data.
[+] [-] goodplay|8 years ago|reply
[+] [-] fixermark|8 years ago|reply
[+] [-] mzzter|8 years ago|reply
> "Nor can they deny access to their services to users who refuse to opt-in to tracking.[1]"
> "[1] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1. See Recital 42’s reference to “without detriment”, Recital 43’s discussion of “freely given” consent, and Article 7(2) prohibition of conditionality. See also the UK Information Commissioner’s Office’s draft guidance on consent, 31 March 2017, p. 21, which clearly prohibits so-called “tracking walls”."
What, in this regulation, prevents the company from denying users who opt out access to a service it provides free of charge, or from giving them a downgraded experience? And how would a court measure the level of service?
[+] [-] richrichardsson|8 years ago|reply
[+] [-] Chaebixi|8 years ago|reply
[+] [-] yuhong|8 years ago|reply
[+] [-] throwaway91234|8 years ago|reply
I actually work for a platform that is squarely in the GDPR crosshairs (digital marketing). There are a lot of things where our lawyers' perspective is different from what most people say here (I didn't talk directly to lawyers, but I presume product managers did).
- You don't have to comply in 2018, you have to show that you started seriously working on a solution, even if you're not fully prepared.
- You don't have to have automated processes for everything (e.g. deletion from backups); it's actually perfectly reasonable to say "we'll process your request" and do it manually (ref: startups spending inordinate amounts of effort on GDPR compliance).
- Opt-in is not the "game changer" suggested here. My understanding is that you can do implicit consent (notify the user about what you do and give them a link to take action; crucially, that link might even be the link to your privacy policy, which contains the link to the opt-out interface). If I got this right - and I think I did - this may not amount to much more than a slightly modified "this site uses cookies" thingy.
- Delete requests may be handled by "de-identification" (don't delete the data, delete the association with you).
- Related to that, while I don't have a definitive answer, I strongly suspect that GDPR only applies to information that can be positively associated with you (e.g. authenticated activity). I'm not obliged to show you anonymous browser activity/information that I've probabilistically associated with you, for the simple reason that I might be wrong and might disclose sensitive information. Think about a girlfriend looking up "what does Amazon know about me" and finding out that "she is interested in an engagement ring" because you anonymously browsed from her computer, thus spoiling your surprise even though you were careful to delete your browser history / browse anonymously. Yes, incognito mode doesn't necessarily help you - we make efforts to identify incognito sessions server-side and de-link them from the probabilistic marketing profiles, because we don't want to negatively surprise customers; but I suspect not all players are that careful.
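For what "de-identification" looks like in practice: sever the link to the person by replacing their identifier with an unlinkable random token and dropping the direct identifiers. A minimal sketch with hypothetical field names (whether this satisfies a given erasure request is for lawyers, not engineers):

```python
import secrets

def de_identify(record: dict) -> dict:
    """Keep the behavioral data, delete the association with the person.
    Field names ("user_id", "email", etc.) are illustrative."""
    scrubbed = dict(record)
    scrubbed["user_id"] = secrets.token_hex(16)  # random, maps back to nothing
    for field in ("email", "name", "ip_address"):
        scrubbed.pop(field, None)                # drop direct identifiers
    return scrubbed

event = {"user_id": 42, "email": "a@example.com", "page": "/pricing"}
print(de_identify(event))  # keeps "page", replaces user_id, drops email
```

The catch is re-identification risk: if the remaining fields are distinctive enough (rare pages, precise timestamps, location), the record may still be personal data in the regulation's sense.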
Overall... despite what many people think, I think big players are actually fairly careful/sensitive about your privacy (well, if we exclude Facebook here :D ). It's the startups that would concern me more... they have very little incentive to guard your data well, because there are so many OTHER reasons why they might fail, that "privacy disaster" is very low on their list of concerns.
[+] [-] mankash666|8 years ago|reply
For instance, can a Chinese company with ZERO legal presence in the EU completely ignore these requirements? The internet has no real borders, after all.