It's important to note they aren't creating laws against infinite scrolling, but are ruling against addictive design and pointing to infinite scrolling as an example of it. The wording here is fascinating, mainly because they're effectively acting as arbiters of "vibes". They point to certain features they'd like them to change, but there is no specific ruling around what you can/can't do.
My initial reaction was that this was a terrible precedent, but after thinking on it more I asked myself, "well what specific laws would I write to combat addictive design?". Everything I thought of had some workaround that could be found, and would equally have terrible consequences in situations where the pattern is actually quite valuable. E.g. if you disallow infinite scrolling, what page sizes are allowed? Can I just have a page of 10,000 elements that lazy load?
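To make that workaround concrete, here's a minimal sketch. The rule and all numbers are hypothetical, nothing here is from the actual EU findings:

```python
# A rule that says "feeds must be paginated, no infinite scroll" but is
# silent on page size is trivially satisfied on paper. Plain offset
# pagination, technically compliant:

def paginate(items, page, page_size):
    start = page * page_size
    return items[start:start + page_size]

feed = list(range(100_000))  # stand-in for a ranked feed

# One "page" of 10,000 items keeps a user busy for hours, and the client
# can lazy-render it so it looks and feels exactly like infinite scroll.
page = paginate(feed, page=0, page_size=10_000)
print(len(page))  # 10000
```

Any naive per-feature ban runs into this: the banned pattern is just one point in a space of equivalent designs.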
Regardless of your take on whether this is EU overreach, I'm glad they're not implementing strict laws around what you can/can't do; there are valuable situations for these UI patterns, even if in combination they can create addictive experiences. Still, I do think that overregulation here will lead to services being fractured. I was writing about this earlier this morning (https://news.ycombinator.com/item?id=47005367), but the regulated friction of major platforms (e.g. Discord with ID laws) is on a collision course with the ease of vibe coding up your own. When that happens, these commissions are going to need to think long and hard about whether having a few large companies to watch over is better than millions of small micro-niche ones.
>"well what specific laws would I write to combat addictive design?"
Hear me out: banning advertising on the Internet. It's the only way. It's the primordial domino tile. You knock that one over, every other tile follows suit. It's the mother of chain reactions. There would be no social media, no Internet as we know it. Imagine having TikTok, YouTube or X trying to survive on subscriptions alone in their current iterations. Impossible. They'd need to change their top priority from "maximizing engagement by fostering addictive behavior" to "offering a product with enough quality for someone to pay a fee in order to be able to use it".
Isn't this the standard EU way? First, they publish a statement, declaring what they want to see. 'Deal with addictive design', in this case. We've had 'Deal with the zillion different connectors on cell phones' in the past. It is now up to the industry to do this, in whatever way they see fit. If this happens, no law will be written. However, if the industry doesn't deal with it adequately, Laws will Follow, and the industry will not like them.
European companies know this pattern, and tend to get the hint. US companies tend to try and maximize what they can get while claiming there is no law against it, then go very pikachu-faced when the consequences hit them.
>The wording here is fascinating, mainly because they're effectively acting as arbiters of "vibes"
This is not such an unusual thing in law, as much as us stem-brained people want legal systems to work like code. The most famous example is determining art vs pornography - "I know it when I see it" (https://en.wikipedia.org/wiki/I_know_it_when_I_see_it)
> The wording here is fascinating, mainly because they're effectively acting as arbiters of "vibes". They point to certain features they'd like them to change, but there is no specific ruling around what you can/can't do.
The issue is: if you write a precise description of what you don't want, a lawyer will go through it word by word and the company will find a way to build something that violates the spirit but not the exact wording. By being more generic in the wording, they can reach such cases and future developments with very little need for later corrections, and courts can interpret the intention and the current state of the art.
There are areas where law has to be precise (calculation of tax, criteria for criminal offenses, permissions for authorities, ...), but in many cases good laws are just as precise as needed and as flexible as possible.
I thought about it for only a few seconds, but here is one way to do it. Have users self-report an "addiction factor", then fine the company based on the aggregate score using a progressive scale.
There is obviously a lot of detail to work out here: which specific questions do you ask users, who administers the survey, what function do you use to scale the fines, etc. But this would force the companies to pay for the addiction externality without prescribing any specific feature changes they'd need to make.
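As a sketch of what that fine schedule could look like: every number below (the 0-10 scale, the brackets, the rates) is invented for illustration, not anything the EU has proposed:

```python
def progressive_fine(avg_score, annual_revenue):
    """Fine as a revenue percentage that ramps up in brackets, like
    marginal income tax. avg_score is the mean self-reported
    'addiction factor' on a hypothetical 0-10 scale."""
    brackets = [        # (upper score bound, % of revenue per point)
        (3.0, 0.0),     # below 3: no fine
        (6.0, 0.5),     # 3 to 6: 0.5% of revenue per point
        (10.0, 1.5),    # above 6: 1.5% of revenue per point
    ]
    pct, prev = 0.0, 0.0
    for upper, rate in brackets:
        span = max(0.0, min(avg_score, upper) - prev)
        pct += span * rate
        prev = upper
    return annual_revenue * pct / 100

print(progressive_fine(2.0, 1e9))  # 0.0: below the free threshold
print(progressive_fine(7.0, 1e9))  # 30000000.0: 3% of revenue
```

The progressive shape matters: a flat fine becomes a cost of doing business, while marginal rates make each additional point of addictiveness the most expensive one.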
The thing is, I doubt anyone at TikTok ever says "this design choice is good because it's addictive". Almost certainly, their leadership gives them metrics to target, like watch time, and they just hypothesise and experiment on changes with those metrics in mind. Almost certainly the design of TikTok is almost entirely emergent. Just like the scientific method is "revealing" truth, I think TikTok is just "revealing" the design that maximises its target metrics.
So what we have is a machine designed to optimise for something adjacent to addictiveness, and then some rules saying "you can't design for addictiveness"...
What happens when an underspecified vibe rule clashes with a billion dollar optimisation machine? Surely the machine wins every time? The machine is already defeating every ruleset that it's ever come up against.
Feels like the only way regulation could achieve anything is if it said "you can't build a billion dollar optimisation machine at all".
France is considering a ban on certain social media for minors, and parental consent for all social media for minors under 15, pretty much like Australia. They had to work around EU laws that prevented them from forcing service providers to do things; the trick they used was to make it illegal for those services to let minors register on the platform, because EU law acknowledges that local laws on forbidden content apply.
If this law passes and they "blacklist" some of these design-for-addiction (sorry, "engagement") platforms, I believe it should send a strong signal for adults as well. Most adults are pretty much aware that these platforms are bad for everyone; according to some polls, the public opinion is unambiguously in favor of these laws.
> "well what specific laws would I write to combat addictive design?"
You can't. You don't need to specify how to comply with the law, just that a general goal must be met. That's good lawmaking, since it's flexible enough to catch all future creative ways to break it. I remember a comment from someone who worked at MSFT as a compliance officer; the dude went around saying that it's not the letter of the law that must be followed, but the spirit thereof. They rolled over him and released the product nonetheless. Almost immediately came the EU investigation, and that crap had to be reversed and put in accordance with the stated goal of the law.
> The wording here is fascinating, mainly because they're effectively acting as arbiters of "vibes".
That's not really accurate. The EU actually legislated in a way which is very typical of how countries regulate things known to carry hard-to-characterize and varied risks.
Companies have to carry out a risk assessment and take appropriate preventive actions when they find something. The EU audits the assessment. That's how finance has been regulated for ages.
If a company chooses a design and it can be proved through a subpoena of their communications that the design was intended and chosen for its addictive traits, even if there has been no evidence collected for the addictiveness, then the company (or person) can be deemed to have created a design in bad faith to society and penalized for it.
(Well that's my attempt. I tried to apply "innocent until proven guilty" here.)
> My initial reaction was that this was a terrible precedent, but after thinking on it more I asked myself, "well what specific laws would I write to combat addictive design?". Everything I thought of would have some way or workaround that could be found...
This doesn't solve the problem though - the enforcers still have to come up with a standard that they will enforce. A line has to be drawn, letting people move the line around based on how they feel today doesn't help. Making the standard uncertain just creates opportunities for corruption and unfairness. I haven't read the actual EU stance on the matter but what you are describing is a reliable way to end up in a soup of bad policy. There needs to be specific rulings on what people can and can't do.
If you can't identify the problem, then you aren't in a position to solve it. Applies to most things. Regulation by vibe-checks is a great way to kill off growth and change - which the EU might think is clever, but the experience over the last few centuries has been that growth and change generally make things better.
And what they actually seem to be doing here is demanding that sites spy on their users and understand their browsing habits which does seem like a terrible approach. I don't see how their demands in that statement align with the idea of the EU promoting digital privacy.
What will probably happen is that someone will develop an industry standard for "non-addictive design" and go around certifying products or product development practices. Like for example, they might disallow optimizing time spent, or they might require more transparency or customizability for your recommendation algorithm.
Breaking infinite scrolling on these apps is one good step, but for me it's something else that would be more important.
I'm recovering from a surgery and can't do much besides existing. So I'd like to scroll to keep me occupied and numb the pain in my face.
But Instagram tries to shove content down my throat that I don't want to see. It's always only a matter of time until I see THOT/incel content. No matter how often I click "not interested", they try again and again. If it's not pitting the genders against each other, it's politics. It's brain rot. I don't wanna see that. I have interests and they know what they are. But no, they show me this garbage. The algorithms are the second thing we need to regulate imho
I remember the GDPRpocalypse which had a lot of Americans up in arms because of the wildly different approach to lawmaking that the EU has. Everyone on the US side was screaming for a checklist they could implement, and assumed they would get maximum penalties if they didn't cross every t and dot every i. But it just doesn't work like that, EU laws are generally not very procedural, they are a lot more about intent.
These findings are very much in line with that, they bring up a feature, a checkbox, a specific thing TikTok did to pay lip-service to protect minors, and then they're simply saying that it doesn't appear to work. So it doesn't matter that TikTok checked the box and crossed the t.
Assuming it was "just" about banning infinite scrolling. Not saying it is a good idea, but right now I cannot think of a legitimate use case where you would need it, unless your goal is engagement.
In the US we often use a "reasonable person" standard to get around trying to write super precise descriptions of things. "don't do X where a reasonable person would think Y."
>My initial reaction was that this was a terrible precedent, but after thinking on it more I asked myself, "well what specific laws would I write to combat addictive design?".
I'd make the algorithms transparent, then attack clearly unethical methods on a case-by-case basis. The big thing about Facebook in the 2010s was how we weren't aware of how deep its tracking was. When that was revealed and delved into, it led to the GDPR.
I feel that's the only precise method of keeping things ethical.
3 hrs a day on your phone works out to about 15 years of your waking life (accounting for a 16-hour waking day, over an 80-year span). I know people that do a solid 6... That's 30 years of their life scrolling, getting their brains completely fried by social media, and soon the infinite jest machine that is generative AI.
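The arithmetic holds if you assume an 80-year span (my assumption; the comment only states the 16-hour waking day):

```python
HOURS_PER_WAKING_DAY = 16
LIFESPAN_YEARS = 80  # assumed

def waking_years_spent(hours_per_day):
    # Total phone hours over a lifetime, expressed in 16-hour waking
    # days, then converted back into years.
    total_hours = hours_per_day * 365 * LIFESPAN_YEARS
    return total_hours / HOURS_PER_WAKING_DAY / 365

print(waking_years_spent(3))  # 15.0
print(waking_years_spent(6))  # 30.0
```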
Sorry, but we don't let people fry their brains with drugs; or at least, we try to introduce some societal friction between users and the act of obtaining said drug.
Many laws work like that. They don't have very precise definitions of things, but instead depend on what an average, reasonable person would think.
An example of this is contract law. There is no clear definition of what a legal contract must look like. Instead, a contract's validity can depend on whether an average, prudent person would have entered into it in similar circumstances.
This is a classic playbook move by anyone who is anti-regulation: present it as something that appears ludicrous, e.g. “they are banning infinite scroll!”, and rely on the fact that very few people will actually dig any deeper, since you’ve already satisfied their need for a bit of rage.
No, this is far worse. This is just a license for bureaucrats to selectively choose winners and losers in social media. Once regulatory capture happens, it merely turns into a special privilege for pre-established businesses, or a vehicle for one business to destroy another without outcompeting it.
> I asked myself, "well what specific laws would I write to combat addictive design?".
Only allowing algorithmic feeds/recommendations on dedicated subpages that the user has to navigate to, and that are not allowed to embed the content itself for viewing, would be an excellent start IMO.
In the USA at least, we need a nation specific intranet where everyone on it is verified citizens and businesses where the government cant buy your data but instead is tasked with protecting it, first and foremost, from itself.
No more for profit nets. Time for civil digital infrastructure.
> The wording here is fascinating, mainly because they're effectively acting as arbiters of "vibes". They point to certain features they'd like them to change, but there is no specific ruling around what you can/can't do.
> "well what specific laws would I write to combat addictive design?"
Expand the GDPR "Right to data portability" to publicly published content for third parties, i.e. open up the protocols so you can have third party clients that themselves can decide how they want to present the data. And add a realtime requirement, since at the moment companies still circumvent the original rule with a "only once every 30 days" limit.
Also add an <advertisement> HTML tag and HTTP header, and force companies to declare all their ads in a proper machine-readable way.
The core problem with addictive design isn't the addictive design itself, but that it's often the only way to even access the data. And when it comes to communication services that benefit from network effects, that should simply not be allowed.
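To show why the machine-readable part has teeth: if every feed item carried a declared ad flag (a stand-in here for the proposed <advertisement> tag or header, which does not exist today), any third-party client could strip or restyle ads in a couple of lines:

```python
# Hypothetical feed payload with a mandated, machine-readable "ad" flag.
feed = [
    {"id": 1, "text": "photo from a friend", "ad": False},
    {"id": 2, "text": "sponsored gadget",    "ad": True},
    {"id": 3, "text": "local news item",     "ad": False},
]

def without_ads(items):
    # A client honoring the declaration; no heuristics or ad-blocker
    # arms race needed, because the declaration is legally required.
    return [item for item in items if not item["ad"]]

print([item["id"] for item in without_ads(feed)])  # [1, 3]
```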
> It's important to note they aren't creating laws against infinite scrolling, but are ruling against addictive design and pointing to infinite scrolling as an example of it.
If the EU passes a law that seems general but starts giving out specific examples ahead of time, they’re outlawing those specific examples. That’s how they work, even if you read the law closely and comply with the letter of it. And they’ll take a percentage of your global revenue, while people shout “malicious compliance” in the virtual streets, if they don’t get their way.
I hope this goes through. Trillion dollar companies are waging a war on our attention, using everything at their disposal to make these apps addictive. It isn't a fair fight and the existence of infinite feeds is bad both for people and democracy. Regulating consumer products that cause harm to millions is nothing new.
As a person working in social media I support this as well. I'm a hypocrite, I admit, but the pay is too good to find an alternative.
Terms like "DAU" or "engagement" are common in our field, and the primary objective is how to make users spend more time on our platform. We don't take safety or mental health seriously internally, only externally for PR reasons.
CEOs won't change that because the more time user spends on the platform, the more ad revenue it brings.
I installed a Firefox plugin that makes YouTube shorts display as normal videos. I was genuinely shocked how much of a difference it made to my habits.
> Trillion dollar companies are waging a war on our attention, using everything at their disposal to make these apps addictive.
Or you could just shut the phone off and/or not install the app. It's a simple solution, really, and one that is at your disposal today at no cost.
It's interesting how there may be an implicit assumption that imposing more rules on tech will lead to positive outcomes. From my perspective, technology is like reality itself: very difficult to control, with countless ways to achieve the desired result while circumventing the rules. And what's the actual result? Just look at the market capitalization of European companies compared to US companies...
Or maybe it just feels good to add new rules and engage in virtue-signaling contests. Or maybe it's just a way to make everything illegal—'find me the person, and I'll find you the crime' type of control. Maybe a combination of all those. Who knows?
From my experience, the farther you get from the influence of bureaucrats, the happier life becomes...
The regulatory frameworks in the EU are intentionally not designed like the US, to maximize company profits over e.g. human rights and health.
It is thoroughly documented that social media and the modern web are designed to be addictive, by psychologists who specialize in this. We regulate access to other addictive things, because addictive things break humans' normal control systems.
> "the farther you get from the influence of bureaucrats, the happier life becomes"
only when things are "normal" and if you're a default power-holder in a community. For everyone else, really no.
> Just look at the market capitalization of European companies compared to US companies...
A huge portion of that market cap exists only because the companies in question are allowed to act unethically. Aside from that, all this wealth is concentrated in the hands of a small minority.
Ultimately the economy exists to serve us, not the other way around. What good does all that market cap do for the average American?
Indeed the cat and mouse game is tedious... There's a case to be made that you should just act on the root cause of all these issues with a neutral policy tool. The best tested of all regulatory tools is taxation. Reduce the profit motive slightly and many of these aberrations nobody likes will go away.
Aren't you happy that when you buy food, it doesn't contain cocaine? Regulations are totally necessary and addictive online social media is a well documented plague in our youths especially.
This very US-lobbyist narrative, that Europe regulates while missing out on the economy, is used and abused anytime something looks contrary to US interests in MAGA land.
> And what's the actual result? Just look at the market capitalization of European companies compared to US companies..
Europe is actually doing quite well at the moment. The European stock markets have over-performed quite decently vs. the US ever since Trump became president, despite the various curveballs thrown at Europe in recent years. Market capitalisation in the US is held up primarily by the Magnificent 7 who are great outliers in the American stock market.
Personally I don't like infinite scrolling - but the EU also needs to stop wasting time and energy on trying to micro-manage everything. We already had that disaster where pop-ups fly out "do you want to accept those cookies". That is just a usability nightmare. People are forced into extensions, just to stop wasting their time here.
Also, while I dislike infinite scrolling, why should the EU regulate the design of a website? I don't like this idea as a principle. This clearly comes from overpaid bureaucrats. I am not at all saying the EU should not become stronger, in the face of a very hostile and abusive USA - but the focus by these bureaucrats is wrong. Those micro-regulations will not get rid of US dominance in the software sector.
> We already had that disaster where pop-ups fly out "do you want to accept those cookies". That is just a usability nightmare. People are forced into extensions, just to stop wasting their time here.
You’re perpetuating a gross misunderstanding of the cookie law. What it states is different from how the advertisers implement malicious compliance to bias people, like yourself.
Websites that implement basic functional cookies do not need to display any popups; they're permitted to set those without consent. Any cookies that are essential to the functioning of the website, within reason, are permitted. In fact, at no point should a website serve you a cookie popup unless you seek it out, because analytics and advertising cookies are supposed to be opt-in.
So many websites do one of two things: serve you a popup that has everything enabled, which is a clear violation; or a popup that has only functional cookies selected, but where the biggest, highlighted button allows all of them.
The law is fine. Malicious compliance is to blame. The EU has been slow to rectify it.
The Commission isn't (at least from what's published) saying "pagination must look like X" or "buttons must be Y pixels tall." They're saying platforms have to mitigate systemic risks tied to their design choices. That's more like regulating outcomes and incentives than dictating UI details.
> why should the EU regulate the design of a website?
There are laws regulating many things that could be considered "design", for example misleading packaging, mandatory information on some categories of ads, cigarette packaging, container sizes, accessibility requirements, etc.
I would say regulating against addictive design (infinite scrolling is not banned per se, it just makes for a catchy headline) is well within what laws are meant for.
"People are forced into extensions, just to stop wasting their time here."
Except, what actually has happened is that the annoying pop-ups became ubiquitous, and then relatively standardized, so that now an extension like Consent-o-Matic (because the browser companies don't want to upset their advertisers) can automate away your actual choices.
If you want to allow websites to track you, tell the extension to make those choices. If you don't, then tell the extension that. It does a great job almost instantly clearing the popups, and you have more control over your digital identity.
I feel we can't make laws against the subject of moral panic every generation. People have felt the same way about the activities of youths since forever. But ultimately it often turns out fine. Change can bring new problems, but it also brings positives that are hard to understand and even formulate, that's culture. Trying to be the arbiter of that is foolish.
Is TikTok addictive because of its design, or is it addictive because it brings thousands of people and experiences and emotions right to you? Probably both, but it's hard to separate one from the other. Apps are not opium; it's not as clear cut.
Instead of micromanaging technology and culture they should make sure that society is kind, that there is slack in the system, that people don't have reason to want to flee their real lives, that those hurt by new technology get support.
Of course truly malicious dark patterns and fraud should be punished. But that feels like a different category.
TikTok will never have any competitors after this law comes into force. They will have the resources to implement the required changes, and the customer base will remain with them. Anyone starting a new service will have a tough time building something that jumps through all the hoops required by the EU, on top of the usual problems with network effects.
1. GDPR consent dialogs are not cookie popups, most things you see are GDPR consent dialogs
2. GDPR consent dialogs are only required if you share data, i.e. spy on the user
3. GDPR had from the get-go a bunch of exceptions, e.g. you don't need permission to store a same-site cookie indicating that you opted out of tracking _iff_ you don't use it for tracking. The same goes for a lot of other cases where the data is needed for operation, as long as the data is only used for that purpose and not given away (e.g. DDoS protection, bot detection, etc.).
4. You still had to inform the user, but this doesn't need any user interaction or acceptance, nor does it need to be a popup blocking the view. A small notice in the corner of the screen with a link to the data policy is good enough, but only if everything you do falls under 3. or involves non-personal information. Furthermore, I think they recently updated it to not even require that; just having a privacy policy in a well-known place is good enough, but I'd have to double-check. (And to be clear, this is for data you don't need permission to collect; like any data you collect, it's strictly use-case bound and you still have to list how it's used, how long it's stored, etc., even if you don't need permission.) Also, to be clear, if you accept the base premise of GDPR, it's pretty intuitive to judge whether something is an exception or not.
5. In some countries, there are highly misguided "cookie popup" laws predating GDPR (they are actually about cookies, not data collection in general). These are national laws, and as such the EU would prefer to have them removed. Work on that is in progress but is taking way too long; I'm also not fully sure about the current state of it. So in that context, yes, they should and want to kill "cookie popups". That just doesn't mean what most people think it does (as it has nothing to do with GDPR).
Banning personalized recommendation algorithms altogether would do so much good. Feeds should be the same for everyone who has the same filters and subscriptions configured.
Fascinating that they landed on infinite scrolling as the problem to spend time and energy on, instead of all the other things happening online that have an impact on society.
Genuinely curious about the actual data on this.
Does anyone have a link to a reputable, sizable study?
This comes from the same EU that's wholeheartedly embracing gambling across its member states; gambling, mind you, that children can just as easily jump into with their phones (and some will), and that is just as devastating for grown-ups.
They're not alone in this by any means, America has also opened their doors for all forms of gambling like Kalshi which now even sponsors news networks of all things.
The EU has this disconnect with the things they push, which makes sense considering their size and the speed at which it moves. One example that comes to mind is how they're both pushing for more privacy online while also pushing for things such as chat control which is antithetical to privacy.
Does social media need regulating? Yeah. Is infinite scrolling where they should be focusing? Probably not; there are more important aspects that should be tackled and are seemingly ignored.
Every member state has its own laws for it, and AFAIK all of them now regulate (or ban) online gambling more or less.
There were many startups here in Sweden in the early '00s, and I believe they had taken advantage of a legal loophole which has since been plugged. Regulation has tightened. Players have to be 18 y/o, use digital ID and not be registered as a gambling addict. But I still find the industry to be depraved, to be honest.
In Spain I can’t even have a meal at a restaurant, get groceries or go to IKEA without someone trying to sell me lottery tickets. They really need to regulate that.
I’m mixed on this. I do at times waste a lot of time doomscrolling, and would like regulation to prevent me from doing so. But also sometimes you just want to doomscroll to escape your day-to-day life. Do we want this decision to be made by the govt?
I guess we don’t let people have hard drugs even if sometimes they just need to escape their painful life, and maybe this could fall under that logic. But we do let people drink, which serves the same purpose. And if I had to choose, I think doomscrolling is more at the level of drinking, and less at the level of heroin. So I would actually be fine with an age limit for doomscrolling, after which you take a hands-off approach.
This stuff is important and can only happen at this level through legislation.
If you don't do it this way to apply for everyone, then any good actor products will be crushed by profitmaxxing competitors. Or any good actor executives and workers will be pushed out by profitmaxxing shareholders.
Legislators need to be careful to keep requirements tight and manageable, but it's better to limit negative externalities than outright ban something. Banning infinite scroll or any particular pattern is nonsense, but restricting addictive design (e.g. TikTok) and algorithm weaponization (e.g. TikTok) is very sensible.
> then any good actor products will be crushed by profitmaxxing competitors
This is what I always say, and it's defended by many economists: the free market needs legislation and enforcement! Especially for public companies, which are especially adamant about maximizing shareholder profits at any cost.
The free market only reacts in a positive way by default in matters that are clear to customers, e.g. pricing. But when the user isn't the customer, and the defects are not immediately sensed, winners will never do the good thing of their own accord.
The first part sounded good, the "every platform is different, we have to decide everything case-by-case" and the specific focus on TikTok less so IMO.
Keep in mind that in Europe, TikTok is still run by the original owners with China connections - unlike the new "American TikTok" after the owner change in the US.
The US legislature only seemed to discover its concern about addictive behavior when foreign actors or pro-palestinian content were involved, but had no problem with YouTube or Facebook doing the same stuff.
I seriously hope it's different in the EU but wouldn't bet on it.
Infinite scrolling is only mentioned in the title. The actual legislation focuses on addictive patterns, of which infinite scroll is just one. The exact formulation will of course matter a lot, but it will not simply be a ban on infinite scroll, as that would be trivial to circumvent.
What does ranking vs. chronological having to do with infinite scrolling?
You can have a ranked paginated UI. You can also have an "infinite" (until you run out of items, but this is not different for ranked) chronological UI.
I will 100% take this approach over the current EU tack of banning kids from accessing the internet. The problem wasn't ever the kids - it is the tech executives trying to profit from making the kids addicted to an advertising platform.
The hunt has started: EU bureaucracy vs TikTok. In the past the EU has rarely directly attacked a single company with such specific points. But every time, they remained consistent and dedicated to their target, and usually won. It just took a long time (from a few years to decades). The only time they dropped a policy was the attempt to stop the summer-time switch, which was cancelled when Covid started.
They avoid mentioning the rest of the social media platforms, which happen to be US based. It seems they chose a single quick and easy China-based target, more as an experiment to decide for the rest. The key question is when: either the current kids will experience it, or those that are not yet born.
Even disregarding the fact that this affects multiple US-based platforms that are larger than TikTok, have they tried to moderate their platform so it's not unfettered brain rot?
What about TV, how come the channels are always playing?? They should shut off after 30mins and I shouldn't be able to press down button to do zapping all night long.
What about video games? We need session limits of 30mins, kids get too addicted to it.
In fact we're going to put a timer in every bedroom so that if you have sex with your wife for too long we'll fine you because it can turn into a real addiction.
Infinite scrolling combined with the algorithmic feed is the real nasty.
Feeds should be heavily regulated, effectively they are a (personalized!) broadcast, and maybe the same strictures should apply. Definitely they should be transparent (e.g. chronological from subscribed topics), and things like veering more extreme in order to drive engagement should be outlawed.
I wish they would go after the fake spinning wheel discount pattern and the "app exclusive" or "better in the app" pattern. That's all a way to get people to install apps that will then bombard folks with notifications or slurp data off the device.
This is the best piece of news I've seen in years. Jonathan Haidt's and others' work is finally bearing fruit, assuming our society is not collectively too addicted to get rid of these palm-sized slot machines.
Infinite scroll itself isn't inherently harmful; it's a pagination mechanism. The harm comes from recommender systems tuned for engagement over wellbeing.
Looks like the EU can just get a feature flag to use pagination or a "Load More" button? Doesn't seem as big of a deal as enforcing USB-C.
Though if it applies to YouTube, it seems annoying when trying to find a video to watch. I usually trigger a few infinite-scroll loads while looking for videos.
And I assume they'd have to specify a maximum number of items per page, or else devs could just load a huge number of items up front, which would technically not be infinite scrolling but would still be enough content to keep someone occupied for a long time.
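That workaround is easy to sketch. Below is a minimal Python illustration, assuming a hypothetical paginated feed API (`FEED`, `PAGE_SIZE`, and `fetch_page` are all illustrative names, not any real platform's interface): a client that automatically fetches the next page on every scroll event reproduces infinite scroll exactly, whatever the page-size cap.

```python
from typing import List, Optional, Tuple

FEED = list(range(100))   # pretend server-side content store
PAGE_SIZE = 10            # the regulator-approved page size

def fetch_page(cursor: int) -> Tuple[List[int], Optional[int]]:
    """Return one page plus the cursor for the next page (None at the end)."""
    page = FEED[cursor:cursor + PAGE_SIZE]
    nxt = cursor + PAGE_SIZE
    return page, (nxt if nxt < len(FEED) else None)

def scroll_forever() -> List[int]:
    """Client loop: fetch the next page the moment the user nears the bottom."""
    shown: List[int] = []
    cursor: Optional[int] = 0
    while cursor is not None:
        page, cursor = fetch_page(cursor)
        shown.extend(page)  # in a real UI this would fire on a scroll event
    return shown
```

The user walks through all 100 items without ever clicking a page link; the cap only changes how often the background fetch fires, which is why a rule phrased as "no infinite scroll" is so easy to satisfy on paper while changing nothing.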
The EU's mission statement seems to be to make the internet as difficult to legally utilize as possible.
I'm interested to see what measures people will use to get around the increasingly bizarre restrictions. Perhaps an official browser extension for each platform that reimplements bureaucrat-banned features?
Early on in the internet age it somewhat bothered me that every page on the www either acts like it is the first thing one reads on a topic or assumes great knowledge of the subject, with nothing in between.
Wondering about a technical solution, I couldn't find anything besides fold-out explanations and links to explain jargon. Neither would really bridge the gap.
One obvious theory was to keep track of what the user knows and hide things they don't need or unhide things they do. This, of course, was not acceptable from a privacy perspective.
Today however you could forge a curriculum for countless topics and [artificially] promote a great diversity of entry level videos. If the user is into something they can be made to watch more entry level videos until they are ready for slightly more advanced things. You can reward creators for filling gaps between novice and expert level regardless of view count.
Almost like Khan academy but much slower, more playful and less linear.
Imagine programming videos that assume the reader knows everything about each and every tool involved. The algorithm could seek out the missing parts and feed them directly into your addiction or put bounties on the scope.
This was long overdue. I hope killing other dark patterns that feast on attention or hunt on flaws in human psychology follow. However, my only concern is how this will be taken care of. I hope they learned something from the GDPR fiasco.
I admire the EU's attempts at things like the cookie law, age verification, and tackling the addictiveness of infinite scrolling, but the implementation is pure theater.
Trackers have much more effective techniques than "cookies", kids trivially bypass verification, and designers will make a game of "tell me you have infinite scrolling without telling me you have infinite scrolling". When you are facing trillions of dollars of competition to your law, what do you think is going to happen?
Maybe if there was an independent commission that had the authority to rapidly investigate and punish (i.e. within weeks) big tech for attempting engagement engineering practices it might actually have some effect. But trying to mandate end user interfaces is wasting everyone's time putting lipstick on a pig.
Dunno about using legislative moves, but yes please. The stupidest solution to a problem no one had: moving layouts, unreachable footers, no or unsatisfactory indication of one's position.
All just to remove navigation clicks no one minded and to reduce server loads, in exchange for users suffering laggy lazy loading and (what a hate-inducing pattern!) the inability to preload, print, search or link.
Technically this is about Tiktok's "addictive design", and their examples include "infinite scroll over time". It's totally unclear what they mean by that, or what Tiktok would have to change it to in order to be in compliance. The whole thing seems like it was written by a boomer bureaucrat who has never used Tiktok, let alone a computer.
They hate infinite scrolling because it's addictive. I hate infinite scrolling because it's annoying lol. The worst is when you scroll to the bottom of a news article and it just loads another and your scrollbar and your URL/browser history get fucked up.
What happened to… personal responsibility? I hate dark patterns as much as the next person, but this will likely be as effective as the EU’s existing hand wringing (not).
This isn’t about addiction, it’s about censorship. If you limit the amount of time someone can spend getting information, and make it inconvenient with UI changes, it’s much harder to have embarrassing information spread to the masses.
Amazingly, the public will generally nod along anyway when they read governmental press releases and say “yes, yes, it’s for my safety.”
Scrolling through an infinity of AI slop videos can't really be classified as "getting information". If you want to read the news and stay up to date with the "embarrassing information" there's plenty of news websites out there.
>"Social media app TikTok has been accused of purposefully designing its app to be “addictive” by the European Commission, citing its infinite scroll, autoplay, push notification, and recommendation features."
All of these have immediate and easy replacements or workarounds. Nothing will substantially change for the better; maybe it even will for the worse.
Moreover, "purposefully designing something to be addictive" (and cheap to make) is the fundamental basis of late stage capitalism.
jjcm|16 days ago
It's important to note they aren't creating laws against infinite scrolling, but are ruling against addictive design and pointing to infinite scrolling as an example of it. The wording here is fascinating, mainly because they're effectively acting as arbiters of "vibes". They point to certain features they'd like them to change, but there is no specific ruling around what you can/can't do.
My initial reaction was that this was a terrible precedent, but after thinking on it more I asked myself, "well, what specific laws would I write to combat addictive design?". Everything I thought of would have some workaround that could be found, and equally would have terrible consequences in situations where this is actually quite valuable. I.e. if you disallow infinite scrolling, what page sizes are allowed? Can I just have a page of 10,000 elements that lazy load?
Regardless of your take on whether this is EU overreach, I'm glad they're not implementing strict laws around what you can/can't do - there are valuable situations for these UI patterns, even if in combination they can create addictive experiences. Still, I do think that overregulation here will lead to services being fractured. I was writing about this earlier this morning (https://news.ycombinator.com/item?id=47005367), but the regulated friction of major platforms (i.e. Discord with ID laws) is on a collision course with the ease of vibe coding up your own. When that happens, these commissions are going to need to think long and hard about whether having a few large companies to watch over is better than millions of small micro-niche ones.
Funes-|16 days ago
Hear me out: banning advertising on the Internet. It's the only way. It's the primordial domino tile. You knock that one over, every other tile follows suit. It's the mother of chain reactions. There would be no social media, no Internet as we know it. Imagine having TikTok, YouTube or X trying to survive on subscriptions alone in their current iterations. Impossible. They'd need to change their top priority from "maximizing engagement by fostering addictive behavior" to "offering a product with enough quality for someone to pay a fee in order to be able to use it".
hyperman1|15 days ago
European companies know this pattern, and tend to get the hint. US companies tend to try and maximize what they can get while claiming there is no law against it, then go very pikachu-faced when the consequences hit them.
sincerely|16 days ago
This is not such an unusual thing in law, as much as us stem-brained people want legal systems to work like code. The most famous example is determining art vs pornography - "I know it when I see it" (https://en.wikipedia.org/wiki/I_know_it_when_I_see_it)
johannes1234321|15 days ago
The issue is: if you write a precise wording of what you don't want, a lawyer will go through it word by word and the company will find a way to build something that violates the spirit but not the exact wording. By being more generic in the wording, the law can reach such cases and future developments with very little need for later corrections, and courts can interpret the intention and the current state of the art.
There are areas where law has to be precise (calculation of tax, criteria for criminal offenses, permissions for authorities, ...), but in many cases good laws are just as precise as needed and as flexible as possible.
coffeemug|16 days ago
There is obviously a lot of detail to work out here: which specific question do you ask users, who administers the survey, what function do you use to scale the fines, etc. But this would force the companies to pay for the addiction externality without prescribing any specific feature changes they'd need to make.
bjackman|15 days ago
So what we have is a machine designed to optimise for something adjacent to addictiveness, and then some rules saying "you can't design for addictiveness"...
What happens when an underspecified vibe rule clashes with a billion dollar optimisation machine? Surely the machine wins every time? The machine is already defeating every ruleset that it's ever come up against.
Feels like the only way regulation could achieve anything is if it said "you can't build a billion dollar optimisation machine at all".
astrobe_|15 days ago
If this law passes and they "blacklist" some of these design-for-addiction (sorry, "engagement") platforms, I believe it should send a strong signal for adults as well. Most adults are pretty much aware that these platforms are bad for everyone; according to some polls, the public opinion is unambiguously in favor of these laws.
braiamp|15 days ago
You can't. You don't need to specify how to comply with the law, just that generally a goal must be met. That's good lawmaking, since it's flexible enough to catch all future creative ways to break it. I remember a comment from someone who worked at MSFT as a compliance officer: he went around saying that it's not the letter of the law that must be followed, but the spirit thereof. They rolled over him and released the product nonetheless. Almost immediately came the EU investigation, and that crap had to be reversed and put in accordance with the stated goal of the law.
StopDisinfo910|15 days ago
That's not really accurate. The EU actually legislated in a way which is very typical of how countries regulate things that carry hard-to-characterize and varied risks.
Companies have to carry out a risk assessment and take appropriate preventive actions when they find something. The EU audits the assessment. That's how finance has been regulated for ages.
It's all fairly standard I fear.
Someone|15 days ago
The EU, in general, phrases laws and regulations more in terms of what they want to accomplish with them than in terms of what you can’t do.
In contrast, common law (https://en.wikipedia.org/wiki/Common_law), over time, more or less collects a list of all things you may not do.
sriku|15 days ago
If a company chooses a design, and it can be proved through a subpoena of their communications that the design was intended and chosen for its addictive traits, then even if no evidence of the addictiveness itself has been collected, the company (or person) can be deemed to have created a design in bad faith toward society and be penalized for it.
(Well that's my attempt. I tried to apply "innocent until proven guilty" here.)
roenxi|15 days ago
This doesn't solve the problem though - the enforcers still have to come up with a standard that they will enforce. A line has to be drawn, letting people move the line around based on how they feel today doesn't help. Making the standard uncertain just creates opportunities for corruption and unfairness. I haven't read the actual EU stance on the matter but what you are describing is a reliable way to end up in a soup of bad policy. There needs to be specific rulings on what people can and can't do.
If you can't identify the problem, then you aren't in a position to solve it. Applies to most things. Regulation by vibe-checks is a great way to kill off growth and change - which the EU might think is clever, but the experience over the last few centuries has been that growth and change generally make things better.
And what they actually seem to be doing here is demanding that sites spy on their users and understand their browsing habits which does seem like a terrible approach. I don't see how their demands in that statement align with the idea of the EU promoting digital privacy.
RamblingCTO|15 days ago
I'm recovering from a surgery and can't do much besides existing. So I'd like to scroll to keep me occupied and numb the pain in my face. But Instagram tries to shove content down my throat that I don't want to see. It's always only a matter of time until I see THOT/incel content. No matter how often I click "not interested", they try again and again. If it's not pitting genders against each other, it's politics. It's brain rot. I don't wanna see that. I have interests and they know what they are. But no, they show me this garbage. The algorithms are the second thing we need to regulate imho.
henrikschroder|15 days ago
These findings are very much in line with that: they bring up a feature, a checkbox, a specific thing TikTok did to pay lip service to protecting minors, and then they're simply saying that it doesn't appear to work. So it doesn't matter that TikTok ticked the box and crossed the t's.
danpalmer|15 days ago
They haven't nailed it every time, but on the whole it's a good approach. It's hard on companies, but rightly so.
svara|15 days ago
A very common tension in law everywhere.
In the US you now have a 'major questions doctrine'. What the hell is a major question?
johnnyanmac|15 days ago
I'd make the algorithms transparent, then attack clearly unethical methods on a case-by-case basis. The big thing about Facebook in the 2010s was how we weren't aware of how deep its tracking was. When revealed and delved into, it led to the GDPR.
I feel that's the only precision method of keeping things ethical.
dakolli|15 days ago
3 hrs a day on your phone is equivalent to 15 years of your life (accounting for a 16-hour waking day). I know people who do a solid 6... That's 30 years of their life spent scrolling, getting their brains completely fried by social media, and soon by the infinite-jest machine that is generative AI.
Sorry, we don't let people fry their brains with drugs, well we at least try to introduce some societal friction in between users and the act of obtaining said drug.
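The arithmetic behind that figure checks out if you project over an 80-year span; the comment doesn't state the lifespan, so that number is an assumption added here for illustration:

```python
# 3 hours/day as a share of a 16-hour waking day, projected over an
# assumed 80-year span (the lifespan is an assumption, not stated above).
WAKING_HOURS = 16
YEARS = 80

def years_lost(hours_per_day: float) -> float:
    """Waking-life years consumed by a daily habit of the given duration."""
    return hours_per_day / WAKING_HOURS * YEARS

assert years_lost(3) == 15.0   # the "3 hrs a day" case
assert years_lost(6) == 30.0   # the "solid 6" case
```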
InsideOutSanta|15 days ago
An example of this is contract law. There is no clear definition of what a legal contract must look like. Instead, a contract's validity can depend on whether an average, prudent person would have entered into it in similar circumstances.
Llamamoe|16 days ago
Only allowing algorithmic feeds/recommendations on dedicated subpages, which the user has to navigate to and which are not allowed to integrate content viewing, would be an excellent start IMO.
ArchieScrivener|15 days ago
No more for profit nets. Time for civil digital infrastructure.
Waterluvian|15 days ago
“You know it when you see it.”
grumbel|15 days ago
Expand the GDPR "Right to data portability" to publicly published content for third parties, i.e. open up the protocols so you can have third-party clients that themselves decide how they want to present the data. And add a realtime requirement, since at the moment companies still circumvent the original rule with an "only once every 30 days" limit.
Also add an <advertisement> HTML tag and HTTP header, and force companies to declare all their ads in a proper machine-readable way.
The core problem with addictive design isn't the addictive design itself, but that it's often the only way to even access the data. And when it comes to communication services that benefit from network effects, that should simply not be allowed.
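For illustration only: if such a tag existed, a third-party client could strip declared ads mechanically. Here is a Python sketch using the stdlib `html.parser`, where the `<advertisement>` element is the hypothetical marker proposed above, not part of any standard (attributes are dropped from the re-emitted output to keep the sketch short):

```python
# Sketch: a third-party client stripping declared ads, assuming a
# hypothetical <advertisement> element that platforms would be
# required to emit around every ad.
from html.parser import HTMLParser

class AdStripper(HTMLParser):
    """Re-emit HTML, dropping everything inside <advertisement>...</advertisement>."""
    def __init__(self):
        super().__init__(convert_charrefs=True)
        self.out = []
        self.depth = 0  # nesting depth inside ad elements

    def handle_starttag(self, tag, attrs):
        if tag == "advertisement":
            self.depth += 1
        elif self.depth == 0:
            self.out.append(f"<{tag}>")  # attributes omitted in this sketch

    def handle_endtag(self, tag):
        if tag == "advertisement":
            self.depth = max(0, self.depth - 1)
        elif self.depth == 0:
            self.out.append(f"</{tag}>")

    def handle_data(self, data):
        if self.depth == 0:
            self.out.append(data)

def strip_ads(html: str) -> str:
    parser = AdStripper()
    parser.feed(html)
    return "".join(parser.out)
```

For example, `strip_ads("<p>post</p><advertisement><p>buy!</p></advertisement>")` returns `"<p>post</p>"`. The point is that a mandatory, machine-readable declaration makes ad filtering a ten-line client feature rather than an arms race.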
kawera|15 days ago
Not necessarily. The consequences of a few bad micro-niche ones would be, well, micro.
SllX|15 days ago
If the EU passes a law that seems general but starts giving out specific examples ahead of time, they’re outlawing those specific examples. That’s how they work, even if you read the law closely and comply with the letter of the law. And they’ll take a percentage of your global revenue while people shout “malicious compliance” in the virtual streets if they don’t get their way.
jameson|15 days ago
Terms like "DAU" or "engagement" are common in our field, and the primary objective is how to make users spend more time on our platform. We don't take safety or mental health seriously internally, only externally for PR reasons.
CEOs won't change that because the more time user spends on the platform, the more ad revenue it brings.
Only way is to regulate it.
erxam|16 days ago
The amount of paid shills opposing this is a good indicator that it's the right move.
tokyobreakfast|16 days ago
Or you could just shut the phone off and/or not install the app. It's a simple solution, really, and one that is at your disposal today at no cost.
mattlutze|15 days ago
It is thoroughly documented that social media and the modern web are designed to be addictive, by psychologists who specialize in this. We regulate access to other addictive things, because addictive things break humans' normal control systems.
> "the farther you get from the influence of bureaucrats, the happier life becomes"
only when things are "normal" and if you're a default power-holder in a community. For everyone else, really no.
pyrale|15 days ago
Counterexample: just look at the state of EU tech companies compared to Chinese tech companies.
I’m not saying China is an attractive example, but chalking up Europe’s tech issues to a regulation problem fails to address europe’s digital woes.
Tade0|15 days ago
A huge portion of that market cap exists only because the companies in question are allowed to act unethically. Aside from that, all this wealth is concentrated in the hands of a small minority.
Ultimately the economy exists to serve us, not the other way around. What good does all that market cap do for the average American?
jeandejean|15 days ago
This is the very narrative US lobbyists push: that Europe regulates while missing out on the economy. It gets used and abused any time something looks contrary to US interests in MAGA land.
simongray|15 days ago
Europe is actually doing quite well at the moment. The European stock markets have outperformed the US quite decently ever since Trump became president, despite the various curveballs thrown at Europe in recent years. Market capitalisation in the US is held up primarily by the Magnificent 7, which are great outliers in the American stock market.
shevy-java|15 days ago
Also, while I dislike infinite scrolling, why should the EU regulate the design of a website? I don't like this idea as a principle. This clearly comes from overpaid bureaucrats. I am not at all saying the EU should not become stronger, in the face of a very hostile and abusive USA - but the focus by these bureaucrats is wrong. Those micro-regulations will not get rid of US dominance in the software sector.
sbszllr|15 days ago
You’re perpetuating a gross misunderstanding of the cookie law. What it states is different from the malicious compliance advertisers implement, which misleads people like yourself.
Websites that set only basic functional cookies do not need to display any popups; those cookies are simply permitted. Any cookies that are essential to the functioning of the website are, within reason, permitted. In fact, at no point should a website serve you a cookie popup unless you seek one out, because analytics and advertising cookies are supposed to be opt-in.
So many websites do one of two things: serve you a popup that has everything enabled, which is a clear violation; or a popup that has only functional cookies selected but where the biggest, highlighted button allows all of them.
The law is fine. Malicious compliance is to blame. The EU has been slow to rectify it.
seszett|15 days ago
There are laws regulating many things that could be considered "design", for example misleading packaging, mandatory information on some categories of ads, cigarette packaging, container sizes, accessibility requirements, etc.
I would say regulating against addictive design (infinite scrolling is not banned per se, it just makes for a catchy headline) is well within what laws are meant for.
mattlutze|15 days ago
Except, what actually has happened is that the annoying pop-ups became ubiquitous, and then relatively standardized, so that now an extension like Consent-o-Matic (because the browser companies don't want to upset their advertisers) can automate away your actual choices.
If you want to allow websites to track you, tell the extension to make those choices. If you don't, then tell the extension that. It does a great job almost instantly clearing the popups, and you have more control over your digital identity.
apexalpha|15 days ago
Try selling something containing Nicotine and you will find the EU has an opinion on that, too.
Valakas_|13 days ago
I say this is a great initiative from the EU.
Ylpertnodi|15 days ago
Probably not. But the dominance will be regulated.
geysersam|15 days ago
Is TikTok addictive because of its design, or is it addictive because it brings thousands of people and experiences and emotions right to you? Probably both, but it's hard to separate one from the other. Apps are not opium; it's not as clear-cut.
Instead of micromanaging technology and culture they should make sure that society is kind, that there is slack in the system, that people don't have reason to want to flee their real lives, that those hurt by new technology get support.
Of course truly malicious dark patterns and fraud should be punished. But that feels like a different category.
dgellow|15 days ago
We have clear answers here. Yes, it’s by design.
mcny|16 days ago
Disclaimer: IANAL and this is not legal advice.
dathinab|16 days ago
1. GDPR consent dialogs are not cookie popups, most things you see are GDPR consent dialogs
2. GDPR consent dialogs are only required if you share data, i.e. spy on the user
3. GDPR had from the get-go a bunch of exceptions, e.g. you don't need permission to store a same-site cookie indicating that you opted out of tracking _iff_ you don't use it for tracking. Same for a lot of other things where the data is needed for operation, as long as the data is only used for that purpose and not given away (e.g. DDoS protection, bot detection, etc.).
4. You still had to inform the user, but this doesn't require any user interaction or acceptance, nor does it need to be a popup blocking the view. A small notice in the corner of the screen with a link to the data policy is good enough, but only if everything you do falls under 3. or involves non-personal information. Furthermore, I think they recently updated it to not even require that; just having a privacy policy in a well-known place is good enough, but I'd have to double-check. (And to be clear, this is for data you don't need permission to collect; like any data you collect, it's strictly use-case bound and you still have to list how it's used, how long it's stored, etc., even if you don't need permission.) Also, to be clear, if you accept the base premise of GDPR it's pretty intuitive to judge whether something is an exception or not.
5. In some countries there are highly misguided "cookie popup" laws predating GDPR (they are actually about cookies, not data collection in general). These are national laws, and as such the EU would prefer to have them removed. Work on that is in progress but is taking way too long; I'm also not fully sure about the state of it. So in that context, yes, they should and want to kill "cookie popups". That just doesn't mean what most people think it does (as it has nothing to do with GDPR).
gunapologist99|15 days ago
I like my feed being personalized to who I am. I am unique. I'm not like anyone else.
puppycodes|16 days ago
Genuinely curious about the actual data on this.
Does anyone have a link to a reputable, sizable study?
lemoncookiechip|16 days ago
They're not alone in this by any means; America has also opened its doors to all forms of gambling, like Kalshi, which now even sponsors news networks of all things.
The EU has this disconnect in the things it pushes, which makes sense considering its size and the speed at which it moves. One example that comes to mind is how they're both pushing for more privacy online while also pushing for things such as chat control, which is antithetical to privacy.
Does social media need regulating? Yeah. Is infinite scrolling where they should be focusing? Probably not, there's more important aspects that should be tackled and are seemingly ignored.
Findecanor|15 days ago
There were many startups here in Sweden in the early '00s, and I believe they had taken advantage of a legal loophole which has since been plugged. Regulation has tightened. Players have to be 18 y/o, use digital ID and not be registered as a gambling addict. But I still find the industry to be depraved, to be honest.
nicman23|15 days ago
My money (lol) is on the EU moving to completely ban gambling advertising in the next 2-3 years.
sashank_1509|15 days ago
I guess we don’t let people have hard drugs even if sometimes they just need to escape their painful life. And maybe this could fall under that logic. But we do let people drink, which serves the same purpose. And if I had to choose, I think doomscrolling is more at the level of drinking, and less at the level of heroin. So I would actually be fine with an age limit for doomscrolling, after which you have a hands-off approach.
betteryet|15 days ago
If you don't do it this way, applying to everyone, then any good-actor products will be crushed by profitmaxxing competitors. Or any good-actor executives and workers will be pushed out by profitmaxxing shareholders.
Legislators need to be careful to keep requirements tight and manageable, but it's better to limit negative externalities than outright ban something. Banning infinite scroll or any particular pattern is nonsense, but restricting addictive design (e.g. TikTok) and algorithm weaponization (e.g. TikTok) is very sensible.
pnt12|15 days ago
This is what I always say, and it's defended by many economists: the free market needs legislation and enforcement! Especially for public companies, which are especially driven to maximize shareholder profits at any cost.
The free market only reacts in a positive way by default in matters that are clear to customers, e.g. pricing. But when the user isn't the customer, and the defects are not immediately felt, winners will never do the good thing of their own accord.
xg15|15 days ago
Keep in mind that in Europe, TikTok is still run by the original owners with China connections - unlike the new "American TikTok" after the owner change in the US.
The US legislature only seemed to discover its concern about addictive behavior when foreign actors or pro-Palestinian content were involved, but had no problem with YouTube or Facebook doing the same stuff.
I seriously hope it's different in the EU but wouldn't bet on it.
0dayz|15 days ago
The DSA, for instance, has so far only been used against Western companies (which doesn't mean Chinese companies are immune).
gunapologist99|15 days ago
One is arbitrarily banned by unelected bureaucrats. The other is fine.
We blame social companies for failing to raise our children the way we think they should.
mh2266|15 days ago
You can have a ranked paginated UI. You can also have an "infinite" (until you run out of items, but this is not different for ranked) chronological UI.
tsoukase|15 days ago
They avoid mentioning the rest of the social media platforms, which happen to be US-based. It seems they chose a single quick and easy China-based target, more like an experiment to decide for the rest. The key question is when: either the current kids will experience it, or only those that are not yet born.
bluescrn|15 days ago
There’s a finite (albeit vast) amount of content to serve up.
tokyobreakfast|16 days ago
I'm curious how they plan to pretend to enforce this. Will you need a loisence to implement infinite scroll?
vasco|15 days ago
What about video games? We need session limits of 30mins, kids get too addicted to it.
In fact we're going to put a timer in every bedroom so that if you have sex with your wife for too long we'll fine you because it can turn into a real addiction.
somewhereoutth|16 days ago
Feeds should be heavily regulated, effectively they are a (personalized!) broadcast, and maybe the same strictures should apply. Definitely they should be transparent (e.g. chronological from subscribed topics), and things like veering more extreme in order to drive engagement should be outlawed.
pedroma|16 days ago
Though if it applies to YouTube, it seems annoying when trying to find a video to watch. I usually trigger a few infinite-scroll loads while looking for videos.
And I assume they'd have to specify a maximum number of items per page, or else devs could just load a huge number of items up front, which would technically not be infinite scrolling but would still be enough content to keep someone occupied for a long time.
dheera|16 days ago
> We use cookies and other technologies to store and access personal data on your device
Evidently you don't value privacy.
unknown|15 days ago
[deleted]
Retr0id|15 days ago
lpcvoid|15 days ago
jama211|15 days ago
badpun|16 days ago
asib|16 days ago
unknown|15 days ago
[deleted]
GaryBluto|15 days ago
I'm interested to see what measures people will use to get around the increasingly bizarre restrictions. Perhaps an official browser extension for each platform that reimplements bureaucrat-banned features?
nicman23|15 days ago
gib444|16 days ago
amelius|16 days ago
unknown|16 days ago
[deleted]
econ|15 days ago
Wondering about a technical solution, I couldn't find anything besides fold-out explanations and links to explain jargon. Neither would really bridge the gap.
One obvious approach was to keep track of what the user knows and hide things they don't need, or unhide things they do. This, of course, was not acceptable from a privacy perspective.
Today, however, you could forge a curriculum for countless topics and [artificially] promote a great diversity of entry-level videos. If the user is into something, they can be fed entry-level videos until they are ready for slightly more advanced material. You could reward creators for filling the gaps between novice and expert level, regardless of view count.
Almost like Khan Academy, but much slower, more playful and less linear.
Imagine programming videos that assume the viewer knows everything about each and every tool involved. The algorithm could seek out the missing parts and feed them directly into your addiction, or put bounties on the missing material.
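The recommendation idea above can be sketched in a few lines. This is a hypothetical model, not any real platform's algorithm; `next_videos`, the `level` field, and the sample catalog are all invented for illustration. The rule: recommend only unwatched videos at most one difficulty level above what the user has already completed.

```python
def next_videos(completed, catalog, batch=3):
    """Pick the easiest unwatched videos no more than one difficulty
    level above the user's highest completed level (1 = novice)."""
    done_levels = [v["level"] for v in catalog if v["id"] in completed]
    ceiling = max(done_levels, default=0) + 1
    eligible = [v for v in catalog
                if v["id"] not in completed and v["level"] <= ceiling]
    return sorted(eligible, key=lambda v: v["level"])[:batch]

catalog = [
    {"id": "a", "topic": "git basics", "level": 1},
    {"id": "b", "topic": "branching", "level": 2},
    {"id": "c", "topic": "rebasing internals", "level": 3},
]
```

A new user only ever sees level-1 material; finishing it unlocks level 2, and so on. The design choice is that readiness, not predicted engagement, drives the ranking, which is the whole point of the curriculum framing.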
deadbabe|15 days ago
booleandilemma|15 days ago
tartoran|16 days ago
avaer|16 days ago
Trackers have far more effective techniques than "cookies", kids trivially bypass verification, and designers will play "tell me you have infinite scrolling without telling me you have infinite scrolling". When your law is facing trillions of dollars of competition, what do you think is going to happen?
Maybe if there were an independent commission with the authority to rapidly investigate and punish big tech (i.e. within weeks) for engagement-engineering practices, it might actually have some effect. But trying to mandate end-user interfaces is wasting everyone's time putting lipstick on a pig.
CrzyLngPwd|15 days ago
ZoomZoomZoom|16 days ago
All just to remove navigation clicks no one minded and reduce server load, in exchange for users suffering laggy lazy loading (oh, what a hate-inducing pattern!) and the inability to preload, print, search or link.
phendrenad2|15 days ago
MiddleEndian|15 days ago
kalterdev|15 days ago
breezykoi|15 days ago
oompydoompy74|15 days ago
Havoc|15 days ago
ARandomerDude|16 days ago
This isn’t about addiction, it’s about censorship. If you limit the amount of time someone can spend getting information, and make it inconvenient with UI changes, it’s much harder to have embarrassing information spread to the masses.
Amazingly, the public will generally nod along anyway when they read governmental press releases and say “yes, yes, it’s for my safety.”
cbg0|16 days ago
Lorin|15 days ago
Funes-|16 days ago
>"Social media app TikTok has been accused of purposefully designing its app to be “addictive” by the European Commission, citing its infinite scroll, autoplay, push notification, and recommendation features."
All of these have immediate and easy replacements or workarounds. Nothing will substantially change (for the better; maybe it does for the worse, even).
Moreover, "purposefully designing something to be addictive" (and cheap to make) is the fundamental basis of late stage capitalism.
yencabulator|13 days ago
I'm fine with EU resisting late stage capitalism.
coldtea|15 days ago
slopusila|16 days ago
hopefully AI will wake them up and save us from all this nonsense
unknown|15 days ago
[deleted]
janlucien|15 days ago
[deleted]
unknown|15 days ago
[deleted]