I'm skeptical about banning design patterns just because people might overuse them. Growing up, I had to go to the theater to see movies, but that didn't make cliffhangers and sequels any less compelling. Now we binge entire Netflix series and that's fine, but short-form video needs government intervention?
The real question is: where do we draw the line between protecting people from manipulative design and respecting their ability to make their own choices? If we're worried about addictive patterns, those exist everywhere—streaming platforms, social feeds, gaming, even email notifications.
My concern isn't whether TikTok's format is uniquely dangerous. It's whether we trust adults to manage their own media consumption, or if we need regulatory guardrails for every compelling app. I'd rather see us focus on media literacy and transparency than constantly asking governments to protect us from ourselves. You can't legislate intelligence...
wackget|23 days ago
The average person has zero chance against all-pervasive, ultra-manipulative, highly-engineered systems like that.
It is, quite simply, not a fair fight.
TheOtherHobbes|23 days ago
It's not just social media. It's gaming, ad tech, marketing, PR, religion, entertainment, the physical design of malls and stores... And many many more.
The difference with social media is that the sharp end is automated and personalised, instead of being analysed by spreadsheet and stats package and broken out by demographics.
But it's just the most obvious poison in a toxic ecosystem.
SirMaster|23 days ago
So you are saying I am not an average person because I have the willpower to simply not install the TikTok app or watch short form video on any platform?
Has the bar for the average person really sunk this low?
rzz3|23 days ago
Additionally, Instagram and Facebook have tried their best to make their products as addictive as possible, yet their recommendation algorithm is so absolutely terrible (not to mention their ads) that I barely stay on the platform for five minutes when I use it.
Lerc|23 days ago
At the same time I don't think you can demonstrate harm without good evidence.
Making money cannot be used as a criterion unless you want to conclude that no company can turn a profit and be ethical at the same time. That would amount to demanding an outcome you don't believe is possible.
I think applying an overly broad criterion, like say infinite scroll, selectively to a few companies is just arbitrary targeting, for reasons left unstated outside the criteria.
The rules need to be evidence based, clear, specific, and apply to all.
We're cracking down on TikTok while The Guardian runs a bunch of dark patterns, and while the NYT reports on this at the same time as it attracts people with online games wrapped in an increasingly toxic user interface.
TikTok may suck, but so do a lot of other businesses that escape scrutiny. I worry the harms attributed to TikTok are magnified to make it a whipping boy, drawing the focus and allowing systemic issues to persist.
yibg|23 days ago
If my restaurant's food is so good people are "addicted" to it, that's a good thing. If it's about applying psychological patterns to trigger addictive behavior, that applies to a large swath of marketing.
moi2388|23 days ago
I’d love to think of myself as an exceptional individual because I don’t use Facebook or TikTok, but most likely I’m not exceptional at all, and other people could also just not use TikTok.
trcf23|23 days ago
I’m quite glad that there is a form of control preventing a company from a different part of the world, one that doesn’t really care about the mental health or wellbeing of my kids, from creeping into their lives like that…
As a parent, it’s not a fair fight and I should not have to delegate that to another private company
forgotaccount3|23 days ago
Fixed that for you.
Your argument is basically the same as saying that Banana Ball should be banned because they are intentionally making the experience as fun as possible, because that's how they make money.
amarant|23 days ago
I'm not some sort of prodigy or anything, just a random schmuck. If I can do it, anyone can. People just really like blaming others for their own vices instead of owning up to having a vice.
HN is a vice too. One of many that I have. And they're all mine. I've chosen them all. In most cases knowing full well that I probably shouldn't have.
Supermancho|23 days ago
Spoiler: There is no line. Societies (or more accurately, communities) attempt to self-regulate behaviors that have perceived net-negative effects. These perceptions change over time. There is no optimal set of standards. Historically, this has no consideration for intelligence or biology or physics (close-enough-rituals tended to replace impractical mandates).
permo-w|23 days ago
I'm not saying these feeds are as bad as heroin for your health, but the gap in addictiveness between cliffhangers in film and TikTok on your phone is about as big, and I equally think nicotine should be heavily regulated for similar reasons.
afarah1|23 days ago
Is there strong evidence for that? The first study that pops up if I search this suggests otherwise, that it could increase consumption of sugar-substitutes and overall caloric intake. https://doi.org/10.1016/j.tjnut.2025.05.019
>we need guardrails to defend against
There is no "we". You say that I and others need it, and you want to impose your opinion by taxing us.
dlcarrier|23 days ago
By the logic of the court decision, anything that is entertaining should be banned, from movies to TV shows to any news that makes any analysis whatsoever.
maxehmookau|23 days ago
Except, I'll never be given that choice.
gtowey|23 days ago
Can you imagine if gambling were allowed to be marketed to children? Especially things like slot machines. We absolutely limit the reach of those "design patterns".
hollerith|23 days ago
The argument against TikTok (and smartphones in general) is not that experiences above a certain threshold of compellingness are bad for you: it is that filling your waking hours with compelling experiences is bad for you.
Back when a person had to travel to a theatre to have those experiences, he couldn't have them every free minute of his day.
Juliate|23 days ago
We do it for alcohol and cigarettes already: taxes, ads & marketing restrictions, health warning mandated communication.
andrei_says_|23 days ago
Also, do we trust adults prescribed OxyContin to manage their own use?
We are speaking of weaponized addiction at planetary scale.
croes|23 days ago
That’s why we ban harmful things.
zbentley|23 days ago
I think there's a wide regulatory spectrum between those extremes--one that all sorts of governments already use to regulate everything from weapons to software to antibiotics.
It's easy to cherry-pick examples where regulation failed or produced unexpected bad results. However, doing that misses the huge majority of cases where regulation succeeds at preventing harms without imposing problematic burdens on people. Those successes are hard to see because they're evidenced by bad outcomes failing to happen, things working much as they did before (or getting worse at a slower rate than otherwise might happen).
It's harder to point to "nothing changed" as a win than it is to find the pissed-off minority who got denied building permits for reasons they disagree with, or the whataboutists who take bad actions by governments as evidence that regulation in unrelated areas is doomed to failure.
wackget|23 days ago
HA!
ripped_britches|23 days ago
Au contraire
swiftcoder|23 days ago
I mean, that's fine specifically because we have ample evidence suggesting it's just kind of a shit way to watch shows, and Netflix continually taking its own business model out back and shooting it doesn't really warrant government intervention.
cvoss|23 days ago
I once heard someone try to understand pornography addiction by asking whether it was comparable to a desire to eat a lot of lemon cookies. To quote Margaret Thatcher: "No. No. No."
> Where do we draw the line
Just because it's hard to find a principled place to draw the line doesn't mean we give up and draw no line. If you are OK with the government setting speed limits, then you're OK with lines drawn in ways that are intended to be sensible but are, ultimately, arbitrary, and which infringe on your freedom for the sake of your good and the public good.
> trust adults
Please do not forget the children.
> You can't legislate intelligence
Your implication is that people who are addicted to TikTok or anything else are unintelligent and need to be educated. This is, frankly, an offensive way to engage in the conversation, and, worse, naive.
wasmainiac|23 days ago
Apples to oranges.
I can’t make meth in my basement as a precursor to some other drug then complain that my target product had a shitty design.
Real-life experience shows that TikTok is harmfully addictive, and therefore it must be controlled to prevent negative social outcomes. It’s not rocket science; we have to be pragmatic based on real-life experience, not theory.
xp84|23 days ago
Arguably, the best reason for the government to care is that whoever controls this algorithm, especially in a future when it’s twice as entrenched as it is today, has an unbelievably unfair advantage in influencing public opinion.
grayhatter|23 days ago
I used to be opposed; now I'm not. I strongly believe specialization is the important niche humans have adapted to fill, and that it should be encouraged. Another equally significant part of human nature is trust, and with it gullibility. People will abuse these aspects of human nature to give themselves an unfair advantage. If you believe lying is bad and that laws should exist to punish those who lie to gain an advantage, or if you believe that selling an endlessly addictive substance should be restricted, you already agree.
There are two bars in your town, and shady forms of alcohol abound. One bar is run by someone who will always cut a patron off after they've had too many, and who goes to extreme lengths to ensure that the only alcohol they sell is ethanol. The other is run by someone who doesn't appear to give a fuck and is constantly suggesting you have another; some of his patrons have even gone blind.
I think a just society would allow people to specialize in their domain without also needing a PhD in the effects of alcohol poisoning, which alcohols are safe to consume, and in what quantities.
> Growing up, I had to go to the theater to see movies, but that didn't make cliffhangers and sequels any less compelling. Now we binge entire Netflix series and that's fine, but short-form video needs government intervention?
Yes, the dopamine feedback loop of short-form endless scrolling has a significantly different effect on the brain's reward system. And in line with not expecting everyone to have a PhD, you also need people to be able to trust the conclusions of experts.
> The real question is: where do we draw the line between protecting people from manipulative design and respecting their ability to make their own choices?
It's not that linear a distinction. We don't have to draw today the line where we stop forever; it's perfectly fine to iterate and reevaluate. Endless-scroll algorithms fed by huge data sources are, without a doubt, addictive. Where's the line on cigarettes, or now vapes? Surely they should be endlessly available to children, because where do you draw the line?
(Here it's mental health; cigarettes and alcohol are bad for physical health, but no one, rhetorically speaking, gives a shit about mental health.)
> If we're worried about addictive patterns, those exist everywhere—streaming platforms, social feeds, gaming,
I'd love to ban micro transactions and loot boxes (gambling games) for children.
> even email notifications.
That's a reductio ad absurdum, or perhaps you meant to make a whataboutist argument?
> My concern isn't whether TikTok's format is uniquely dangerous.
Camels and Lucky Strike are both illegal for children to buy.
> It's whether we trust adults to manage their own media consumption, or if we need regulatory guardrails for every compelling app.
We clearly do. Companies are exploiting the brain's natural dopamine system for their own advantage, at the expense of the people using their applications. Mental health deserves the same prioritization and protection as physical health. I actually agree with you that banning an activity that harms no one else, and poses a risk only to yourself, among reasonably educated adults is insanely stupid. But that's not what's happening.
> I'd rather see us focus on media literacy and transparency than constantly asking governments to protect us from ourselves.
I'd rather see companies that enjoy an unfair disparity of power, control, knowledge, and data be punished when they use it to gain an advantage over their consumers. I think dark patterns should be illegal and come with apocalyptic fines. I think tuning your recommendation algorithm so that you can sell more ads, or so that it pushes divisive content because that drives engagement (again, because ads), should be heavily taxed or fined, so that the government has the funding to provide an equally effective source of information and transparency.
> You can't legislate intelligence...
You equally can't demand that everyone know exactly why every flavor of snake oil is dangerous, and you should punish those who try to pretend it's safe.
Especially when there's an executive in some part of the building trying to figure out how to get more children using it.
The distinction requiring intervention isn't that these companies exist. Intervention is required because a company has hired someone whose job is to convince children to use something they know is addictive.
ElevenLathe|23 days ago
In short, banning hard drugs is very very obviously a losing policy that serves only to enrich the world's worst people at the expense of everyone else.
lII1lIlI11ll|23 days ago
Is this a serious question? Have you been asleep since the '70s, unaware of how the War on Drugs has been going?
sven_8127642|23 days ago
The science tends to back these ideas up. Banning does not stop people from doing what they want.
Education and guard rails are always better than hard control.