> Discord dropped the hammer: mandatory age verification for all users is rolling out next month. The era of anonymous gaming chats is officially over.
This isn't really accurate. Age verification is not mandatory for all accounts. You will be able to join a Discord with your friends, chat, and do voice without age verification.
Here's the exact list of what's restricted if you don't verify, taken from the announcement (https://discord.com/press-releases/discord-launches-teen-by-...):
> Content Filters: Discord users will need to be age-assured as adults in order to unblur sensitive content or turn off the setting.
> Age-gated Spaces – Only users who are age-assured as adults will be able to access age-restricted channels, servers, and app commands.
> Message Request Inbox: Direct messages from people a user may not know are routed to a separate inbox by default, and access to modify this setting is limited to age-assured adult users.
> Friend Request Alerts: People will receive warning prompts for friend requests from users they may not know.
> Stage Restrictions: Only age-assured adults may speak on stage in servers.
Discord says they'll use some AI garbage tool. Those are prone to mistakes over a large enough userbase. It will not be a rare occurrence for an adult to be labelled a child until they debase themselves with a scan of their face or a copy of their government ID.
For children - this mandate also still makes the decision on behalf of the parents that a child must submit a scan of their face to a third party. Moving to Persona for age verification involves verification data being sent outside of the user's phone - in direct contradiction to Discord's initial promise of keeping facial scan data solely on the phone. And we've been given no reason to trust that these third parties will delete the data, or that they won't use it for an improper purpose such as deriving information from the ID or facial scan beyond the sole purpose of verifying that an individual is an adult.
While we're at it - is there any legitimate reason why Discord is associating a person's actual or estimated age with their account as opposed to storing a value that states if they are or are not an adult? That sort of granularity seems unrelated to the stated purpose.
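Discord hasn't published its data model, but the distinction matters in code: every restriction in the quoted list reduces to a single boolean, so a data-minimizing schema need not keep an age at all. A hypothetical sketch (all names are invented for illustration):

```python
from dataclasses import dataclass

# Hypothetical records; Discord's actual data model is not public.

@dataclass
class AgeRecordMaximal:
    user_id: str
    estimated_age: int       # retains more than the feature gates need

@dataclass
class AgeRecordMinimal:
    user_id: str
    is_verified_adult: bool  # the only fact the listed restrictions use

def can_unblur_sensitive_content(record: AgeRecordMinimal) -> bool:
    # Every gate in the announcement reduces to this one predicate,
    # so storing an age adds liability without adding function.
    return record.is_verified_adult
```

Nothing about unblurring content, age-gated spaces, or stage speaking needs more than that one bit.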
1. Given the bad press, they may reverse their decision to do this.
2. If they don't about-face, there's a lot about the implementation that remains to be seen.
Personally, I use Discord for things that should be completely unaffected by this. If there are surprises, I will not verify my age; I'll leave. And if the communities I'm a part of decide to move, I'll support them and move even if I don't run into surprises myself.
There is absolutely no way we should support giving identifying information to a U.S. company given what's going on right now. The trust is no longer there. If you verify your identity, anything you say on Discord could be used against you if you ever pass through American borders.
> This is false. Age verification is not mandatory for all accounts. You will be able to join a Discord with your friends, chat, and do voice without age verification.
You are correct. For now. But why would they stop there?
Supposedly this is to protect teens. If that's true, why would they continue letting teens chat with anonymous users? What if they get tricked into sharing sensitive images or video of themselves? Surely we need to know everyone's ID to ensure teens aren't unwittingly chatting with a known predator. It's for their safety. But for now that's a bridge too far. For now.
And why should we believe this even has anything to do with protecting teens? That's valuable data. Discord says they're not holding onto it... for now. But Discord is offering quite a lot to users for free. Why let such an obvious revenue source go unmonetized? They're doing this now because they're going public soon. Investors want an ROI and this action is sure to invite some competition. The people leaving want an alternative, so a competitor could get a foothold. Discord needs to stay ahead. And the users Discord keeps after this stunt are going to be the most resilient to leaving - the most exploitable. Surely they wouldn't care if the policy changes in the future.
Seeing more and more of 'This message is unavailable' and 'Discord requires ID in order to see certain messages'.
Pretty much an AI detecting vulgarity and blocking it, although actual racist vulgarity gets through; things like 'here with my gock' or 'troll it' are what I've seen blocked.
So, yes it is a requirement, and yes, they are censoring people and things, and requiring others to have an ID to see the messages as well.
So 'Not mandatory for all accounts' is technically true, but I mean.. you get it, hopefully.
> You will be able to join a Discord with your friends, chat, and do voice without age verification.
No, building a community is a goal for many; this just isn't acceptable.
> So the claim that Discord is making ID verification "mandatory" or that you need it for gaming chats is untrue.
Again, not mandatory but creates more issues than it solves.
I wonder if the people who write these articles realize that they are doing more damage to their cause than good? At best, their lies come off as hysteria. At worst, they come off as conspiratorial paranoia. Either way, they are outright ignoring that these policies are put into place to address a very real problem with the status quo, while failing to communicate what the very real issues with these policies are (never mind proposing better ways to address the problem).
1. A way for politicians and the state to track US citizens' porn habits and use that information against them in the future: blackmail to coerce future politicians, business leaders, and the wealthy into doing what those in power want.
2. A way for conservatives to tighten the noose around non-chaste materials and begin to purge them from the internet. And if that works, that's hardly the last thing that will go. Next will be LGBT content, women's rights content, atheist content, pro-labor content, and more. (Or if you're on the other side of the political spectrum, consider that the powers could be used to remove Christian content, 2nd Amendment content, etc. It doesn't really matter what is being removed, just that the mechanisms are in place and that powers can put a lid on the populace.)
We aren't screaming loudly enough.
Do not try to sugarcoat this by seizing on a pedantic mistake.
This is far worse.
It's a first step down a path the Big Brother state wants.
> For the majority of adult users, we will be able to confirm your age group using information we already have. We use age prediction to determine, with high confidence, when a user is an adult. This allows many adults to access age-appropriate features without completing an explicit age check.
> Facial scans never leave your device. Discord and our vendor partners never receive it. IDs are used to get your age only and then deleted. Discord only receives your age — that’s it. Your identity is never associated with your account.
> We leverage an advanced machine learning model developed at Discord to predict whether a user falls into a particular age group based on patterns of user behavior and several other signals associated with their account on Discord. We only use these signals to assign users to an age group when our confidence level is high; when it isn't, users go through our standard age assurance flow to confirm their age. We do not use your message content in the age estimation model.
I work with corporate privacy all of the time, and there is actually something really interesting going on here. We're basically never allowed to claim legal compliance using heuristics or predictive models. Like, never ever. They demand a paper trail on everything, and telling our legal team that we are going to leave it to an algorithm on a user device would make them foam at the mouth.
They are basically trusting a piece of software to look at your face or ID in the same way that, like, a server at a restaurant would check before serving you alcohol.
I am curious to see if this kind of software compliance in the long run is even allowable by regulators.
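The quoted flow is a confidence-gated classifier: auto-assign an age group only above some threshold, otherwise fall back to an explicit check. A minimal sketch of that routing logic (the function names and the 0.95 cutoff are my assumptions, not Discord's actual system):

```python
# Hypothetical sketch of the confidence-gated flow the quote describes;
# the threshold value and names are invented, not Discord's.

CONFIDENCE_THRESHOLD = 0.95  # assumed cutoff; Discord publishes none

def route_age_check(predicted_group: str, confidence: float) -> str:
    """Decide which flow a user is sent through."""
    if predicted_group == "adult" and confidence >= CONFIDENCE_THRESHOLD:
        # High-confidence adults skip the explicit check entirely.
        return "auto-assigned:adult"
    # Everyone else falls back to the standard age assurance flow
    # (face scan or ID upload).
    return "explicit-age-assurance"
```

Even a small misclassification rate at this gate, applied across hundreds of millions of accounts, is exactly the "not a rare occurrence" problem raised earlier in the thread.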
For the United Kingdom specifically, I've suffered the misfortune of reading the Online Safety Act, and this kind of age estimation is both mentioned and permitted by the Act.
(Not a lawyer blah blah blah)
Part 3, Chapter 2, Section 12(4) specifies that user-to-user service providers are required to use either age verification or age estimation (or both!) to prevent children from accessing content that is harmful to children.
Section 12(6) goes on to state that "the age verification or age estimation must be of such a kind, and used in such a way, that it is highly effective at correctly determining whether or not a particular user is a child."
Part 12, Section 230(4) rules out self-declaration of age as being a form of age verification/estimation.
So I suppose it'll come down to whether or not Ofcom deems Discord's age estimation as "highly effective".
This is unrelated, but something I find interesting is that Category 1 user-to-user services (of which Discord is one, as per The Online Safety Act 2023 (Category 1, Category 2A and Category 2B Threshold Conditions) Regulations 2025) are required by Part 4, Chapter 1, Section 64(1) to "offer all adult users of the service the option to verify their identity (if identity verification is not required for access to the service).".
It's particularly interesting for the Australian laws (which don't target Discord yet). The law places responsibility on the targeted platform if it is found with underage users: platforms must take 'reasonable steps' or face fines. It will be interesting if/when court cases appear. Will easily spoofed or tricked facial scans be considered 'reasonable' by the Australian courts? I think once the dust has settled we will start seeing some court cases and discover how reasonable some of these fig leaves are.
Even wilder - they're claiming to look at a user's activity on the platform - like what servers they're on, what games they play, and what hours they're active - and infer adulthood from that. No way that'd pass legal muster.
> If you're staying on Discord, enjoy the surveillance. For the rest of us: it's time to learn how to self-host.
Hmmm. I feel like self-hosting is the FASTEST way to lose your anonymity. Your self-hosted service is MUCH more easily tied to your identity than some third party like Discord.
Just imagine you set up a self-hosted forum where you want to discuss something you want to keep private, but the government is very interested and wants to know who you are talking to.
Well, now they know any IP address connecting to your forum is a person of interest. They don't need to decrypt anything to know you are talking to each other.
By using something unique, you are going to make yourself uniquely identifiable.
Control over your data is part of anonymity. Sure, everyone will know the service belongs to you, but you'll be in total control over who knows what exactly. To most people not in the eye of the law, that is most of the anonymity they require.
Also, tools like Tor exist, on both the hosting and user side.
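On the hosting side, Tor's onion services are the standard way to publish a self-hosted service without exposing the server's IP address (and without the server learning visitors' IPs). A minimal torrc fragment (the directory path and local port are illustrative):

```
# torrc: publish a forum listening locally on 127.0.0.1:8080
# as an onion service. Visitors never learn the server's IP,
# and the server never learns theirs.
HiddenServiceDir /var/lib/tor/forum/
HiddenServicePort 80 127.0.0.1:8080
```

Tor then writes the .onion hostname into the HiddenServiceDir, and that address is the only thing you share with members.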
I used to use Discord for games a long time ago, but then I noticed it was being used for more serious stuff. For example, LLVM moved their chats from mailing lists to Discord, as did a lot of open source projects. These are big, important projects, and now their chats are not discoverable and will soon be behind ID checks.
It really bothered me that so many important projects were relying on a proprietary chat technology instead of using mailing lists or IRC, which were more decentralized and under the control of the local admin.
I would like to get back to a situation in which you can participate in group chats for open source projects without these being hosted on closed platforms, but if this results in major open source projects shifting from Discord to Telegram or WhatsApp, then nothing will have been learned.
Discord has always had dark patterns which basically ban anonymity. If you aren't fingerprintable enough (using a VPN, etc.) they will force you to enter a phone number. They also encourage guild admins to require it, although it is technically a choice.
Yeah, any level of "anonymity" on Discord died long ago, if it ever existed in the first place. For me, any platform which doesn't enforce Tor is considered very NOT anonymous; even if it's accessible through Tor, I don't trust it that much if I'm one of the only people actually using Tor to access the platform.
SimpleX seems trustworthy enough, with thoughtful design decisions, even if it fails my "forced Tor" requirement. I haven't spent the time to dive into Session's architecture, but it's on my to-do list; currently the marketing copy makes it look like the best choice.
It covers more than that, but it's not strictly mandatory.
> Content Filters: Discord users will need to be age-assured as adults in order to unblur sensitive content or turn off the setting.
> Age-gated Spaces – Only users who are age-assured as adults will be able to access age-restricted channels, servers, and app commands.
> Message Request Inbox: Direct messages from people a user may not know are routed to a separate inbox by default, and access to modify this setting is limited to age-assured adult users.
> Friend Request Alerts: People will receive warning prompts for friend requests from users they may not know.
> Stage Restrictions: Only age-assured adults may speak on stage in servers.
What gets deemed “adult” is incredibly random as far as I can tell; some of our servers/messages have triggered it, but no porn or anything is shared in them.
> It’s time to accept the loss of “features” and go back to something simpler
I guess I have a hard time understanding these calls to switch to a platform that has even fewer features than the unverified Discord accounts. The blog post is incorrect in claiming that verification will be mandatory. It will only be necessary to access certain features and content. For simple IRC-style chats or even for voice chats with gaming friends, no verification is required.
The average Discord user, or even the 98th percentile user, isn’t going to be looking to switch to a platform that isn’t a replacement for the features they use. They’re just going to not verify their accounts and move on.
I still use a few niche IRC channels and run my own internal IRC network as a home automation message bus, so I'm a fan of IRC for its simplicity, but honestly: IRC really does need a modernization.
Things like image embeds, "markdown lite" formatting, and cross-device synchronization are now considered table stakes. There are always going to be some EFnet-type grognards who resist progress because reasons, but they should be ignored.
IRCv3 and Ergo support some of what's needed already (and in a backwards-compatible way!) but client support just isn't there yet, particularly on mobile.
It's time to accept that 99% of people will not accept the loss of "features" (not sure why that's in quotes) or move to something objectively inferior for their needs i.e. something that requires more knowledge instead of simply opening an app where everything is ready to use.
Coming from a former heavy IRC user who's not going back except for nostalgia trips.
Discord doesn't care about user privacy nor user security and actively retaliates against people.
Their DPO ignored a PII leak I discovered and reported last year. Their DPO email address just creates a Zendesk ticket; I was able to see that the ticket was locked and marked "solved" with no response a few days later.
So I brought it to the Dutch DPA, who were very responsive; hours after their "final update" email, my nearly decade-old Discord account was suddenly "suspended".
The PII leak, which had been ongoing for over a year before my discovery, was suddenly stopped that same day. Funny how that works.
It took 5+ months for Discord's DPO and informal disputes team to finally get back to me after I informed them of the retaliation, with irrelevant copy-and-paste templates giving me walkthrough guides on how to file a "trust and safety" ticket.
When filing a ticket with "trust and safety" under appeal categories I get an automated "please appeal your ban through the app! I am now closing this ticket" response and my ticket's locked once again. And of course, appealing through the app gives me a generic system error.
The share of people who care about anonymity has always been fairly small, even in niche communities like Hacker News. As an example, the most popular comment on the Australian social network ban for teens is in favour: https://news.ycombinator.com/item?id=46208348
Watching as things play out, I understand why people target Discord et al. with their complaints about the loss of anonymity. Being a tiny minority, they have no hope of influencing their governments, because the opposite position is widely popular.
Therefore, they try to convince commercial entities to disregard these laws as much as possible. This is particularly useful for that niche, since fighting legislation cannot in itself be done anonymously. So they attempt to enlist a very nonymous (haha) entity to do the fighting on their behalf. If the attempt fails, no harm befalls them.
I think it's a doomed endeavour. To get users onto Discord, it has to be portrayed to parents as a safe and legal service. The days of underground BBSes are gone. Now, if your brand gets associated with anything negative, you're toast. And realistically, the anonymous users are kind of useless as a whole: they won't pay, so they're practically just a drag on the platform. Losing them risks very little.
Overall, a fight with a foregone conclusion. If you want anonymity you have to use other tools, and be aware that simply using those tools marks you out as someone who desires anonymity.
1) Addictive design of many social networks (doom scrolling et al.)
2) Privacy & age verification
On 1) most parents would support a legal limit on digital media use by age. But it's not a realistic requirement.
Next best thing is to outlaw social media that results in addictive scrolling behaviour. Treating it the same way as smoking is not ideal, but no better solutions have been proposed. Many people on HN wouldn't mind if FB, TikTok and Insta were treated the same way as cocaine. I.e. only available for a lot of money to people who are happy to break the law.
On 2), there are technical solutions that would let the government provide a privacy-conscious service allowing businesses to check whether someone is 16+ or 18+ without collecting any other information. These services can be gamed, but that's not the point. A 14-year-old could become addicted to cocaine and we wouldn't usually blame the policy for it.
The problem is the government tries to solve problem 1) now, while the solution for 2) is being discussed.
Again, a law that limits social media use for under-16-year-olds is necessary. But so is a toolset that would enable a plausible age check and limit the desire of FAANG (and their Chinese competitors) to target minors.
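The service point 2 describes can be sketched as an attestation token that carries one bit and no identity. The sketch below uses a shared-secret HMAC purely to keep the example short; a real deployment would use blind signatures or zero-knowledge proofs so the issuer cannot link a token to where it is later used:

```python
import hashlib
import hmac
import json
import secrets

# Demo key. In reality the government service would use asymmetric
# (blind) signatures, so businesses hold only a public verify key.
ISSUER_KEY = secrets.token_bytes(32)

def issue_token(is_over_18: bool) -> dict:
    """Issued after the service checks the citizen's real credentials.
    The token itself contains no name, birthdate, or ID number."""
    claim = json.dumps({"over_18": is_over_18,
                        "nonce": secrets.token_hex(8)})
    tag = hmac.new(ISSUER_KEY, claim.encode(), hashlib.sha256).hexdigest()
    return {"claim": claim, "tag": tag}

def business_verifies(token: dict) -> bool:
    """The business learns exactly one bit: over 18 or not.
    A tampered or forged token also comes back False."""
    expected = hmac.new(ISSUER_KEY, token["claim"].encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, token["tag"]):
        return False
    return json.loads(token["claim"])["over_18"]
```

The point of the design is what the token omits: the business never sees an ID, a face, or even an age, only a signed yes/no.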
Whatever happened to TorChat? Hopefully we are entering another golden age of "dissident" tech that seems to have culturally stagnated for at least a decade now.
In reality, we are likely about to get yet another data point on where the lines for the average person really lie in the dimensions of functionality, friction, network effect and privacy.
There are those that will stay on Discord because the benefits of the first three outweigh the degradation of privacy. Then there are those that will leave because the first three aren't important enough to outweigh the privacy loss. There will be all sorts of people in between.
HN has a rather amplified showing of folks who won't trust anything unless it's completely decentralized, using E2EE clients verifiably compiled from source that they've personally audited, running on hardware made from self-mined rare metals. The reality is that there is a spectrum of folks out there, all with different preferences, and while some folks will leave (in this case) Discord, others will remain because that's where the folks they want to chat/game/voice with are.
During the heyday of WhatsApp updating its privacy terms, there was a month-long battle where people kept saying how everyone would quit. The reality is, most people just don’t care about these things if their network is highly entrenched in one app. What happens is, some communities “partially move” the sensitive stuff to the likes of Signal.
Honestly, all of these are documented probabilities at this point. SNS owners can make very decent predictions about what will happen if they introduce a certain kind of friction. Also, it’s not 2005 anymore; people are used to uploading their IDs everywhere. I mentioned it before as well: if you’ve used any large app, the chances are you’ve uploaded your ID (Airbnb, Tinder, etc.).
Discord has always felt more personal than just voice comms over a game, in the way you can see more and more about what your friends are doing - like what song they're playing on Spotify, how long they've been playing Fortnite, and how many days in a row they've played these games.
I feel like it has always been on this path to capture more and more of your data and personally link it to who you are.
Anonymity and Discord sounds funny when used in the same sentence. They've always been pretty greedy about user data and have had hard-to-avoid phone verification for a very long time.
I guess this is a good thing. It will reduce spammers and scammers that are invading every server like locusts these days.
It will reduce attacks on and abuse of people, because those are usually founded on anonymity (no fear of repercussions, etc.).
I don't mind having a platform where everyone is at least somehow verified. Yes, sure, you can bypass it and it is not 100% foolproof, but what ever is? It raises the barrier for abuse, and that's a good thing IMHO.
I just hope they accelerate this with a complete ID requirement everywhere, so I can finally forcibly kick my addiction to such time sinks and interact with my friends in more direct ways.
If you're looking for an alternative, I'm building flotilla.social, a self-hostable chat app built on nostr, which uses cryptographic identities to prevent identity capture.
Which specific use cases does this provide an alternative for? Chat is a tiny part of what people do with Discord and there are plenty of options already.
Nice, nostr is a way better setup than bluesky imo. The way you just roll your identity like you roll a bitcoin wallet. Love it. I'll keep an eye on your app.
Edit: it does look a little too corporate for me though with the 'book a demo' and the focus on my 'mission'. Doesn't really give hanging out with friends vibes. Just saying.
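For context on the "cryptographic identities" claim: on nostr an account is nothing but a keypair, so rolling a new identity is just generating one, and no server can confiscate or gatekeep it. An illustrative stand-in (real nostr uses secp256k1 Schnorr keypairs per NIP-01; the sha256 derivation here is only a placeholder):

```python
import hashlib
import secrets

# Stand-in sketch only: real nostr keys are secp256k1 Schnorr pairs.

def roll_identity() -> dict:
    private_key = secrets.token_hex(32)  # kept by the user alone
    # Placeholder derivation; nostr derives the public key on the
    # secp256k1 curve, not via a hash.
    public_key = hashlib.sha256(private_key.encode()).hexdigest()
    # The public key IS the account: any relay recognizes it, so the
    # identity is portable and no platform can take it away.
    return {"private_key": private_key, "public_key": public_key}
```

The contrast with Discord is that there is no account database to gate behind an ID check; whoever holds the private key is the identity.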
Discord was never about anonymity. There is no E2E encryption, there is IP logging, emails for registering... The author is conflating anonymity with privacy - which will also soon end on Discord.
It seems to work fine for thousands, if not millions, of people all over the world. I mean, not even Twitter died like everyone claimed it would. Discord will just lose the people who have any reason not to verify their ID, which is probably mostly spammers and scammers, seeing how they've invaded almost all servers in the last months.
To access age-gated parts of Discord, you need to verify your age. This sounds reasonable. It's not much different from having your ID card checked when purchasing alcohol. Actually it's better, as you only have to do it once, not on each visit.
Only if the shop assistant took your ID, photocopied it and stored it in a box marked “do not touch” under the counter, alongside transcriptions of everything you ever say inside the store.
Hmm, no, because in the case of purchasing alcohol the ID check is 1:1 in time and space; it's ephemeral (unless the clerk has an extreme photographic memory).
In the case of an online ID check, even with nice-looking privacy terms, there is no guarantee that your ID won't be stored forever and/or re-analyzed many times, cross-checked with other services, or, worse, leaked.
These articles feel like an overreaction. I use Discord daily and I don't think there is any reason for me to verify at all. The new restrictions are reasonable and don't affect the way I use the app.
What's really more distressing is that it got this far before people figured out the game. Maybe we should be reflecting on that part: the gullibility, and the enabling of those people by those who knew better.
After they were roasted by the 2022–2023 Pentagon doc leaks, it was pretty obvious they were going to take action.
And not just that event: parents are roasting Roblox for kids getting groomed, but after the relationship is initiated, the groomers always immediately move the convo to Discord.
Now the problem becomes whether you would rather trade convenience for privacy. Many people would rather trade away privacy nowadays because they think they have nothing to hide.
The sky isn't falling. But the frog is boiling.
I know not everyone is so open, but in the LGBT space most people are.
> >Content Filters:
Sounds like something people might not want tied to real-world identities.
> >Age-gated Spaces:
So, #politics in my local instance.
Yell.
Scream.
Protest.
Just_Harry|17 days ago
Part 3, Chapter 2, Section 12(4) specifies that user-to-user service providers are required to use either age verification or age estimation (or both!) to prevent children from accessing content that is harmful to children. Section 12(6) goes on to state that "the age verification or age estimation must be of such a kind, and used in such a way, that it is highly effective at correctly determining whether or not a particular user is a child."
Part 12, Section 230(4) rules out self-declaration of age as being a form of age verification/estimation.
So I suppose it'll come down to whether or not Ofcom deems Discord's age estimation as "highly effective".
[Part 3, Chapter 2, Section 12(4)]: https://www.legislation.gov.uk/ukpga/2023/50/part/3/chapter/...
This is unrelated, but something I find interesting is that Category 1 user-to-user services (of which Discord is one, as per The Online Safety Act 2023 (Category 1, Category 2A and Category 2B Threshold Conditions) Regulations 2025) are required by Part 4, Chapter 1, Section 64(1) to "offer all adult users of the service the option to verify their identity (if identity verification is not required for access to the service).".
stubish|16 days ago
duskwuff|17 days ago
cortesoft|17 days ago
Hmmm. I feel like self-hosting is the FASTEST way to lose your anonymity. Your self hosted service is MUCH more easily tied to your identity than some third party like discord.
Just imagine you set up a self-hosted forum where you want to discuss something you want to keep private, but the government is very interested and wants to know who you are talking to.
Well, now they know any IP address connecting to your forum is a person of interest. They don't need to decrypt anything to know you are talking to each other.
By using something unique, you are going to make yourself uniquely identifiable.
mid-kid|17 days ago
Also, services like TOR exist. Both on the hosting and user side.
carefree-bob|17 days ago
It really bothered me that so many important projects were relying on a proprietary chat technology instead of using mailinglists or IRC which were more decentralized and under the control of the local admin.
I would like to get back to a situation in which you can participate in group chats for open source projects without these being hosted on closed platforms, but if this results in major open source projects shifting from discord to telegram or whatsapp, then nothing will have been learned.
renato_shira|17 days ago
[deleted]
digiown|17 days ago
RHSeeger|17 days ago
nerdsniper|17 days ago
SimpleX seems trustworthy enough, with thoughtful design decisions, even if it fails my "forced tor" requirement. I haven't spent the time to dive into Session's architecture, but it's on my to-do list, currently the marketing copy makes it look like the best choice.
SapporoChris|17 days ago
7777332215|17 days ago
herpdyderp|17 days ago
I thought age verification was only required to access "adult" content?
jsheard|17 days ago
> Content Filters: Discord users will need to be age-assured as adults in order to unblur sensitive content or turn off the setting.
> Age-gated Spaces – Only users who are age-assured as adults will be able to access age-restricted channels, servers, and app commands.
> Message Request Inbox: Direct messages from people a user may not know are routed to a separate inbox by default, and access to modify this setting is limited to age-assured adult users.
> Friend Request Alerts: People will receive warning prompts for friend requests from users they may not know.
> Stage Restrictions: Only age-assured adults may speak on stage in servers.
girvo|17 days ago
movedx|17 days ago
I’ll be building a new platform on these two technologies and using Zoom or something else like Jitsi on the side for video/audio sharing.
It’s time to accept the loss of “features” and go back to something simpler, but also something that can still be here in 38 years, like IRC has been.
Aurornis|17 days ago
I guess I have a hard time understanding these calls to switch to a platform that has even fewer features than the unverified Discord accounts. The blog post is incorrect in claiming that verification will be mandatory. It will only be necessary to access certain features and content. For simple IRC-style chats or even for voice chats with gaming friends, no verification is required.
The average Discord user, or even the 98th percentile user, isn’t going to be looking to switch to a platform that isn’t a replacement for the features they use. They’re just going to not verify their accounts and move on.
stackghost|17 days ago
Things like image embeds, "markdown lite" formatting, and cross-device synchronization are now considered table stakes. There are always going to be some EFnet-type grognards who resist progress because reasons, but they should be ignored.
IRCv3 and Ergo support some of what's needed already (and in a backwards-compatible way!) but client support just isn't there yet, particularly on mobile.
outime|17 days ago
Coming from a former heavy IRC user who's not going back except for nostalgia trips.
zanellato19|17 days ago
hnthrowaway6323|17 days ago
Their DPO ignored a PII leak I discovered and reported last year. Their DPO email address just creates a Zendesk ticket; I was able to see that the ticket was locked and marked "solved" with no response a few days later.
So, I brought it to the Dutch DPA, who were very responsive. Hours after their "final update" email, my nearly decade-old Discord account was suddenly "suspended". The PII leak, which by that point had been ongoing for over a year before my discovery, was stopped the same day. Funny how that works.
It took 5+ months for Discord's DPO and informal disputes team to finally get back to me after I informed them of the retaliation, with irrelevant copy-and-paste templates giving me walkthrough guides on how to file a "trust and safety" ticket.
When filing a ticket with "trust and safety" under appeal categories I get an automated "please appeal your ban through the app! I am now closing this ticket" response and my ticket's locked once again. And of course, appealing through the app gives me a generic system error.
arjie|17 days ago
Watching as things play out, I understand why people try to target Discord et al. with their complaints about the loss of anonymity. Being a tiny minority, they have no hope of influencing their governments, because the opposite position is widely popular.
Therefore, they try to convince commercial entities to disregard these laws as much as possible. This is particularly useful for that niche since fighting legislation cannot in itself be done anonymously. Therefore, they attempt to transform a very nonymous (haha) entity to do the fighting on their behalf. If the attempt fails, no harm befalls them.
I think it's a doomed endeavour. To get users on discord, it has to be portrayed to parents as a safe and legal service. The days of underground BBSes are gone. Now, if your brand gets associated with anything negative you're toast. And realistically the anonymous users are kind of useless as a whole. They won't pay, so they're practically just a drag on your platform. Losing them risks not very much.
Overall, a fight with a foregone conclusion. If you want anonymity you have to use other tools and be aware that simply using those tools marks you out as someone who desires anonymity.
sixsevenrot|16 days ago
1) Addictive design of many social networks (doom scrolling et al.)
2) Privacy & age verification
On 1) most parents would support a legal limit on digital media use by age. But it's not a realistic requirement. Next best thing is to outlaw social media that results in addictive scrolling behaviour. Treating it the same way as smoking is not ideal, but no better solutions have been proposed. Many people on HN wouldn't mind if FB, TikTok and Insta were treated the same way as cocaine. I.e. only available for a lot of money to people who are happy to break the law.
On 2) there are ways to implement technical solutions that would allow the government to provide a privacy-conscious service letting businesses check whether someone is 16+ or 18+ without collecting any other information. These services can be gamed, but that's not the point. A 14 year old could become addicted to cocaine and we wouldn't usually blame the policy for it. The problem is that the government is trying to solve problem 1) now, while the solution for 2) is still being discussed.
Again, a law that limits social media use for under 16 year olds is necessary. But so is a toolset that would enable a plausible age check, and limit the desire of FANG (and their Chinese competitors) to target minors.
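The "16+/18+ check without collecting any other information" idea described above can be sketched as a signed attestation token that carries only a boolean claim plus an expiry. This is a hypothetical illustration, not any real scheme: a production design would use asymmetric signatures or zero-knowledge proofs so the issuer can't link checks to users, whereas a shared-secret HMAC is used here purely for brevity.

```python
# Hypothetical sketch: the checker sees only a boolean claim and an expiry,
# never a name, birth date, or ID number.
import hashlib
import hmac
import json
import time

ISSUER_KEY = b"demo-secret"  # stand-in for the attestation service's key

def issue_token(over_18: bool, ttl_seconds: int = 300) -> str:
    """Issued by the (hypothetical) government service after its own check."""
    claim = json.dumps({"over_18": over_18, "exp": int(time.time()) + ttl_seconds})
    sig = hmac.new(ISSUER_KEY, claim.encode(), hashlib.sha256).hexdigest()
    return f"{claim}.{sig}"

def business_accepts(token: str) -> bool:
    """The business verifies the signature and reads only the boolean."""
    claim, _, sig = token.rpartition(".")
    expected = hmac.new(ISSUER_KEY, claim.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False  # tampered or forged token
    data = json.loads(claim)
    return bool(data["over_18"]) and data["exp"] > int(time.time())
```

The short TTL matters: a long-lived token could be shared, so the check is worth a few minutes at most.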
int32_64|17 days ago
modernpacifist|17 days ago
There are those that will stay on Discord because the benefits of the first three outweigh the degradation of privacy. Then there are those that will leave because the first three aren't important enough to outweigh the privacy loss. There will be all sorts of people in between.
HN has a rather amplified showing of folks who won't trust anything unless it's completely decentralized, using E2EE clients verifiably compiled from source that they've personally audited, running on hardware made from self-mined rare metals. The reality is that there is a spectrum of folks out there, all with different preferences, and while some folks will leave (in this case) Discord, others will remain because that's where the folks they want to chat/game/voice with are.
tokioyoyo|17 days ago
Honestly, all of these are documented probabilities at this point. SNS owners can make very decent predictions about what will happen if they introduce a certain kind of friction. Also, it’s not 2005 anymore; people are used to uploading their IDs everywhere. I mentioned it before as well: if you’ve used any large app, the chances are you’ve uploaded your ID (AirBnB, Tinder, etc.)
biosubterranean|17 days ago
I feel like it has always been on this path to capture more and more of your data and personally link it to who you are.
unknown|17 days ago
[deleted]
orbital-decay|17 days ago
dark-star|17 days ago
It will reduce attacks on and abuse of people, because those usually depend on anonymity (no fear of repercussions, etc.)
I don't mind having a platform where everyone is at least somehow verified. Yes, sure, you can bypass it and it is not 100% foolproof, but what ever is? It raises the barrier for abuse, and that's a good thing IMHO.
ChrisArchitect|17 days ago
Welcoming Discord users amidst the challenge of Age Verification
https://matrix.org/blog/2026/02/welcome-discord/
(https://news.ycombinator.com/item?id=46995046)
zitterbewegung|17 days ago
rozab|17 days ago
This is a lie; this only affects you if you want to view porn/NSFW channels on Discord. I'm in the UK happily using it without age verification.
mawadev|17 days ago
jonstaab|17 days ago
Kye|17 days ago
wolvoleo|17 days ago
Edit: it does look a little too corporate for me though with the 'book a demo' and the focus on my 'mission'. Doesn't really give hanging out with friends vibes. Just saying.
t0bia_s|17 days ago
2OEH8eoCRo0|17 days ago
SalariedSlave|16 days ago
This is a big problem: if individuals switch to something else, they will lose access to popular Discord communities.
Not sure what solution there is for this, as it's unrealistic that all communities would switch to the same alternative (if at all).
bossyTeacher|17 days ago
nottorp|17 days ago
opengrass|17 days ago
docker run --name ircd -p 6667:6667 inspircd/inspircd-docker
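Assuming the InspIRCd container above is running on localhost:6667, the client side of the protocol is just CRLF-terminated plain text per RFC 1459/2812. A minimal sketch of building the registration messages a client would send, and parsing a server reply line (function names here are illustrative, not from any library):

```python
# Build the two registration messages an IRC client sends on connect.
def registration_lines(nick: str, realname: str) -> bytes:
    # IRC messages are plain text terminated by CRLF.
    return f"NICK {nick}\r\nUSER {nick} 0 * :{realname}\r\n".encode()

def parse_line(line: str):
    """Split one server line into (prefix, command, params) per RFC 1459."""
    prefix = None
    if line.startswith(":"):
        # Optional prefix names the message origin (server or user).
        prefix, _, line = line[1:].partition(" ")
    command, _, rest = line.partition(" ")
    if rest.startswith(":"):
        params = [rest[1:]]  # only a trailing parameter
    else:
        head, _, trailing = rest.partition(" :")
        params = head.split() + ([trailing] if trailing else [])
    return prefix, command, params
```

For example, the server's welcome reply `:irc.example 001 alice :Welcome to the network` parses to prefix `irc.example`, command `001`, and params `["alice", "Welcome to the network"]`.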
nipperkinfeet|17 days ago
dark-star|17 days ago
pjmlp|17 days ago
burnt-resistor|17 days ago
DANmode|17 days ago
unkoman|17 days ago
phendrenad2|17 days ago
bashington|17 days ago
lambdas|17 days ago
nnx|17 days ago
In the case of an online-based ID check, even with nice looking privacy terms, there is no guarantee that your ID won't be stored forever and/or re-analyzed many times cross-checking with other services, and worse leaked.
2001zhaozhao|17 days ago
agnishom|17 days ago
ai_critic|17 days ago
What's really more distressing is that it got this far before people figured out the game--maybe we should be reflecting on that part, the gullibility and the enabling of those people by those who knew better.
josefritzishere|17 days ago
multisport|17 days ago
laerus|17 days ago
1317|17 days ago
nipponese|17 days ago
And not just that event: parents are roasting Roblox for kids getting groomed, but after the relationship is initiated, the groomers always immediately move the convo to Discord.
zoobab|17 days ago
anarticle|17 days ago
stevefan1999|17 days ago
rvz|17 days ago
Imagine what will happen post-IPO.
Supermancho|17 days ago
dancemethis|17 days ago
Did they forget it's proprietary, and from the same person who made OpenFeint, which also had a privacy lawsuit?
analog8374|17 days ago
endo_dev_null|17 days ago
[deleted]
ilovefrog|17 days ago
[deleted]
486sx33|17 days ago
[deleted]
dgxyz|17 days ago