Table 1.1: Priority offences by category ( https://www.ofcom.org.uk/siteassets/resources/documents/onli... )
Discussion of offences related to: prostitution, drugs, abuse & insults, suicide, "stirring up of racial/religious hatred", fraud and "foreign interference".
So one imagines a university student discussing, say: earning money as a prostitute. Events/memories related to drug taking. Insulting their coursemates. Ridiculing the iconography of a religion. And, the worst crime of all, "repeating Russian propaganda" (e.g., the terms of a peace deal). Which Russians said it, and whether it is true, are -- of course -- questions never asked nor answered.
This free-thinking university student's entire online life seems to have been criminalised in mere discussion by the OSA; there may have been zero actual actions involved (consider, though, that a majority of UK students at the most prominent universities have taken class-A drugs).
This seems as draconian, censorious, illiberal, repressive and "moral panic"y as the highs of repressive christian moralism in the mid 20th C.
We grew up with the internet being a fun place where fun things happen and you don't need to take it so seriously. It was the symbol of freedom. Then the internet evolved into a business centre, where everything is taken extremely seriously -- don't you dare break the etiquette. It's a sad change to witness, but it is what it is.
I'm no fan of this act but your characterisation is highly misleading.
To pick two examples from the document you linked:
Discussion of being a sex worker would not be covered. The only illegal content relating to sex work would be if you were actively soliciting or pimping. From the document:
* Causing or inciting prostitution for gain offence
* Controlling a prostitute for gain offence
Similarly, discussion of drug use wouldn't be illegal either per se, only using the forum to buy or sell drugs or to actively encourage others to use drugs:
* The unlawful supply, offer to supply, of controlled drugs
* The unlawful supply, or offer to supply, of articles for administering or preparing controlled drugs
* The supply, or offer to supply, of psychoactive substances
* Inciting any offence under the Misuse of Drugs Act 1971
That's very different to criminalising content where you talk about being (or visiting) a prostitute, or mention past or current drug use. Those things would all still be legal content.
> [..] as the highs of repressive christian moralism in the mid 20th C.
What makes you pick the mid-20th century as the high point of repressive christian moralism? That doesn't seem even close to the high point if you look back further in history.
Where does it say discussion of those offences is illegal content? It says "content that amounts to a relevant offence". Frustratingly that is nonsensical: content surely cannot "amount to an offence" in and of itself. Offences have elements, which fall into two categories: actus reus and mens rea. And "content" cannot be either. Perhaps posting some content or possessing some content is the actus reus of an offence but the content itself does not seem to me to sensibly be able to be regarded as "amounting to an offence" any more than a knife "amounts to an offence". A knife might be used in a violent offence or might be possessed as a weapons possession offence but it makes no sense to me to say that the knife "amounts to an offence".
Either way, the point of that document in aggregate seems to be that "illegal content" is content that falls afoul of existing criminal law already: (possession and distribution of) terrorist training material is already illegal and so it is illegal content. But saying that you committed an offence is not, in and of itself, an offence, so saying you took drugs at university doesn't seem to me like it could be illegal content. Encouraging people to do so might be, but it already is.
Maybe I missed the bit where it says discussing things is illegal, so correct me if I am wrong.
Not your lawyer, not legal advice, etc.
> This free-thinking university student's entire online life seems to have been criminalised in mere discussion by the OSA
There's nothing illegal about hosting a forum. The problem is that you as the site operator are legally required to take down certain kinds of content if and when it appears. Small sites with no money or staff don't have the resources to pay for a full-time moderator. That cost scales with the number of users. And who knows what's in those 2.6M historical posts.
From TFA:
> The act will require a vast amount of work to be done on behalf of the Forums and there is no-one left with the availability to do it
Maybe an LLM can carry some of the load here for free forums like this to keep operating?
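As a sketch of what that might look like: a one-off batch pass over the historical posts that flags candidates for human review, with the flagging function as the pluggable part. The keyword heuristic below is a hypothetical stand-in for an LLM or classifier call, and the term list, function names, and categories are all invented for illustration:

```python
# Sketch: batch-flag historical forum posts for human review.
# flag_post() is where an LLM/classifier call would go; the keyword
# list is a hypothetical placeholder, not a real moderation policy.

RISK_TERMS = {"buy drugs", "sell drugs", "escort services"}  # hypothetical

def flag_post(text: str) -> bool:
    """Return True if the post should be queued for human review."""
    lowered = text.lower()
    return any(term in lowered for term in RISK_TERMS)

def triage(posts):
    """Split posts into (needs_review, probably_fine) piles."""
    needs_review, probably_fine = [], []
    for post in posts:
        (needs_review if flag_post(post) else probably_fine).append(post)
    return needs_review, probably_fine

posts = [
    "Great ride on the towpath yesterday",
    "DM me to buy drugs cheap",
]
review, fine = triage(posts)
```

The point of the shape is that the expensive judgement (LLM or human) only runs on the flagged pile, which is how the cost could stay manageable for a volunteer-run forum.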
All you need to do is have a think about what reasonable steps you can take to protect your users from those risks, and write that down. It's not the end of the world.
That is a very tricky one to manage on an online forum. If an American expresses an opinion about UK policy, that is, in a literal sense, foreign interference. There isn't a technical way to tell propagandists from opinionated people. And the most effective propaganda, by far, is that which uses the truth to make reasonable and persuasive points -- if it is possible to make a point that way, then that is how it will be done.
The only way this works is to have a list of banned talking points from a government agency. I'd predict that effective criticism of [insert current government] is discovered to be driven mainly by foreign interference campaigns trying to promote division in the UK.
This runs into the same problem as all disinformation suppression campaigns: governments have no interest in removing the stuff everyone agrees is untrue -- what would be the point? The flat earthers are never going to gain traction, and it doesn't matter if they do. The only topics worth suppressing are things that are plausible and persuasive -- the topics most likely to turn out to be true in hindsight.
The legislation follows the general structure of the health and safety act a couple of decades ago. That also caused a big right wing press crisis, and then we all sort of moved on, did a bit more paperwork, and now fewer people die in factory accidents. It's really quite helpful to start practically implementing this stuff rather than philosophising about it.
> This seems as draconian, censorious, illiberal, repressive and "moral panic"y as the highs of repressive christian moralism in the mid 20th C.
Given what has happened to the US as a result of unbridled free broadcast of misinformation and disinformation, we definitely need more "draconian, censorious, illiberal, repressive" rules around the propagation of such media.
Moral panic is EXACTLY what's called for!
You have captains of industry and thought leaders of the governing party throwing fucking nazi salutes, and this is broadcast to the masses! Insanity to defend free speech after the country is circling a drain as a result of said free speech.
Related post with a large discussion from someone who said:
"Lfgss shutting down 16th March 2025 (day before Online Safety Act is enforced)
[...] I run just over 300 forums, for a monthly audience of 275k active users. most of this is on Linode instances and Hetzner instances, a couple of the larger fora go via Cloudflare, but the rest just hits the server.
and it's all being shut down [...]"
For the same reasons.
https://news.ycombinator.com/item?id=42433044
Anyone* would be crazy to run a UK-based or somewhat UK-centric forum today. Whether it be for a hobby, profession, or just social interaction. The government doesn’t perceive these sites as having any value (they don't employ people or generate corporation tax).
[*] Unless you are a multibillion $ company with an army of moderators, compliance people, lawyers.
Well I'm on a forum run by a UK company, hosted in the UK, and we've talked about this, but they're staying online. And, no, they're not a multibillion dollar company.
I don't see our moderators needing to do any more work than they're already doing, and have been doing for years, to be honest.
So we'll see how the dice land.
More than just forums: it's basically a failed state now. I knew when I left (I was the last of my school year to do so) that it was going to get bad once Elizabeth died, and that that would be soon, but I never imagined it would get this bad.
The plan for April is to remove the need for police to obtain a warrant to search people's homes -- that bad.
I'd say "there will be blood on the streets", but there already is...
This video pretty much sums up what the UK is now: https://m.youtube.com/watch?v=zzstEpSeuwU
The opposite is true. The new law makes it considerably more risky for large companies because the law is specifically designed to hold them to account for conduct on their platforms. The (perceived) risk for small websites is unintended, and the requirements are very achievable for small websites. The law is intended for and will be used to eviscerate Facebook etc. for their wrongs. We are far more likely to see Facebook etc. leave the UK market than we are to see any small websites suffer.
A small website operator can keep child pornography off their platform with ease. Facebook have a mountain to climb — regardless of their resources.
HEXUS stopped publishing in 2021, and the company no longer exists. The forums were kept because they don't take much work to keep online. Now, there's a lot of work to do, like reading hundreds of pages of documents and submitting risk assessments. There's nobody to do that work now, so the idea was it could go into read only mode. The problem with that was, some users may want their data deleted if it becomes read only. Therefore, the only option is to delete it.
Summary: The UK has the Online Safety Act; any website that lets users interact with other users has to police illegal content on its site and must implement strong age verification checks. The law applies to any site that targets UK citizens or has a "substantial number" of UK users, where "substantial number" is not defined.
I'm going to guess this forum is UK-based just based on all the blimey's. Also the forum seems to have been locked from new users for some time, so it was already in its sunset era.
The admin could just make it read only except to users who manually reach out somehow to verify their age, but at the same time, what an oppressive law for small UK forums. Maybe that's the point.
> any website that lets users interact with other users has to police illegal content on its site and must implement strong age verification checks.
But I believe you only need age verification if pornography is posted. There's also a bunch of caveats about the size of user base - Ofcom have strongly hinted that this is primarily aimed at services with millions of users but haven't (yet) actually clarified whether it applies to / will be policed for, e.g., single-user self-hosted Fediverse instances or small forums.
I don't blame people for not wanting to take the risk. Personally I'm just putting up a page with answers to their self-assessment risk questionnaire for each of my hosted services (I have a surprising number that could technically come under OSA) and hoping that is good enough.
That's quite sizeable. How many sites can you name that have 7 million monthly active UK users? That's more than one in ten of every man, woman and child in the UK using your site every month.
Rather than shut it down, would it be possible to sell the forum to someone in the US for a little bit of money, like $20 or something?
Idea being the US-based owner migrates the DB with posts and user logins to servers hosted on US soil, then if the UK government comes knocking the former owners in the UK can say "Sorry it doesn't belong to us anymore, we sold it, here's the Paypal receipt." (Ideally they'd sell the domain too, but as long as you still have the DB you could always host the forum at a different domain.)
Any forum admins here willing to add another forum to their portfolio?
It's clear this law terribly affects bona fide grassroots online communities. I hope HN doesn't start geoblocking the UK away!
But then online hate and radicalization really is a thing. What do you do about it? Facebook seems overflowing with it, and their moderators can't keep up with the flow, nor can their mental health keep up. So it's real and it's going to surface somewhere.
At some level, I think it's reasonable that online spaces take some responsibility for staying clear of eg hate speech. But I'm not sure how you match that with the fundamental freedom of the Internet.
You don't. "Hate speech" is code for "the government knows better and controls what you say."
Yes, racism exists and people say hateful things.
Hate speech is in the interpretation. The US has it right with the First Amendment: speech has to be egregiously over the line to be illegal, there are exceptions in all sorts of cases, and it's almost always a case-by-case determination.
Hateful things said by people being hateful is a culture problem, not a government problem. Locking people up because other people are offended by memes or shitposts is draconian, authoritarian, dystopian nonsense and makes a mockery of any claims about democracy or freedom. Europe and the UK seem hell-bent on silencing the people they should be talking with and to. The inevitable eventual blowback will only get worse if stifling, suppressing, and prosecuting is your answer to frustrations and legitimate issues felt deeply but badly articulated.
How so? This is just the UK. While the UK really does want to enforce this globally, it has no enforcement power against non-UK citizens who do not reside in the UK.
Certainly it's possible (and perhaps likely!) that the EU and US will want to copycat this kind of law, but until that happens, I think your alarm is a bit of an overreaction.
I sympathize with the operators of these forums of course -- the UK Online Safety Act is poorly conceived.
HOWEVER.
Deleting their forums?
"The act will require a vast amount of work to be done on behalf of the Forums and there is no-one left with the availability to do it." [1]
This is a false dichotomy. Put Cloudflare in front of the site, block UK traffic [2], and you're done. A five-minute job.
[1] https://forums.hexus.net/hexus-news/426608-looks-like-end-he...
[2] https://developers.cloudflare.com/waf/custom-rules/use-cases...
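For sites proxied through Cloudflare, the same decision can also be enforced (or double-checked) at the origin, since Cloudflare adds a CF-IPCountry header to proxied requests. A minimal sketch, assuming the site sits behind Cloudflare and that header can be trusted (the function name and blocked-country set are illustrative):

```python
# Sketch: origin-side UK geoblock using Cloudflare's CF-IPCountry
# header ("GB" for the United Kingdom). HTTP 451 is the status code
# for content unavailable for legal reasons.

BLOCKED_COUNTRIES = {"GB"}

def check_request(headers: dict):
    """Return (status, body) for a request with the given headers."""
    country = headers.get("CF-IPCountry", "XX")  # "XX" = unknown country
    if country in BLOCKED_COUNTRIES:
        return 451, "Unavailable for legal reasons"
    return 200, "OK"
```

A Cloudflare WAF rule matching the request country at the edge would do the blocking before traffic ever reaches the server; the origin check is just a belt-and-braces fallback.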
Wow, UK has these crazy laws too? The German hate speech laws made headlines a week or so ago (https://www.cbsnews.com/news/germany-online-hate-speech-pros...). They'll confiscate your electronics if you insult someone and they actively monitor the Internet for prohibited speech.
So sites will geoblock the UK and users will use VPN software. Ugh. More software layers, more waste. Yet another problem solved by adding a layer of indirection.
Fear/risk is at work here. Government by clear guidance, not guesswork, is needed. The word "unlikely" is doing too much lifting in the guidance. OFCOM needs to offer hard clarity, with the kind of detail that would satisfy lawyers. The OSB is sound in its aims, a fumbled hot potato in its long, long discussion, a hash of an implementation, and the explication/communication is a regurgitated dog's dinner. Normally our gov communication is very good. Why can't OFCOM write? I guess we all know any forum with more than a few members likely already has software and some basic policy settings to do this. Unclear guidance is making operators jumpy and afraid.
An opportunity for anyone with a transformer from "UK.GOV Hand-waving" -> forum_settings.json
Reality: the forum has net negative 358 posts in the last month, and net negative ~2k posts over the last 12 months -- it is so inactive that posts are being deleted faster than they are created. 8 people have created accounts in the last year.
The forum has been long dead.
I've been working with OFCOM on implementing the requirements of this act. They seem reasonable, and what they are looking for is mostly table stakes. That said, I wouldn't want to live in or run a UGC business in the UK right now.
The State of Utopia has published this report on the source of funding of Ofcom, the U.K. statutory regulator responsible for enforcing the Online Safety Act:
https://medium.com/@rviragh/ofcom-and-the-online-safety-act-...
(In short, it is funded by the regulated tech companies, which must pay fees to it.)
Could someone please shed any light on why simply geoblocking the UK in its entirety would not be sufficient for an average forum to avoid having to deal with the Act?
A lot of US websites initially geoblocked EU to avoid dealing with GDPR, for example.
This is a major blow to non-profit communities. It also means that only for-profit operators will find it worthwhile to maintain such platforms, which in itself contradicts the stated purpose of the act.