
Supreme Court allows Reddit mods to anonymously defend Section 230

230 points | taubek | 3 years ago | arstechnica.com | reply

310 comments

[+] codingdave|3 years ago|reply
To be clear, from the article: "Reddit received special permission from the Supreme Court to include anonymous comments from Reddit mods in its brief."

This is not saying the court decided anything - it is just saying it is allowing commentary from mods to be submitted to the court, without knowing their identity.

[+] restes|3 years ago|reply
Since the mods’ handles are known, isn’t this pseudonymous rather than anonymous?
[+] KRAKRISMOTT|3 years ago|reply
If things go wrong and contempt of court charges need to be issued, will Reddit be subpoenaed for the mods identities?
[+] dehrmann|3 years ago|reply
That matches what the title says
[+] 8note|3 years ago|reply
Does that mean I can get a comment memorialized with the supreme court as a Reddit mod?
[+] loeg|3 years ago|reply
It's Ars Technica. If the headline wasn't misleading, they wouldn't publish it. They are among the worst publications to regularly show up on this site.
[+] Bystander22|3 years ago|reply
There were some interesting observations on r/law, particularly this one: https://www.reddit.com/r/law/comments/10h9vju/supreme_court_...

From there:

>The issue has been muddied by sites like reddit who have been keen to play up outlandish possibilities of individual users or volunteer moderators becoming liable, which has never been a likely outcome of this case. The bigger and more realistic threat to a site like reddit is the possibility that actions taken by tools like automoderators, slur filters, or recommendation algorithms (e.g., sorting by "hot") might become legally analogous to editorial decisions.

>Because those tools are sometimes set up by moderators and influenced by user actions (e.g. voting/reporting), there are sort of fringe or edge-case scenarios where the lines could potentially blur between algorithmic policies and user/moderator actions. But we as users don't really need to worry too much about every conceivable edge-case legal theory, because a site like reddit would presumably be incentivized to remove or disable any tools that could create such a liability, to protect Reddit's own self-interest.

>The algorithms that keep people clicking/viewing/refreshing the site are critical to the business interests of sites like Reddit and Youtube. It's really important for reddit's bottom line to have broad latitude to gamify user engagement by showing more of what will keep people on reddit longer and more-frequently. That's a less flattering PR angle than playing up the possibility that reddit users or mods could get in legal trouble.

>It's not so much that there is no possible way that any ramification of this case could ever put a user or a mod of a site like reddit in any jeopardy in any conceivable scenario...It's more like, sites like Reddit have a lot to lose if their algorithmic recommendations should become legally analogous to editorial decisions.

[+] jedberg|3 years ago|reply
You know how the lawyers get upset when we engineers practice armchair lawyering? This is like the reverse of that.

Hot is not a recommendation algorithm -- everyone has the same hot list. It's literally just a sort by votes.

In fact, almost nothing reddit does is custom to the user. It's all based on votes and other user actions. The only thing custom is the recommended sort, which anyone can turn off, by choosing the other sorts.

Saying that reddit is algorithmic would be akin to saying that voting for President is an "algorithm" because it adds up user votes and is biased because the voters are biased.
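The claim that "hot" is the same deterministic, vote-driven sort for every viewer matches the ranking function from reddit's old open-source codebase; a simplified sketch of it (constants approximate, from memory, not an authoritative copy):

```python
from datetime import datetime, timezone
from math import log10

# reddit's historical epoch, used so newer posts get a larger time term.
EPOCH = datetime(2005, 12, 8, 7, 46, 43, tzinfo=timezone.utc)

def epoch_seconds(date):
    return (date - EPOCH).total_seconds()

def hot(ups, downs, date):
    """Score a post from its vote totals and submission time only.

    Note there is no per-viewer input: every user sees the same order.
    """
    s = ups - downs
    order = log10(max(abs(s), 1))          # vote magnitude, log-scaled
    sign = 1 if s > 0 else -1 if s < 0 else 0
    return round(sign * order + epoch_seconds(date) / 45000, 7)
```

With this scoring, ten times the net votes adds a fixed increment, while roughly half a day of recency outweighs it, which is why the front page turns over.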

[+] nonethewiser|3 years ago|reply
> automoderators, slur filters, or recommendation algorithms

These are nothing if not editorial decisions

[+] DannyBee|3 years ago|reply
FWIW - The supreme court generally will allow just about any amicus brief.

So this is not particularly special in that respect.

[+] willnonya|3 years ago|reply
While I support upholding section 230, I find irony in reddit proclaiming its support for users and everyday citizens. The majority of the site is overrun by zealots and trolls who relentlessly punish any non-conformance with their chosen orthodoxy. Their moderation rarely maintains communities; rather, it restricts discussion, dissent, and freedom of expression.

This is like the Stasi proclaiming support for privacy laws.

[+] llanowarelves|3 years ago|reply
A while back, somebody made a moderator graph that showed which subreddits each individual moderator moderated, across dozens and dozens of subreddits. You could see their "web".

It not only confirmed the leanings and behavior, it confirmed them undeniably.

It is why "redditor" exists as an insulting term, but fb/twitter/etc don't have one. Not necessarily just the users, but the mods too.

[+] paulpauper|3 years ago|reply
This was way worse during Covid, I know, even though it was already very bad before Covid. People were banned just for asking questions, merely for dissenting from the narrative, which was constantly changing, as with masks.
[+] nobody9999|3 years ago|reply
>While I support upholding section 230 I find irony in reddit proclaiming their support for users and everyday citizens. The majority of the site is overrun by zealots and trolls who relentlessly punish any non-conformance with their chosen orthodoxy. Their moderation rarely maintains communities but rather restricts discussion, dissent and freedom of expression.

So what?

There's nothing in Section 230 or any other US law (this is a US case, so I'll focus there only) that makes being a "zealot" or a "troll" or "relentlessly punish[ing] any non-conformance with their chosen orthodoxy" or "restrict[ing] discussion, dissent and freedom of expression" in a private actor's[0] forum a criminal act or (with the exception of defamation[1]) a civil tort.

Don't like what folks say or how they moderate? Ignore them. Or sue them (most likely a waste of time and money, but don't let that stop you).

[0] https://www.theverge.com/2019/6/17/18682099/supreme-court-ru...

[1] https://www.law.cornell.edu/wex/defamation

[+] berjin|3 years ago|reply
If anything there are not enough trolls and devil's advocates. It's a total hive mind. If opinion is split 40/60 on a contentious topic, users only get to see the prominent opinions from the slight majority. Because voting is displayed as a sum total rather than separate up/down counts, users can't tell how contentious something is. Facebook is the opposite, where mostly controversial comments are shown. Both suck.
[+] ilyt|3 years ago|reply
That's a bit of an overstatement, it varies heavily from subreddit to subreddit.
[+] srj|3 years ago|reply
If you haven't already I recommend reading the text of section 230. It's very short and takes only a minute or two: https://www.law.cornell.edu/uscode/text/47/230
[+] badrabbit|3 years ago|reply
I don't think it should be repealed, but its protections should only apply to users and content providers acting in good faith, with a best effort to proactively prevent harmful content. Take revenge porn as a moderate example: a porn site that does not verify the provenance of the content and the validity of the submitter, or users who knowingly upvote or positively comment, should not be protected. There needs to be an incentive beyond the goodwill of site owners and users. Look at Twitter, with Elon changing policy to allow harmful content while reducing its reach. He is able to do that because of this law.
[+] lucb1e|3 years ago|reply
I'm apparently out of the loop enough that I understand all the words in the article but still have no idea what it's talking about. From context, I gather it's something to do with volunteering and needing anonymity and immunity for when you accidentally allow terrorists to recruit followers? And there already exists a law, Section 230, for this, but the lawsuit tries to get it declared invalid? Can someone share a short comment on what is actually going on here?
[+] wolverine876|3 years ago|reply
I'm concerned about the US Supreme Court's ability to handle a subtle, technical issue tied to new media. They need to take into account all the stakeholders, rights, law, and the dynamics of social media, and come up with an innovative solution that meets all needs fairly and clearly.

I wonder how many of them even use social media, and I'm especially concerned that the Court is now oriented toward, and many members selected for, partisanship. They are there to find partisan advantage in rulings, not to be legal geniuses with a deep commitment to and knowledge of justice and fairness, with deep judicial temperament; they are not there as Solomons. That puts them at a loss for complex issues, especially unfamiliar ones, though I'm sure they will find a partisan angle.

Whatever your politics: The reactionary conservative movement, with its campaign to politicize everything (now working on the FBI and Department of Justice, for example), has permanently degraded the country; we won't have these institutions back for generations. People don't want to face the loss, but it's already happened and continues to worsen before our eyes.

EDIT: People who support politicization (or corruption or disinformation or other damaging behavior) argue to normalize it - it's always that way, everyone does it, it's unavoidable, it's 'human nature'. I have warmongers now telling me that it's inevitable human nature. But that's not the case; we can have meaningfully less or more partisanship (especially in courts), corruption, disinformation, and warfare. I can see it with my own eyes now; I was here before 2016, and I know about other places and times and people. It's a bunch of nonsense and everybody knows it.

We control our fate, through knowledge and reason, through a collective commitment to good. Our predecessors did it, without the institutions and mechanisms and knowledge they bequeathed to us. With our inheritance, we couldn't have it easier; what are we bequeathing to the next generation? Despair? Corruption and war? What a shame that would be, with all we were given.

[+] tptacek|3 years ago|reply
Our court system routinely handles far more technical issues than online publishing. Like every modern country, we regulate everything from water reclamation to aviation.
[+] golemotron|3 years ago|reply
> I'm concerned about the US Supreme Court's ability to handle a subtle, technical issue tied to new media. They need to take into account all the stakeholders, rights, law, and the dynamics of social media, and come up with an innovative solution that meets all needs fairly and clearly.

Nope. That's what legislators do.

The Supreme Court's job is to decide whether safe harbor in Section 230 applies to companies when they are exercising editorial control.

[+] xyzzyz|3 years ago|reply
What do you mean that it is now oriented towards partisanship? This has been true for at least a century. The recent Dobbs decision is as much the result of partisan efforts to institute policy opposed by the majority of the country as the original Roe decision was.

The court has handled complex technical issues many times before. This, along with partisanship, is nothing new.

[+] jliptzin|3 years ago|reply
Today everyone has instant access to virtually all human knowledge at all times in the palms of their hands yet somehow we are less informed and more susceptible to obvious propaganda than prior generations.
[+] vxNsr|3 years ago|reply
Pretending that the court only became politicized in 2016 is the most naive retconning I’ve seen on HN.

The progressive movement intentionally politicized the court in the 1960s against the strong warnings of the most ardent conservative the court has ever had. They went on to have a heyday of progressive wins. Now it’s the time for the conservatives’ revenge.

Speaking of other institutions that were politicized pre-2016: Obama deputized the IRS against conservative orgs and then had his lackeys destroy the evidence (failed HDDs, anyone?). He used the FBI like his personal police force to go after his strongest political enemies. Pretending for a second that it's the conservatives who are politicizing anything is just pure alternative history, aka fiction.

[+] jlawson|3 years ago|reply
>The reactionary conservative movement, with its campaign to politicize everything

I'm sorry but which movement came up with the slogan 'the personal is political'?

Which one has entire academic departments dedicated to 'problematizing' everything from the skin color of LOTR orcs to dog walking?

[+] LinuxBender|3 years ago|reply
RFC 2119 agrees [1], as does US law [2], with the exception of Illinois apparently. It seems to be ill-defined and not preferred in the UK, sometimes deemed inappropriate [3]. Perhaps the UK will interpret it as per RFC 6919 [4] instead.

1. MUST This word, or the terms "REQUIRED" or "SHALL", mean that the definition is an absolute requirement of the specification.

[1] - https://www.rfc-editor.org/rfc/rfc2119

[2] - https://www.law.cornell.edu/wex/shall

[3] - https://www.law-office.co.uk/art_shall-1.htm

[4] - https://www.rfc-editor.org/rfc/rfc6919

[+] layer8|3 years ago|reply
"The issue of recommendations arises in this case because the complaint alleges the defendants were recommending ISIS terrorist recruiting videos, which under certain circumstances could give rise to liability under the Anti-Terrorist Act,"

It seems to me there is an issue with the notion of “recommendation” here. There needs to be a difference between an intent to promote specific content, vs. a content-agnostic algorithm that just lists what “other users who liked what you liked also liked”. If (say) some Reddit moderators promote terrorist content, they should certainly be liable (or the editorial rules that require them to do so, if any). But when Reddit or YouTube shows you stuff based purely on statistical correlations of what you and other users viewed/liked in the past, then that mechanism in itself shouldn’t be an issue.
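The content-agnostic mechanism described above can be sketched as pure co-occurrence counting over user likes, with no knowledge of what any item contains (a toy illustration; all names and data are made up):

```python
from collections import Counter

def co_liked(user_likes, item):
    """Rank other items by how often they were liked alongside `item`.

    The function never inspects item content -- only correlations in
    user behavior -- which is the distinction the parent comment draws.
    """
    counts = Counter()
    for likes in user_likes.values():
        if item in likes:
            counts.update(i for i in likes if i != item)
    return [i for i, _ in counts.most_common()]

likes = {
    "u1": {"a", "b", "c"},
    "u2": {"a", "b"},
    "u3": {"a", "c"},
    "u4": {"b", "d"},
}
# "Users who liked a also liked": b and c (each co-liked twice); d never
# co-occurs with a, so it is never recommended.
```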

[+] zwkrt|3 years ago|reply
This doesn’t make any sense at all to me. If the video itself is a problem, it shouldn’t matter if it was recommended by a human with human intents or by an algorithm designed with capitalistic intent.

It would be hard to convince me that the intent of the system that recommended the video should be heavily considered. At least in the US we have a highly consequentialist legal system.

[+] sytelus|3 years ago|reply
I highly doubt this specific court will make decisions based on what's good for users or society. They are dead set on the philosophy that if something is really that important for users or society, then lawmakers should be enacting laws, as opposed to the Supreme Court sneaking in protections. So these people are pursuing the wrong line of argument and will likely fail. They should instead pursue a rigorous argument that the original intent of Section 230 was to provide the protection they are asking for.
[+] kjellsbells|3 years ago|reply
I feel that if you are going to brief the Supreme Court, you ought to come out of the shadows. But ok, let's let that pass. Perhaps the more intriguing issue with Reddit mods is that the really far-reaching subreddits are controlled by so few people, and that moderation control is essentially for sale. See this old thread from HN:

https://news.ycombinator.com/item?id=23173018

It seems to me then that Reddit would strongly prefer not to pierce the veil of anonymity behind moderation because it will lead to some very uncomfortable questions for a business that has a perennial hope of IPO-ing in 2023.

[+] rootsudo|3 years ago|reply
So, reddit supports reddit mods, that offer free, uncompensated labor for subreddit communities.

And no one questions that, or how subreddits can skew opinion, or the possible conflict of interest when some mods directly refer to or support paid content while masquerading as unrelated to the product.

Or, mods that dictate art isn't art, or that an artist didn't create something.

As much as I do enjoy how the Internet flourished because of 230, I don't enjoy how there is no transparency around someone who can shape the content of an online community, especially when they forward their own opinions or pursue their own agenda.

I don't see how the above issue of Reddit mods' conflicts of interest is not a 230 issue.

[+] Sunspark|3 years ago|reply
There are some great Reddit mods, but there are also some truly awful ones and the corporation does not have a proper arbitration process for managing things.

A fictional example scenario, but entirely plausible, as variations of this do play out: on a computing sub you could write "I like Windows" and the Apple Mac-loving Reddit mod could immediately ban you. No rules were broken; you just expressed an opinion contrary to theirs, and that is it. So if you wish to continue participating in that sub, you would need to generate and use an alt account.

Reddit threatens that having more than 1 account is against policy. Ignore this. You are the product on a free platform that generates corporate revenue through ads and selling digital awards, etc. If you do not engage with the platform by putting up posts and writing comments, there is less incentive for others to visit as well, and ad impressions will diminish, revenue will diminish, etc. They will not actively seek to prevent access to the site if you are not breaking any laws or upsetting users. Do not get invested in accounts; were you to die tomorrow, nobody at all would remember you or care about anything you wrote there.

It is negligent on the part of Reddit to not have a proper arbitration process to grieve improper content moderation on the part of mods.

So yes, Reddit absolutely does take an active and direct hand in promoting the visibility of anything on the platform and should not be exempt from section 230. I am active in a sub that every day sees a lot of posts I find interesting deleted by the mods. They are just curating. Sometimes things are deleted for the dumbest of reasons.

Corporate interests come into play too. I remember the other month, when Kanye said that Kim Kardashian and CP3 got together while both were married, the NBA Reddit mods spent over 24 hours ACTIVELY deleting every single post mentioning or linking to that. It was certainly basketball news, and it was certainly salacious. Why was this happening? Good question! Was the suppression due to an order from the NBA? Was it under orders from Reddit corporate? Was it simply a group of Reddit mods working overnight in a coffee shop deleting posts? It was far too targeted, and for too long a time period, to not be an active attempt at speech suppression, until the story was already out on too many other news sites, at which point an "approved" site like TMZ was allowed through, where they could presumably get ad-click impressions from the diminished traffic to their story about it. They should tell the Supreme Court who gave the orders to suppress the Kanye story on a sub with 6 million+ subscribed accounts.

You see this news-story preference on other subs too, where some sites seemingly often have their links given preferential treatment and others do not get through. Why? Is there a kickback? I don't know, but stories that come through are worth money, as traffic is directed.

I love old Reddit, but I despise how it was set up to have little anonymous dictators for life seemingly entrenched forever in their little fiefdoms. No elections, no votes, no recourse other than having more than 1 account.

If Reddit is serious about wanting exemption from section 230, then if they want to be a social commons with community moderation they need to implement an arbitration process OR allow users to hold elections on which mods they want to represent them for fixed terms.

Dictators-for-life from anonymous mods (who also have alt accounts and are probably Reddit employees on the largest subs) is not it.

The anonymous mods giving Supreme Court testimony should state whether or not they are now, or have ever been an employee of Reddit or its investors or associated companies.

[+] DharmaPolice|3 years ago|reply
I don't know anything about /r/nba, but from observations of other subs it seems just as likely that a few mods had a private discussion, asked "Should we allow this?", decided no, and then just went ahead and tried to enforce that. Once they've come to a decision not to allow something, they're not going to let it pass just because a lot of people are reposting it. If you give up after deleting the first 100 threads, it makes you look pretty stupid.

On /r/soccer, what's happened a few times is that some story will come up like the one you mentioned, every single thread on the topic is deleted, and then a day or so later there will be an announcement that they had another discussion and have changed their minds, so one or more threads will be allowed. There was a similar thing on /r/worldnews a few years ago when there was a spate of sexual assaults in Cologne at a Christmas/New Year parade. Initially all threads on this were removed on the grounds that it wasn't really world news, but since it made the front page of many international papers and was the main story on the BBC, etc., that initial judgement looked pretty stupid and threads were allowed.

I mean, it could be "kickbacks" driving this thing but honestly a lot of it is easy to explain without recourse to that. It's still arbitrary but not necessarily corrupt.

[+] malwarebytess|3 years ago|reply
> if Section 230 is weakened by the court. Unlike other companies that hire content moderators, the content that Reddit displays is “primarily driven by humans—not by centralized algorithms.”

Not true. Reddit mods primarily enforce reddit's rules. These rules have grown increasingly onerous over time; there is no more restrictive ruleset on reddit than the one imposed by reddit inc. itself. This includes moderation by automoderation scripts written and enabled by volunteer moderators, at reddit's behest (and their own).

Indeed, if reddit moderators do not enforce the rules of reddit then they will lose their positions as moderators and likely be banned as well.
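The automoderation scripts mentioned above are, on the real site, rule configurations rather than hand-written code; a toy Python sketch of the kind of keyword rule a volunteer mod might enable (rule names and patterns are hypothetical, not actual AutoModerator syntax):

```python
import re

# Hypothetical rules of the sort a mod team might configure: a slur
# filter that removes, and a link-spam heuristic that merely reports.
RULES = [
    {"name": "slur-filter",
     "pattern": re.compile(r"\b(badword1|badword2)\b", re.IGNORECASE),
     "action": "remove"},
    {"name": "link-spam",
     "pattern": re.compile(r"(https?://\S+[\s\S]*?){4,}"),
     "action": "report"},
]

def check(comment):
    """Return the list of actions triggered by a comment ([] if it passes)."""
    return [r["action"] for r in RULES if r["pattern"].search(comment)]
```

The legal question the thread circles is whether running a filter like this is "primarily driven by humans" (the mod who enabled the rule) or an algorithmic editorial act.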

[+] xbar|3 years ago|reply
I fear a ruling where the interpretation of Section 230 puts dang at risk of liability for all of his necessary and appropriate historic moderation.
[+] PragmaticPulp|3 years ago|reply
It is interesting to read HN comments demanding more laws, regulations, and restrictions on how social media operates. Usually the commenters are unaware that HN is itself social media.

HN has algorithmic ranking, it has invisible moderation, it has shadowbanning (sort of), and it has YC sponsored ads injected into the “feed” that get special treatment relative to user submissions.
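For reference, the commonly cited approximation of HN's story ranking (the real implementation includes moderation penalties and is not public) is just points decayed by age:

```python
def hn_rank(points, age_hours, gravity=1.8):
    """Approximate HN front-page score: votes divided by an age penalty.

    The gravity exponent and the +2 offset are the commonly cited
    values, not confirmed internals.
    """
    return (points - 1) / (age_hours + 2) ** gravity
```

Even this simple formula is "algorithmic ranking" in the sense the regulation proposals use the term, which is the commenter's point.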

Many regulation proposals seem to have carve-outs for sites and networks below a certain size, but if one passed without such exceptions, a lot of the community sites we know and love would have no choice but to shut down.

I suspect a lot of the proponents of these regulations aren't really interested in seeing the sites they like subjected to them. It has almost become a talking point aimed at punishing the social media companies they don't like other people using.

[+] dragonwriter|3 years ago|reply
Weakening Section 230 doesn't just put dang at risk; it puts at risk every HN user who takes an action that alters the visibility of content, like flagging.
[+] shadowgovt|3 years ago|reply
This Court case largely isn't about that; the protections for moderation Section 230 grants are largely not in question here.

The question is whether an automated algorithm is protected by 230 in the same sense that manual moderation is. To the extent this might impact HN, it'd be more along the lines of "HN weights stories too heavily by (upvotes, time of post, some other metric) and as a result harm has occurred."

[+] andsoitis|3 years ago|reply
Rules will probably be limited to communities over a certain size, larger than HN's, I would bet.
[+] prettyStandard|3 years ago|reply
I'm pretty new here. Can you elaborate? Edit: On the "necessary and appropriate historic moderation" part.
[+] squokko|3 years ago|reply
This is interesting because it provides a very clear geopolitical incentive for actors to acquire control of Reddit moderatorships - an anonymous and untraceable way to influence a Supreme Court decision.
[+] CamelCaseName|3 years ago|reply
Reddit isn't required to include every moderator's comments in their brief...
[+] BuckyBeaver|3 years ago|reply
Ironic that Reddit moderators, some of the biggest jagoffs on the Web, might end up helping to protect online freedom.