top | item 39472657


j-james | 2 years ago

I cannot see how BlueSky's moderation system can ever work. Decoupling moderation and hosting means there's no onus to do the moderation they describe, which makes me think it will be BlueSky Inc., and only other corporations, that have the resources to throw employees at a now-thankless, Facebook-style moderation job. And instances have to moderate anyway, in order to not host illegal content.


steveklabnik | 2 years ago

I hear you on some level. That said, we are already seeing people creating blocklists, and tools to share them with others. That is happening alongside the company's investment in paying people to work on T&S related issues on their instance.

I am not sure if it will succeed or fail, but I am interested to see how it plays out.

j-james | 2 years ago

That relies upon the benevolence of corporations to a much greater extent than I am comfortable with. Twenty years of social media has convinced me that that's a bad idea. And, I think, it removes many of the benefits of federation: if the only way to sustainably moderate is to rely upon gifts from BlueSky Inc., moderation will necessarily be dependent upon them.

dorfsmay | 2 years ago

Blacklists feel more like reinforcing the echo chamber than moderation.

shkkmo | 2 years ago

> Decoupling moderation and hosting means there's no onus to do the moderation that they describe:

I'm not sure this follows. There is a similarity to the Reddit model of moderation: the host provides some base amount of moderation, but supplemental moderation comes from members of the community. In the Bluesky model, a 'subreddit' is analogous to an indexer/aggregator (aka Relay/AppView) that provides a moderated and/or weighted feed of content. The same incentives that drive volunteer mods on Reddit will exist for volunteer mods on Bluesky.
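To make the aggregator analogy concrete, here is a minimal sketch of how an AppView-style layer could apply community moderation on top of a raw relay feed. This is an illustration only, not the actual AT Protocol API: the `Post` structure, the blocklist, and the label weights are all hypothetical stand-ins for what a real labeler service would supply.

```python
# Sketch of AppView-style moderation layered on a raw feed (hypothetical
# data model, not the real AT Protocol API): a shared blocklist removes
# authors entirely, while labels merely down-rank posts in the feed.

from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    text: str
    labels: set = field(default_factory=set)  # e.g. {"spam", "nsfw"}

# Hypothetical subscriber-chosen moderation settings.
BLOCKLIST = {"spammer.example"}                 # authors hidden outright
LABEL_WEIGHTS = {"spam": 0.1, "nsfw": 0.5}      # labels that down-rank

def moderated_feed(raw_feed):
    """Drop blocked authors, then sort remaining posts by label weight."""
    visible = [p for p in raw_feed if p.author not in BLOCKLIST]

    def weight(post):
        w = 1.0
        for label in post.labels:
            w *= LABEL_WEIGHTS.get(label, 1.0)
        return w

    return sorted(visible, key=weight, reverse=True)

feed = [
    Post("alice.example", "interesting article"),
    Post("bob.example", "BUY NOW!!!", labels={"spam"}),
    Post("spammer.example", "blocked content"),
]
for post in moderated_feed(feed):
    print(post.author)  # alice.example, then bob.example
```

The point of the design is that the host only stores and relays posts; which blocklist and which label weights apply is chosen by the subscriber, so different communities can see differently moderated views of the same underlying data.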

numpad0 | 2 years ago

One of the difficulties with content moderation is that it has been seized upon by some as a tool for the few to control and shape public opinion to a far narrower degree than the law requires, which is harmful to free speech. I'm not completely sure, but externalizing that part probably mitigates that issue a bit.

The EU is moving towards requiring all social media to obey EU laws, under the loose notion that its laws are the least restrictive and most reasonable. They aren't, and the sum of all ethical standards on Earth is never going to be something very popular, so that notion is nonsense. OTOH, it's perfectly reasonable that content served at scale in a region has to be lawful there; "this content you want removed is lawful in MY country" is sort of nonsense too. So moderation decoupling and, ahem, moderation localization are going to be necessary for social media. I suppose that's where they're going.

timeon | 2 years ago

Interesting that you picked the EU, while sites like Twitter are already blocking or removing content at the request of countries like Turkey, China, or Russia.

hnbad | 2 years ago

Communities are built on shared values and expectations of what is or isn't acceptable conduct. If a guest to your club house starts pooping on the carpet, you throw them out not only because you don't want that happening in your club house, but also because throwing them out demonstrates to everyone else that there are actual consequences to that kind of behavior, letting them feel safe knowing they won't have to worry about it. Bluesky's solution apparently boils down to telling everyone to ignore the poop guy and giving them the option to not see him.

The problem with censorship isn't the enforcement of rules. The problem with censorship is the enforcement of rules that the individual they're enforced upon doesn't agree with. Free speech absolutism on social media is often argued for with appeals to "the town square," but the difference between social media and an actual town square is that if you make a complete ass out of yourself in an actual town square, eventually someone will punch you.