item 47095863

javascriptfan69 | 10 days ago

I genuinely think we will look back at the algorithmic content feed as being on par with leaded gasoline or cigarettes in terms of societal harm.

Maybe worse since it is engineered to be as addictive as possible down to an individual level.

Then again maybe I'm being too optimistic that it will be fixed before it destroys us.

discuss

blibble | 10 days ago

I think it's worse, cigarettes never threatened democracy

the solution is real easy, section 230 should not apply if there's a recommendation algorithm involved

treat the company as a traditional publisher

because they are, they're editorialising by selecting the content

vs, say, the old style facebook wall (a raw feed from user's friends), which should qualify for section 230
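to make the distinction concrete, here's a toy sketch (python, every name here is made up, obviously nothing like a real platform's implementation):

```python
# Toy contrast between the two feed types discussed above.
# All field names and scoring signals are hypothetical.

def chronological_feed(posts, friends):
    """Old-style wall: only posts from accounts you actually follow,
    newest first. The platform makes no editorial selection."""
    return sorted(
        (p for p in posts if p["author"] in friends),
        key=lambda p: p["timestamp"],
        reverse=True,
    )

def recommended_feed(posts, engagement_model):
    """Algorithmic feed: the platform scores *any* post by predicted
    engagement and decides what you see -- an editorial choice."""
    return sorted(posts, key=engagement_model, reverse=True)
```

the point being: in the first function the output is fully determined by the user's own choices (who they follow), in the second the platform's model picks the content.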

carefulfungi | 10 days ago

> cigarettes never threatened democracy

Off topic, but I bet a book on tobacco cultivation/history would be fascinating. Tobacco cultivation relied on the slave labor of millions and the global tobacco market influenced Jefferson and other American revolutionaries (who were seeing their wealth threatened). I've also read that Spain treated sharing seeds as punishable by death? The rare contrast that makes Monsanto look enlightened!

jballanc | 10 days ago

The problem with this is that section 230 was specifically created to promote editorializing. Before section 230, online platforms were loath to engage in any moderation, because they feared that even a hint of moderation would tip them over into the realm of "publisher", where they could be held liable for the veracity of the content they published. Given the choice between no moderation at all and full editorial responsibility, many of the early internet platforms would have chosen no moderation (as full editorial responsibility would have been cost prohibitive).

In other words, that filter that keeps Nazis, child predators, doxing, etc. off your favorite platform only exists because of section 230.

Now, one could argue that the biggest platforms (Meta, YouTube, etc.) can, at this point, afford the cost of full editorial responsibility, but repealing section 230 under this logic only serves to put up a barrier to entry for any smaller competitor that might dislodge these platforms from their high, and lucrative, perch. I used to believe that the better fix would be to amend section 230 to shield filtering/removal but not selective promotion; TikTok, however, has shown (rather cleverly) that selective filtering/removal can be just as effective as selective promotion of content.

jcgrillo | 10 days ago

> As interpreted by some courts, this language preserves immunity for some editorial changes to third-party content but does not allow a service provider to "materially contribute" to the unlawful information underlying a legal claim. Under the material contribution test, a provider loses immunity if it is responsible for what makes the displayed content illegal.[1]

I'm not a lawyer, but idk that seems pretty clear cut. If you, the provider, run some program which does illegal shit then 230 don't cover your ass.

[1] https://www.congress.gov/crs-product/IF12584

ZeroGravitas | 10 days ago

You can draw a fairly clear line from the corporate response to cigarettes being regulated through to the strategy for climate change and social media/crypto etc.

The Republicans are basically a coalition of corporate interests that want to get you addicted to stuff that will make you poor and unhealthy, and to undermine any collective attempt to help.

The previous vice-president claimed cigarettes don't give you cancer, and the current president thinks wind turbines and the health problems caused by asbestos are both hoaxes. This is not a coincidence.

The two big times the Supreme Court flexed their powers were to shut down cigarette regulation by the FDA and Obama's Clean Power Plan. Again, not a coincidence.

toss1 | 9 days ago

THIS, EXACTLY!

If there is an algorithm, the social media platform is exactly as responsible for the content as any publisher.

If it is only a straight chronological feed of posts by actually followed accts, the social media platform gets Section 230 protections.

The social media platforms have gamed the law: they got legitimate protection for what their users post, but then manipulate that content to their advantage more than any publisher would.

>>the solution is real easy, section 230 should not apply if there's an recommendation algorithm involved

>>treat the company as a traditional publisher

>>because they are, they're editorialising by selecting the content

>>vs, say, the old style facebook wall (a raw feed from user's friends), which should qualify for section 230

hiddencost | 10 days ago

They fought a civil war over the labor required to produce tobacco.

quotemstr | 10 days ago

Social media cannot "threaten democracy". Democracy means that we transfer power to those who get the most votes.

There's nothing more anti-democratic than deciding that some votes don't count because the people casting them heard words you didn't like.

The kind of person to whom the concept of feed ranking threatening democracy is even a logical thought believes the role of the public is to rubber stamp policies a small group decides are best. If the public hears unapproved words, it might have unapproved thoughts, vote for unapproved parties, and set unapproved policy. Can't have that.

cruffle_duffle | 10 days ago

> never threatened democracy

The beautiful part is how non-partisan this is. It cooks all minds regardless of tribe.

mort96 | 10 days ago

Why change section 230? You could just make personalized algorithmic feeds optimized for engagement illegal instead, couldn't you? What advantage is there to messing with 230? Wouldn't the result be the same in practice?

BoingBoomTschak | 10 days ago

If your tree is so weak that a single breeze can knock it over, why blame the wind? Disclaimer: I hate social media of all kinds, it's just that you're missing the forest for the trees.

aix1 | 10 days ago

> we will look back at the algorithmic content feed as being on par with leaded gasoline or cigarettes in terms of societal harm

I agree 100%.

However, I think the core issue is not the use of an algorithm to recommend or even to show stuff.

I think the issue is that the algorithm is optimized for the interests of the platform (max engagement => max ad revenue) and not for the interests of the user (happiness, delight, however you want to frame it).

And there's way too much of this, everywhere.
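To make "optimized for whom" concrete, here's a toy sketch; every signal and weight is invented for illustration, real ranking systems are vastly more complex:

```python
# Two hypothetical ranking objectives over the same posts.
# All signals and weights are made up to illustrate the point.

def platform_score(post):
    # Optimizes the platform's interest: predicted engagement,
    # which is what ad revenue scales with.
    return 3.0 * post["predicted_clicks"] + 2.0 * post["predicted_watch_time"]

def user_score(post):
    # Optimizes the user's interest: reward content the user is
    # glad they saw, penalize time they later regret.
    return post["predicted_satisfaction"] - 2.0 * post["predicted_regret"]

def rank(posts, score):
    # Same ranking machinery either way; only the objective differs.
    return sorted(posts, key=score, reverse=True)
```

The machinery is identical in both cases, which is the point: nothing about "an algorithm" forces the first objective over the second.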

randomNumber7 | 10 days ago

We live in a society that only values money, so why should anyone optimise for anything else?

idiotsecant | 10 days ago

If anything the algorithmic dopamine drip is just getting started. We haven't even entered the era of intensely personalized ai-driven individual influence campaigns. The billboard is just a billboard right now, but it won't be long before the billboard knows the most effective way to emotionally influence you and executes it perfectly. The algorithm is mostly still in your phone.

That's not where it stops.

alfiedotwtf | 10 days ago

It’s crazy (but true) to think that by slowly manipulating someone’s feed, Zuck and Musk could convert people’s religions, political leanings, personal values, etc. with little work. In fact, I would be surprised if there were NOT some part of Facebook and Twitter’s admin or support tooling where a user’s “preferences” could be modified, e.g. “over the next 8 months, convert the user to a staunch evangelical Christian”.

cluckindan | 10 days ago

FB was always conversion as a service

timacles | 10 days ago

Yeah, it might not ever get fixed. It is the perfect tool for mass influence and surveillance of the people. The powers that be would never let it go.

PantaloonFlames | 10 days ago

It's literally why Leon bought Twitter. A mass influence vehicle.