javascriptfan69|10 days ago
Maybe worse since it is engineered to be as addictive as possible down to an individual level.
Then again maybe I'm being too optimistic that it will be fixed before it destroys us.
blibble|10 days ago
the solution is real easy, section 230 should not apply if there's a recommendation algorithm involved
treat the company as a traditional publisher
because they are, they're editorialising by selecting the content
vs, say, the old style facebook wall (a raw feed from user's friends), which should qualify for section 230
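The distinction drawn here, a raw chronological wall versus a feed the platform ranks, can be sketched in a few lines of Python. All post fields and scores below are invented for illustration; this is just the two sort orders side by side:

```python
# Hypothetical post records; field names and values are illustrative assumptions.
posts = [
    {"author": "alice", "text": "hi", "ts": 3, "predicted_engagement": 0.2},
    {"author": "bob",   "text": "yo", "ts": 1, "predicted_engagement": 0.9},
    {"author": "carol", "text": "ok", "ts": 2, "predicted_engagement": 0.5},
]

def chronological_feed(posts):
    """Raw reverse-chronological wall: the platform merely relays
    what followed accounts posted, newest first."""
    return sorted(posts, key=lambda p: p["ts"], reverse=True)

def recommended_feed(posts):
    """Ranked feed: the platform decides what to surface based on its
    own model's score -- the 'editorialising by selecting the content'."""
    return sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

print([p["author"] for p in chronological_feed(posts)])  # ['alice', 'carol', 'bob']
print([p["author"] for p in recommended_feed(posts)])    # ['bob', 'carol', 'alice']
```

Same inputs, different orderings: only the second one reflects an editorial choice by the platform.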
carefulfungi|10 days ago
Off topic, but I bet a book on tobacco cultivation/history would be fascinating. Tobacco cultivation relied on the slave labor of millions and the global tobacco market influenced Jefferson and other American revolutionaries (who were seeing their wealth threatened). I've also read that Spain treated sharing seeds as punishable by death? The rare contrast that makes Monsanto look enlightened!
jballanc|10 days ago
In other words, that filter that keeps Nazis, child predators, doxing, etc. off your favorite platform only exists because of section 230.
Now, one could argue that the biggest platforms (Meta, Youtube, etc.) can, at this point, afford the cost of full editorial responsibility, but repealing section 230 under this logic only serves to put up a barrier to entry to any smaller competitor that might dislodge these platforms from their high, and lucrative, perch. I used to believe that the better fix would be to amend section 230 to shield filtering/removal, but not selective promotion, but TikTok has shown (rather cleverly) that selective filtering/removal can be just as effective as selective promotion of content.
jcgrillo|10 days ago
I'm not a lawyer, but idk that seems pretty clear cut. If you, the provider, run some program which does illegal shit then 230 don't cover your ass.
[1] https://www.congress.gov/crs-product/IF12584
ZeroGravitas|10 days ago
The Republicans are basically a coalition of corporate interests that want to get you addicted to stuff that will make you poor and unhealthy, and to undermine any collective attempt to help.
The previous vice-president claimed cigarettes don't give you cancer, and the current president thinks wind turbines and the health problems caused by asbestos are both hoaxes. This is not a coincidence.
Two of the biggest times the Supreme Court flexed its powers were to shut down cigarette regulation by the FDA and Obama's Clean Power Plan. Again, not a coincidence.
toss1|9 days ago
>>the solution is real easy, section 230 should not apply if there's a recommendation algorithm involved
>>treat the company as a traditional publisher
>>because they are, they're editorialising by selecting the content
>>vs, say, the old style facebook wall (a raw feed from user's friends), which should qualify for section 230
If there is an algorithm, the social media platform is exactly as responsible for the content as any publisher.
If it is only a straight chronological feed of posts by actually followed accts, the social media platform gets Section 230 protections.
The social media platforms have gamed the law: they got legitimate protections for/from what their users post, but then manipulate it to their advantage more than any publisher.
quotemstr|10 days ago
There's nothing more anti-democratic than deciding that some votes don't count because the people casting them heard words you didn't like.
Anyone who can even entertain the thought that feed ranking threatens democracy believes the role of the public is to rubber-stamp policies a small group decides are best. If the public hears unapproved words, it might have unapproved thoughts, vote for unapproved parties, and set unapproved policy. Can't have that.
cruffle_duffle|10 days ago
The beautiful part is how non-partisan this is. It cooks all minds regardless of tribe.
aix1|10 days ago
I agree 100%.
However, I think the core issue is not the use of an algorithm to recommend or even to show stuff.
I think the issue is that the algorithm is optimized for the interests of a platform (max engagement => max ad revenue) and not for the interests of a user (happiness, delight, however you want to frame it).
And there's way too much of this, everywhere.
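The misalignment described above, same ranking machinery, different objective function, can be made concrete. The items and signal values below are entirely invented; the point is only that what the feed looks like depends on whose interest the `key` encodes:

```python
# Hypothetical items with platform-side and user-side signals; all numbers invented.
items = [
    {"id": "outrage_clip",  "engagement": 0.95, "user_value": 0.1},
    {"id": "friend_update", "engagement": 0.40, "user_value": 0.9},
    {"id": "ad",            "engagement": 0.70, "user_value": 0.2},
]

def rank(items, objective):
    """Sort items best-first under a given objective function."""
    return sorted(items, key=objective, reverse=True)

# Platform's interest: max engagement => max ad revenue.
platform_feed = rank(items, lambda i: i["engagement"])
# User's interest: happiness/delight, however you want to frame it.
user_feed = rank(items, lambda i: i["user_value"])

print([i["id"] for i in platform_feed])  # ['outrage_clip', 'ad', 'friend_update']
print([i["id"] for i in user_feed])      # ['friend_update', 'ad', 'outrage_clip']
```

The code is identical either way; the feed you actually see is determined by which objective the platform plugs in.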
idiotsecant|10 days ago
That's not where it stops.