> The problem is that Facebook is giving the village idiot a megaphone
While you're not wrong that it's giving the idiot a megaphone, that misses the bigger picture: it's giving _everyone_ a megaphone. The real question is: why can't people discern the difference between the idiot and the non-idiot?
I'd also note that a big issue now is trust -- trust in "elites" (technocrats, the wealthy, those in positions of power) has been declining for a long time. I think people are not so much seeking out the village idiot as massively discounting "experts".
A list of things that come to mind which have broken trust: the 60s saw hippies who wanted to break the norms of their parents and grandparents; the 70s, the Vietnam War and breaking the gold standard; the 80s, "greed is good" and Iran-Contra; the 90s, tough-on-crime policies and Y2K fears; the 00s, Iraq/Afghanistan, the 9/11 attacks, the governmental data dragnet, Manning/Snowden/Assange; and Covid statements which did not pan out as planned...
People have good reasons to be skeptical of elites, but I think anti-corruption work is more important than trying to silence the idiot.
> You guys gave them a megaphone, how do you expect society to behave?!
Considering most of humanity is... challenged when it comes to thinking critically, this should have been an entirely foreseeable outcome. I agree it's society's fault, but Facebook is part of society. They watched how their tool was being used by these people, and ENHANCED the reach of those messages because it was good for Facebook. Facebook is the microcosm of the object of its blame. Idiocy writ large in recursion.
It's much worse than giving the village idiot a megaphone. Facebook (and most other socials) prioritize content to maximize engagement, and (big surprise) the village idiot maximizes engagement. Facebook is a machine tuned specifically to spread hate and bad ideas because that's what maximizes the time people spend on Facebook.
I thought of a good analogy a while back. Let's say someone walks past you and says "hi" and smiles. Let's say someone else then walks past you and punches you in the face. Which interaction maximizes engagement? Well, that's the interaction and content that social media is going to amplify.
Social media companies are the tobacco companies of technology. They make billions by lobotomizing the body politic.
> The problem is that Facebook is giving the village idiot a megaphone
What's interesting is that before Facebook, the only people who could afford a megaphone were either state-sponsored media or billionaires who owned TV stations and newspapers.
For ordinary citizens, the only way you could be heard was to write a letter to the editor of your local paper. If the state/billionaire/editor didn't like you, your views, or anything really (your skin color, perhaps?), it would simply not get published, period.
With Facebook a lot of gatekeeping simply disappeared. It's interesting to see who has an interest in regulating Facebook and bringing back the "good old days" of media.
I am not sure how it goes for the average person. Myself: I just do not go to places where village idiots tend to accumulate, like FB, or if I do (it's hard for me not to watch YouTube), I just completely ignore all that crap.
Should our society have free speech, or free speech for everyone except idiots?
If you agree with the second formulation, who do you think ought to be in charge of deciding who the idiots are? Surely Mark Zuckerberg would not be your first choice.
Maybe there is a third option: no free speech for anyone, all speech must be moderated for lies and misinformation. Is that what you want? In that case, who gets to decide what is true and what is not? Surely Zuckerberg wouldn't be your first choice for that either, right? And what should happen when Facebook blocks "misinformation" that turns out to actually be truthful?
Those who want Facebook to regulate "misinformation" and gatekeep who (and what) is allowed on the site need to admit that they don't actually believe in free speech -- they believe in limited speech regulated by corporations.
But the blast radius of a Facebook post doesn't have the same reach given the majority of posts go to your explicit network of connections. Unless you're specifically referring to Facebook Groups? But then are we certain it's different from Reddit or other forums?
I think it's not so much Facebook alone but the entire Internet. The connectivity between humans is suddenly increased manyfold, and reaches much wider. Imagine using a graph layout tool on a giant graph with only few localized connections. Likely the picture will have evenly distributed nodes without much movement. But then as you dump all these new edges onto the graph, the nodes start to move into rigid clusters separated by weak boundaries. I think this is what's happening with the red/blue, vax/antivax etc. groups.
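The clustering intuition above can be sketched with a toy force-directed layout. This is purely an illustration of the analogy: the force constants, cluster sizes, and update rule are all made up for the sketch, not taken from any real layout library or from anything Facebook does.

```python
import math
import random

def toy_spring_layout(nodes, edges, steps=300, seed=1):
    """Toy force-directed layout: every pair of nodes repels, edges attract."""
    rng = random.Random(seed)
    pos = {v: [rng.uniform(-1, 1), rng.uniform(-1, 1)] for v in nodes}
    edge_set = {frozenset(e) for e in edges}
    for _ in range(steps):
        disp = {v: [0.0, 0.0] for v in nodes}
        for i, u in enumerate(nodes):
            for v in nodes[i + 1:]:
                dx = pos[u][0] - pos[v][0]
                dy = pos[u][1] - pos[v][1]
                d = max(math.hypot(dx, dy), 0.05)  # floor avoids blow-ups
                f = 0.01 / d                        # repulsion between all pairs
                if frozenset((u, v)) in edge_set:
                    f -= 0.05 * d                   # spring attraction on edges
                disp[u][0] += f * dx / d
                disp[u][1] += f * dy / d
                disp[v][0] -= f * dx / d
                disp[v][1] -= f * dy / d
        for v in nodes:  # capped step size keeps the simulation stable
            mag = max(math.hypot(*disp[v]), 1e-9)
            step = min(mag, 0.05)
            pos[v][0] += disp[v][0] / mag * step
            pos[v][1] += disp[v][1] / mag * step
    return pos

# Two dense communities joined by a single bridge edge.
a = [f"a{i}" for i in range(6)]
b = [f"b{i}" for i in range(6)]
edges = [(u, v) for grp in (a, b) for i, u in enumerate(grp) for v in grp[i + 1:]]
edges.append(("a0", "b0"))
layout = toy_spring_layout(a + b, edges)
```

Run on a sparse graph, the nodes spread out evenly; add dense within-group edges like the above and the two groups settle into tight, well-separated clusters, which is the red/blue picture the comment describes.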
Yeah, I always hear people talking about the great "global village" where everyone is 'connected', but I have to admit I am against it. I don't want to be prank called.
Right. Prior to social media, people were vetted many ways and in every context in which they gained an audience. (e.g. earned standing in social settings and community groups, promotions at work, editors of one sort or another when publishing to a group, etc) Audiences grew incrementally as people earned their audience. Social media removed all that vetting and it inverted the criteria to grow an audience. Sensationalism was rewarded over thoughtfulness. So one of the most important tools we've always relied on to judge information was removed. Hard to believe, as intelligent as these folks at Facebook/Meta are said to be, that they don't understand this. Feels disingenuous.
The problem is that Facebook is giving people earplugs. Biases and minority opinions get clustered together in huge echo chambers, eliminating the averaging influence of broader society.
This has helped both valid and invalid minority opinions be heard.
What wasn’t there was critical thinking on behalf of the people, who were already overwhelmingly exposed to mass political marketing and had developed a pseudo-Asperger response. I will agree for once with the Facebook exec: political philosophy has pretty much come to the conclusion that since there is no unique definition of good or bad, there is no algorithm that can decide it.
The problem is that Gutenberg is giving the village idiot a megaphone. Gutenberg can't say:
- Amplify your commercial business message to billions of people worldwide.
AND at the same time
- Well, it's your individual choice whether or not to listen to the village idiot.
You guys gave them a megaphone, how do you expect society to behave?!
So should there be a special tax on "megaphones" like Twitter, Facebook or YouTube? What exactly is the legal framework under which these companies could be scrutinized? Normally the manufacturer of megaphones does not get sued when a person uses one to promote hatred on a village square.
I wouldn't blame megaphones for the fact that "idiots" use them. Nor would I expect megaphone manufacturers to dictate what messages can be amplified using them. Nor would I expect megaphone retailers to determine somehow whether a person was an "idiot" before selling them a megaphone.
If someone uses a megaphone in an anti-social manner, that's a matter for the police to handle.
Internally, Facebook works aggressively to combat covid misinformation (source: I work at FB). Literally most of the commonly used datasets are about it. It's easy to hate and hard to understand.
Facebook decides what to show people. They could show you your friends' posts in chronological order, and/or let people have control over what they see.
But no, Facebook decides what people see. Therefore they have some responsibility for the spread of misinformation.
And people decide to use Facebook. I am not trying to defend it, but blaming it 100% on Facebook is not fair. Even if their algorithms were perfect to amplify misinformation, there still needs to be enough people reading and sharing content for it to have an effect.
A solution could be paying for Facebook, where both the number of people and incentives would change.
Facebook doesn't really decide what you see, but instead optimizes what you see to maximize your engagement. If you never engage with political content or misinformation, you generally won't see it. Once you start engaging, it will drown out everything. What they could provide is a "no politics" option, but I wonder if anyone would utilize it. There was an old saying in the personalized recommendations world along the lines of "if you want to figure out what someone wants to read, don't ask them because they will lie." For instance, if you ask people what kinds of news they want they will invariably check "science" but in fact they will mostly read celebrity gossip and sports.
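The feedback loop described above is easy to caricature. Here is a toy sketch of an engagement-optimized feed; the per-topic "affinity" score, the learning rate, and the topic names are all invented for illustration and have nothing to do with Facebook's actual ranking system.

```python
# Toy engagement loop: a hypothetical per-topic affinity score is nudged
# toward 1 whenever the user engages, toward 0 when they scroll past,
# and the feed is simply ranked by that score.

def update_affinity(affinity, topic, engaged, lr=0.3):
    """Move the topic's score toward 1 on engagement, toward 0 otherwise."""
    target = 1.0 if engaged else 0.0
    affinity[topic] += lr * (target - affinity[topic])

def rank_feed(posts, affinity):
    """Posts on high-affinity topics float to the top of the feed."""
    return sorted(posts, key=lambda p: affinity.get(p["topic"], 0.0), reverse=True)

affinity = {"politics": 0.1, "cooking": 0.5, "sports": 0.5}

# The user engages with political content a few times and skips the rest...
for _ in range(5):
    update_affinity(affinity, "politics", engaged=True)
for topic in ("cooking", "sports"):
    update_affinity(affinity, topic, engaged=False)

feed = rank_feed(
    [{"topic": "cooking"}, {"topic": "politics"}, {"topic": "sports"}],
    affinity,
)
```

After a handful of engagements, political posts outrank everything else even though the user started out barely interested in them, which is the "once you start engaging, it will drown out everything" dynamic.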
It’s not false that there is a societal problem that is not unique to Facebook.
But that sidesteps the question of what responsibility they have as a company whose profits are, at minimum, powered by that problem, if not exacerbating the problem.
“Privatize the profits, socialize the costs” is not sustainable.
Look at video games, particularly on mobile. I mean they aren't even games anymore. They're just metrics-optimized psychological-trick machines to extract the most money from you $1 at a time, i.e. in-app purchases and pay-to-win. These aren't games: they're engagement bait to bring you and your wallet back each day.
Why do we have this? Because people suck and it just makes way too much money for anyone not to do it. Why didn't we have this 20 years ago? Because the technical capability wasn't there.
It's really no different here. Communication and messaging costs have really gone down to zero. If it wasn't FB, it'd be someone else. There's simply too much money with very little costs in engagement bait, whether or not that's the intent of the platform or product.
And yeah, that's the case because people suck. Most people aren't looking for verifiable information. They're looking for whatever or whoever says whatever it is they've already chosen to believe. That's it.
I'd say the biggest problem with FB and Twitter is sharing links, as this is such an easy way for the lazy, ignorant and stupid to signal their preconceived notions to whatever audience they happen to have. But if Twitter or FB didn't allow sharing links, someone else would, and that someone else would be more popular.
I honestly don't know what the solution to this is.
> Look at video games, [...] I mean they aren't even games anymore. They're just metrics-optimized psychological-trick machines to extract the most money from you $1 at a time [...]
“The NRA say that ‘guns don’t kill people, people kill people’. I think the gun helps. Just standing and shouting BANG! That’s not gonna kill too many people.”
- Eddie Izzard, Dress to Kill (1999)
Homo Sapiens have been murdering each other on this planet for at least 10,000 to 100,000 years, and have only used burning-powder projectile launch tubes for roughly the past 1000 years. (Poison darts launched from a blowgun are a more ancient form of killing tube.)
When convenient projectile launching killing tubes aren't available, Homo Sapiens will rapidly revert to 10,000+ year old murder methods, and thus a husband inflamed with murder-rage who just learned his wife's ovaries have been voluntarily fertilized by another man's semen will not infrequently use punches or a nearby blunt object (hammer or rock) to fracture her skull and destroy her brain function, or use his hands to crush her windpipe, or bleed her out with a knife. This has been happening essentially every year for at least the past 10,000 years. If his wife had been armed with a handheld projectile launching killing tube she could have defended herself, but women frequently don't carry projectile tubes and frequently vote for restricting access to projectile tubes, because projectile tubes are loud and scary and make them feel unsafe.
Izzard is from England which has a high degree of gun control. The murder rate doesn't seem to be strongly correlated to regulation, however. This lends less credence to Izzard's conjecture. Maybe people who may murder will use whatever tool is available or aren't concerned about breaking gun laws?
Title is really poor compared to the content of the article.
In any case, he is right. Look at the pattern: any large social network has these issues, which more or less seems related to how people interact. Twitter is massively toxic; Reddit is. Back in the day, Tumblr, which was not as huge as current social media, also used to host the content Facebook gets blamed for.
Give a platform for people to publish and share and every opinion has the chance to be there.
It also doesn't have to be a massive broadcast platform, messaging platforms with small communities in the form of groups have these issues on a smaller scale. Though broadcast does make it worse.
Facebook chooses what I see while on their platform. If they didn't, I'd just see a chronological feed of my friends' posts that I chose to follow, as they came through, without any external filtering. Going directly to friends' walls shows that is not the case.
Instead, they amplify emotionally based content that they think I will react to (engagement) by studying previous interactions and don't show me things they don't agree with (censorship) even if it originated from an authoritative primary source. That doesn't sound like it originated in society, but more of a purposeful curation imposed on users, who have to conform if they want to stay. I didn't.
Yeah, I don't think someone in that role at FB is particularly qualified to talk about society and human nature in relation to social networking.
It's like listening to someone who builds, designs and optimizes production lines in cigarette factories philosophize about why people smoke and whether it is their free choice to do so.
Facebook execs are not responsible for what people think, but they aren't neutral either.
The combination of incentives ends up generating a situation where their decisions have a huge influence on society:
- Their goal is to make the company profitable, and they choose ads as the business model.
- Without viewers, there are no advertisers. So, engagement is key.
- They need to create incentives to make people both content creators and followers: share your thoughts, share your photos, and show us what you like.
- Content creation is hard and strong opinions attract people (both detractors and followers).
- A long post format doesn't work for casual engagement, and the UI is optimized for a quick scan (because that helps with engagement).
The result is short posts of shitty content with very strong opinions that create an echo chamber. Can they get out of that trap? I don't know. I've seen good quality content in smaller online communities. (for example, while HN is not small, the quality of the comments is usually better than the article itself). But, I'm suspicious that optimizing for profit contradicts content quality. Something similar happens with TV shows. TV networks increased the number of reality shows: they are cheap to produce, generate strong emotions, and have a higher immediate audience than a high-quality TV series. The high-quality TV series came from media companies like HBO or Netflix because they don't care about optimizing minute-to-minute ratings (they care more about exclusives to attract subscribers).
On the one hand people have free will to believe what they want to - and - apparently - Facebook has no influence or responsibility on that.
On the other hand Facebook is entirely in the business of selling influence to change what people believe.
The meta is that this is a piece trying to influence what people believe about Facebook's influence.
I guess that makes this meta about meta being meta about meta.
Outrage against Facebook being too influential is marketing for Facebook adverts. It's a logical PR strategy. There's a perverse incentive to do it for real, and for Facebook to cause actual harm.
It doesn't matter if anyone in Facebook actually believes that (following a perverse incentive is a good idea). All that needs to happen is for the incentives to be aligned that way. Which might literally be the famed "optimising for engagement". https://www.youtube.com/watch?v=hn1VxaMEjRU
Not FB's responsibility, if they set up and tuned an information-spreading system that promotes stuff that's inflammatory over stuff that's informative? Users of FB only see what FB feeds them, and that's all about how FB matches the user's activities and characteristics to the content FB is supplying. What a total cop-out to say the problem is what people say, when FB plays such a crucial role in what people see. Before FB (yes, and other social media) amped this stuff up, a village idiot standing on a corner shouting conspiracy theories got very little attention. But on FB this kind of stuff feeds engagement, and we know how important engagement is.
Yet the guy who's slated to be FB's CTO says: don't put all this inflammatory stuff on me! Freedom of speech, you know, and just let us do our job of promoting engagement and building ever more effective ad-targeting technology!
FB shows people what they want to see. They provided the public a channel to tune into what anyone has to say and it turned out that people sought out the village idiot.
Now the public says show me what I want, but prevent everyone else from seeing what they want.
Mental health has been an issue for as long as we've known, but Facebook does have a curious way of amplifying societal problems such as this and making it worse.
"When you’re young, you look at television and think, There’s a conspiracy. The networks have conspired to dumb us down. But when you get a little older, you realize that’s not true. The networks are in business to give people exactly what they want. That’s a far more depressing thought." - Steve Jobs
Imagine if Facebook, Twitter, YouTube, etc were run like MMORPGs. Imagine them proactively mitigating griefing, bots, brigading, etc.
John Siracusa has been making this point on Accidental Tech Podcast (http://atp.fm): Zuck's envisioned metaverse would also be a toxic hellscape. Because Zuck is ideologically opposed to moderation.
The difference, of course, is that social media sells advertising, whereas MMORPGs sell experiences.
The Fb algorithm favors engagement and controversy to the point that most people may not even see reasonable takes on a given issue. They’re not neutral.
Maybe Facebook's graph could be designed differently so that you have circles of relationships: family, friends, acquaintances, business relationships, interests, and everyone else. By default, the closer a relationship is to the periphery, the less it gets promoted; the smaller the circle, the higher the chance of promotion.
That way idiocy doesn't spread high and far and instead has a limited transmission radius.
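The circles idea above can be sketched in a few lines. The circle names and decay weights here are entirely hypothetical, chosen only to illustrate the limited-transmission-radius effect, not anything Facebook implements.

```python
# Hypothetical promotion weights that decay toward the periphery of the graph.
CIRCLE_WEIGHT = {
    "family": 1.0,
    "friends": 0.6,
    "acquaintances": 0.3,
    "interests": 0.1,
    "everyone_else": 0.02,
}

def expected_reach(followers_by_circle):
    """Expected number of feeds a post lands in, given followers per circle."""
    return sum(
        count * CIRCLE_WEIGHT[circle]
        for circle, count in followers_by_circle.items()
    )

# Even an account with a huge periphery has a bounded blast radius:
reach = expected_reach(
    {"family": 5, "friends": 50, "acquaintances": 500, "everyone_else": 100_000}
)
```

Under these (made-up) weights, the 100,000-strong periphery contributes far less per follower than the close circles, so a post mostly stays near its origin unless the inner circles keep passing it outward.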
The medium is the message, as the saying goes; the technology we interact with shapes what kinds of interaction we have. Blaming 'people' makes no sense because 'people' cannot be changed, unless we genetically reengineer humanity to be more accommodating to Facebook's algorithms, which they would probably prefer to actually fixing the problems of their platform.
>"At some point the onus is, and should be in any meaningful democracy, on the individual"
Viewing systemic issues that are macro at scale through an individualized lens is great for dodging responsibility and for Facebook's bottom line, but it makes about as much sense as thinking that dealing with climate change will be achieved by everyone voluntarily shopping greener on Amazon, rather than by creating systems that are, collectively and at a mechanism level, oriented towards social good.
Facebook is trying to save the internet culture of 20 years ago. A teenage version of me would 100% support what they are trying to do. But mainstream society is not compatible with the internet mentality of back in the day.
s1artibartfast (4 years ago):
Facebook prioritizes what people want to see, and people want to see train wrecks and inflammatory content.
commandlinefan (4 years ago):
One man's terrorist is another man's freedom fighter.
GuB-42 (4 years ago):
Did you just describe arcades?
adolph (4 years ago):
https://www.macrotrends.net/countries/GBR/united-kingdom/mur...
https://en.wikipedia.org/wiki/Firearms_regulation_in_the_Uni...
jensensbutton (4 years ago):
The problem is who's to say which is which.
ricardoplouis (4 years ago):
https://www.wsj.com/articles/facebook-knows-instagram-is-tox...
mensetmanusman (4 years ago):
The losers were ignored by the winners, creating huge gaps in trust.
The winners used to utilize the few media institutes that they controlled to keep a lid on discontent.
Social media has complicated the situation.