(no title)
twright0 | 2 years ago
Facebook appears to have disagreed with that; they amplified calls for ethnic cleansing and did not respond to concerns about it, so they must have believed that asking them not to was too much. That's the point.
roenxi | 2 years ago
Bad people use Facebook. We don't need evidence to know that. This article is strong evidence that very bad people use Facebook, but it isn't at all clear that Facebook should be considered morally involved based on what has been presented so far.
Maybe the killing blow is yet to come. But I'm pretty sure any objective standard that gets Facebook in trouble here will get them in just as much trouble for letting Victoria Nuland or US 4-star generals post publicly. There are a lot of brutes in public office.
Furthermore, getting involved in matters of war and peace is not a role that Facebook will get praise for; it'll do some really terrible things if it goes down that path. They should be biased towards inaction. Even and especially if they care.
dotandgtfo | 2 years ago
Facebook de facto became the internet in a country of ~50 million people by subsidising free data access to their platform.
Their platform was developed in order to further their own goals - through maximising engagement and monetisation.
The second-order effect of their own ambition was enabling people like Wirathu to reach hundreds of thousands of people with hate speech and calls for genocide.
Facebook were informed of this multiple times and, allegedly, did nothing about it. During this time they had one Burmese-speaking moderator.
Stating that they have no moral responsibility for the consequences of their actions is in my opinion horseshit. But it does align with certain aspects of the current American zeitgeist of entrepreneurship, free speech and platform "safe harbour" regulations.
This is not a view shared everywhere and should not be assumed when American tech companies scale out of the US. Thankfully this dogmatic approach is being regulated by the likes of the EU and other countries so these platforms are more aligned with their own moral frameworks.
Personally, I find Facebook absolutely morally responsible for parts of this, through the simple fact that they provided a platform for tens of millions of people - with severely lacking moderation - all in the chase of growth and profits.
This isn't exporting "freedom and democracy" to the world like the good old days. This is abhorrent profit maximisation with no regard for the consequences of their actions, hidden behind a thin veneer of moral rationalization.
beebeepka | 2 years ago
Stuff like this does not happen by accident, nor in a vacuum. The only reason we don't hear more about this is that important people don't want us to.
Don't play the naivety card in topics like this.
Edit: downvote all you want, you horrible apologists. FB is a weapon and you know it.