To answer the question at the end of the post: probably none.
As others have said, it's incredibly unlikely that a tweet is going to change your mind about anything.
Having worked both in and out of the journalism industry, I know that journalists get a lot more worked up about what people say online than everyone else does. They think it matters a lot more than it does (probably why they got into the job of saying things online in the first place). Twitter is completely irrelevant to most people. They're vaguely aware it exists, but it really doesn't matter.
In TFA, you can see each tweet getting a few hundred to a few thousand likes/retweets, and most of those were identified by the analysts as bots. So each tweet is maybe reaching an audience of a couple of hundred humans, at best (and it'll be the same few hundred humans for all such tweets). How many of those humans are going to change their vote because of that tweet?
It's irrelevant.
But it is dangerous. Politicians get very nervous about elections. If the politicians are persuaded that any of this matters, then they'll be more inclined to stop it mattering. And that means laws that curb free speech online, monitor communication, prevent encryption, and all the rest of the shitstorm we're facing.
I think you might be underestimating the power of seeing some very subtle opinions over and over online.
In my country I've been seeing a constant campaign of subtle disinformation against opposition figures that is very powerful. You check the "user's" Twitter profile and they'll have a bunch of pro-opposition opinions on it. Then, the "user" will start to express disillusionment about certain people, asking them not to betray "our side" etc. By the end, it's a full blown attack against "traitors".
This kind of stuff is very subtle and powerful. I've seen it over and over and people will get dragged into believing this stuff and it's very damaging.
It's almost stuff straight out of the movie "Inception": you just need to plant that "seed" of doubt and it will grow in a lot of people. It might just be the nature of our political climate, though, so it might not work everywhere.
You are right that normal people don't read Twitter.
But Twitter users read Twitter. And there are many of them. And they amplify what each other says, conspiracy theories, telephone game, and intentional disinformation alike.
And… eventually some of that disinformation reaches critical mass and graduates from the disinformation Petri dish of twitter.com and releases spores into the mindspace of journalists, "journalists", and normal people who happen to be on Twitter sometimes.
And then, having evolved into a truth-resistant strain, that disinformation spreads through Facebook, InfoWars, well-meaning television news outlets, and television news outlets owned by Sinclair Broadcast Group, into memorable one-liners that reinforce one's own beliefs just enough to avoid application of even the hint of rational thought, which get tucked away in that special space of the mind used to store those juicy "gotchas" you're saving up for next Thanksgiving.
And come Thanksgiving, you use them – take that, Hillary-loving millennial nephew! And, you find out they're wrong, because of course they're wrong, they're ridiculous – why would Hillary Clinton be performing satanic rituals underneath a pizza place? – but it doesn't matter. You've spent the past year reinforcing your beliefs because you half-remembered that you heard somewhere a vague description that Hillary did this awful thing, and you told your friends and it reinforced their beliefs, because even if it made no sense to you, it was vague enough to make sense to them, and, well, she probably did something else terrible anyway, and in fact, your buddy just told you about how Wall Street is paying her to get rid of US borders. Sounds about right!
This is the opposite of how reality works for a big chunk of the population. People reading the same lies multiple times (read "tweets", or any constant influx of information) is how we got anti-vaxxers and flat-earthers, and it's the reason politicians so often spend money on ads with their names only (no promises, no slogans, just their name, but it works).
> As others have said, it's incredibly unlikely that a tweet is going to change your mind about anything.
Yes, but multiply that by the number of targets, then by the number of messages, and then by exposures and you'll see that very small chance turn into a massive tide. That's how advertising works.
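The multiplication argument can be sketched numerically. The figures below are entirely hypothetical, purely for illustration: even a tiny per-exposure persuasion probability compounds across repeated exposures and a large audience.

```python
# Hypothetical numbers: each exposure has a tiny chance of shifting one mind.
p_per_exposure = 0.0001      # 0.01% chance one message changes one view
exposures_per_person = 50    # repeated messaging over a campaign
audience = 1_000_000         # people reached at least once

# Probability a given person is swayed by at least one of their exposures
p_person = 1 - (1 - p_per_exposure) ** exposures_per_person

# Expected number of minds changed across the whole audience
expected_swayed = audience * p_person

print(f"per-person probability: {p_person:.4%}")
print(f"expected minds changed: {expected_swayed:.0f}")
```

With these made-up inputs the per-person chance stays well under 1%, yet the expected count of swayed people runs into the thousands, which is exactly the "small chance times massive reach" dynamic of advertising.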
We have pretty significant evidence over the last decade indicating otherwise. The RAND Corporation did an entire paper about it.
>If the politicians are persuaded that any of this matters, then they'll be more inclined to stop it mattering. And that means laws that curb free speech online
Pretty sure misinforming the voting public matters? Uneditorialized journalism and publishing created this issue. Don't try to pretend any law addressing this problem is somehow constraining the free speech of the public.
While I don't want to imply disinformation will always sway an election, I think you may be drastically underestimating the number of people a disinformation campaign needs to reach.
Many of the US 2018 congressional races were decided by an incredibly small number of voters: a tremendous number were decided by less than 5%, and quite a few by margins of a fraction of a percent.
I'm literally in a car leaving for the airport so I can't verify the exact numbers, but I'm confident I'm close enough[1]: Kansas' 2nd district was decided by like .87%; Florida's US Senate Rick Scott won by .12%; Georgia 7th was .15%; Minnesota 1st was for sure .45%; Illinois' 13th district was something like .77%; New York's 27th district was like .28%. In 2016 presidential race, New Hampshire was decided by like 1500 votes. That's just a sample of the extremely close races across the country.
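To translate percentage margins like those into absolute voter counts, here is a quick back-of-the-envelope calculation. The turnout figures are made-up round numbers, not the actual results for any of the races above.

```python
# Hypothetical round-number turnouts, just to convert a percentage
# margin into the count of voters who would need to switch sides.
races = {
    "House district (margin 0.87%)": (0.0087, 250_000),
    "Statewide race (margin 0.12%)": (0.0012, 8_000_000),
}

results = {}
for name, (margin, turnout) in races.items():
    vote_gap = margin * turnout
    # A voter who switches sides moves the gap by 2 votes,
    # so half the gap in switched voters flips the result.
    switchers_needed = vote_gap / 2
    results[name] = switchers_needed
    print(f"{name}: gap of about {vote_gap:,.0f} votes; "
          f"roughly {switchers_needed:,.0f} switched voters flips it")
```

Under these illustrative assumptions, a sub-1% margin in a mid-sized district comes down to persuading on the order of a thousand people, which is well within the reach of a targeted campaign.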
Now, while those races might seem concerning, it gets even more concerning when we consider smaller districts, county elections, and city elections where it isn't uncommon for seats to be decided by margins of 25 people or less.
The worrying thing about disinformation campaigns isn't that the disinformation is going to suddenly convert 90% of our population into propaganda vessels, it's our modern ability to target smaller and smaller demographics in key locations with information in ways that we previously were unable to do cost-effectively. The concern (and it is a concern) is that the disinformation, and often intentional outright lies, will misinform just a tiny fraction of a percent of vulnerable, logic-challenged people in key areas, where the ripple effects punch above their weight.
And also of course how easy it is to swing local county and city elections, considering that a rather high ratio of local voters tend to be elderly people who may not have their skeptical hats on when the magic box tells them the opposing party is eating babies or whatever.
If you've spent any time having long casual conversations with many of our voters, you'd find that many of these people have been convinced of truly outlandish things; they want to believe fantastic explanations rather than what is usually a mundane one. We don't have to look far to find outlandish ideas floating around our population--just consider anti-vax and how it spread mostly organically; it was not a highly funded and highly targeted campaign.
Our people can make rational decisions and vote for their interests, provided they are given solid information and are not exposed to outright disinformation. It doesn't matter which direction people vote, as long as their decisions are not based on disinformation spread intentionally.
[1] I'll verify those numbers once I'm settled in, but I am confident they're within a reasonable accuracy for this topic. If you're still not convinced of the larger implications here, I'd be happy to list more. There were a whole shitload of key counties where elections were decided by margins of less than 10 people and quite a few were less than 5.
If, as you suggest, opinions expressed online don't have the power to change anyone's mind, what does it matter when their free expression is curbed by new laws?
It’s useful to know details such as these, but the bigger problem is that the audience that most “enjoys” and retweets this stuff honestly doesn’t care that it is false.
I personally know people that, when provided clear evidence of entirely false and manipulative stories will justify them by saying, “well, this story may not be true, but there are a lot of stories like this that you haven’t seen which are true”.
The point is, they believe what they want to believe, and no amount of proof to the contrary will change their mind. In fact, they often just double down on their arguments.
Sure, that's normal human pleasure from having their beliefs reinforced. It's probably an even bigger problem for vague, unfalsifiable ideas like "police are racist", "capitalism is evil" or "immigration is bad". If a news story appears to support such a belief, the person will have their belief reinforced, even if it's wrong and the (true) news story is an exception. One photo of an immigrant vandalizing a statue or one photo of a border guard making a child cry says "See? You were right! That whole political party is bad!".
> but the bigger problem is that the audience that most “enjoys” and retweets this stuff honestly doesn’t care that it is false. [...] The point is, they believe what they want to believe, and no amount of proof to the contrary will change their mind
All people are like that, just each for different topics. I've heard so many people say "there are not enough resources on Earth for everyone to have a mobile phone" or the like, which is obviously false. Don't get me started on all the beliefs of the left and the right in the refugee crisis in Germany. Or take "the free market solves it all" from many liberals, "capitalism is evil" from young contrarians, and whatever conservative thoughts from older folks.
My point is, the problem of blind belief is not specific to the audience of those tweets; it seems to be human nature.
It feels like 2016 all over again, with (some) people desperate for an explanation for why things happened the way they did which doesn't involve people holding ideological differences from what's found among the elite.
That's interesting to analyze and everything, but it's surely inconsequential compared to what real people say, with all their misinformation, rhetoric and general bias. What's the difference between a popular influencer with a public persona spreading politically biased stories and a faceless account doing the same thing?
It sounds like they're looking for someone to blame for right wing candidates' success in the same way people blamed Russian bots for Trump's success because they can't believe that real people could possibly be voting for such obviously "wrong" candidates of their own free will and they must surely have been fooled into it by clever tricksters.
Didn't the FBI and the Mueller report pretty thoroughly lay out the massive scope of Russian social media influence on the 2016 election and beyond? I thought this was a decided issue but maybe I'm wrong.
> It sounds like they're looking for someone to blame for right wing candidates' success in the same way people blamed Russian bots for Trump's success because they can't believe that real people could possibly be voting for such obviously "wrong" candidates of their own free will and they must surely have been fooled into it by clever tricksters.
Since the media consumed by EU voters is so disjoint because of language barriers, this should be somewhat possible to test. If one could show a correlation between the amount of "trickery" present in media consumed by voters in a country and that country's election outcomes, you could get an answer to whether voters really are fooled into voting for right wing candidates.
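A crude version of that cross-country test could be sketched as a simple correlation. The numbers below are entirely invented placeholder data; a real study would need measured disinformation exposure per country and controls for confounders.

```python
# Placeholder data: per-country "trickery" exposure score vs.
# right-wing vote share (%). Purely illustrative, not real measurements.
exposure = [1.2, 3.5, 2.1, 4.8, 0.9, 3.0]
vote_share = [12.0, 24.0, 15.0, 31.0, 10.0, 22.0]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

r = pearson(exposure, vote_share)
print(f"Pearson r = {r:.3f}")
```

Of course, even a strong correlation across countries wouldn't prove causation; it would only make the "voters are being fooled" hypothesis worth testing more rigorously.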
The difference is that those faceless accounts are in many occasions coming from foreign entities. Some of whom have no motivation other than to sow discord and cause chaos.
And nobody is attributing right wing success purely to bots. That's just a strawman that you invented. The issue is that it is a significant factor and is causing polarisation and distrust in institutions. And you need those institutions to work in order to solve the hardest problems, e.g. mass migration, ageing populations, climate change, etc.
The racist prejudice displayed in this article is nauseating. It seems users from certain Asian countries showing interest in European politics automatically become members of, and proof of, a disinformation campaign. Shame!
Interesting how bad Twitter is at banning fake accounts. However, 200 Russian bot accounts are not responsible for the populist shift in European politics, which is what a lot of people want to believe.