Media Unintentionally Boosted Russian Disinformation Tweets: A Cautionary Tale
Many legacy media outlets played an unwitting role in the growth of the four most successful fake Twitter accounts the Russian Internet Research Agency created to spread disinformation during the 2016 US presidential election.
written by Bert Gambini, University at Buffalo
Many legacy media outlets played an unwitting role in the growth of the four most successful fake Twitter accounts the Russian Internet Research Agency created to spread disinformation during the 2016 US presidential election.
In roughly two years beginning in late 2015, these accounts went from obscurity to microcelebrity status, growing from about 100 to more than 100,000 followers. With a large, heavily engaged follower base ready to spread the word, the Internet Research Agency (IRA) could strategically deploy messages whose visible engagement metrics created an illusion of authority and authenticity, one that often escaped the scrutiny of casual consumers and professional journalists alike.
Turning Heads May Boost Growth of Fake Accounts
The frantic retweets, by what the study showed to be extreme ideological enclaves, certainly fueled the accumulation of followers. But Yini Zhang, an assistant professor of communication at the University at Buffalo, says that mainstream and hyperpartisan news media also significantly amplified IRA messaging, and contributed to that follower growth, by unknowingly embedding IRA tweets in their content.
Zhang says there was an ideological asymmetry to the study’s results. Of the four puppet accounts in the study, @TEN_GOP and @Pamela_Moore13 posed as conservative trolls, while @Crystal1Johnson and @glod_up imitated liberals.
“We did not observe the same effect on the liberal and conservative accounts,” she says. “The two conservative accounts received a huge boost from mainstream media and hyper-conservative media quoting tweets in their news stories, but we did not see mainstream media and hyper-progressive media doing the same thing for the two liberal accounts.”
The findings, which appear in the Journal of Communication, reveal how large social media followings can depend on a combination of the dynamics within a particular platform and the news media’s treatment of the messages emerging from it. The evidence provides insight into the ecology of the 21st-century political communication environment, suggesting that people’s tendency to seek out and engage with pro-attitudinal information, together with the media’s drive for audience attention, can work in favor of successful political disinformation actors.
In this case, constructive attempts to provide new information by integrating digital and legacy content ironically resulted in the unintended spread of disinformation, which Zhang defines as fabricated information that’s intended to cause harm in ways that benefit its agents.
“Examining how and why these accounts grew so quickly and to such astounding proportions allows us to understand the mechanisms of influence accrual in the digital era,” says Zhang, the study’s corresponding author and an expert in social media and political communication. “None of this was intentional. It’s about operational realities.
“But with this knowledge, we can begin to address and curtail the problem of disinformation.”
The research team started with a list of 2,700 puppet accounts released in 2017 by the House Intelligence Committee, which received the information from Twitter. From that group, the researchers identified the four most retweeted English-language accounts: two conservative and two liberal. They collected data from Twitter about the tweets and retweets of the IRA accounts, then searched more than 200 media outlets across the ideological spectrum to determine where the uptake of IRA tweets was occurring.
“Strong social media posts can validate content,” says Zhang. “But in their effort to turn heads, these legacy outlets were contributing to the growth of Russian sock puppet accounts.”
The process of incorporating digital content into mainstream media makes sense, but it requires careful consideration, according to Zhang.
“Social media content looks very attractive given the cost cutting realities in mainstream media and lost advertising revenue,” says Zhang. “But it also demonstrates a vulnerability within the current media economy.
“Turning heads might also mean unintentionally contributing to the growth of fake accounts, which should be subject to the same questions of credibility as any other news source: Is this account actually what it claims to be?”
You are free to share this article under the Attribution 4.0 International license.
Originally published on Futurity.org, research by University at Buffalo
Original Study DOI: 10.1093/joc/jqaa042