Written by Martin Russell.

Discussions about Kremlin interference in the 2016 US presidential election initially focused on Russian hackers and leaked e-mails. However, US congressional inquiries have highlighted the important role that Russian social media activity played in influencing public opinion.

The Kremlin’s troll factory


Kremlin media, such as Sputnik and RT, weighed in on the election debate, showing a clear pro-Trump/anti-Clinton bias. However, most Russian activity was covert, conducted through social media accounts purporting to belong to US citizens, news portals and organisations but in fact operated by Russian trolls. According to statements by social media companies to US Congress inquiries into the subject, there were 50 258 Russian-linked automated Twitter accounts, which generated 2.1 million tweets; Facebook found 470 accounts (22 of them matching Twitter accounts directly), which produced a total of 80 000 posts; and on YouTube, Google identified 18 suspect channels, which between them uploaded 43 hours of political videos.

Social media companies have traced many of these accounts to a shadowy organisation in St Petersburg, set up in 2013 as the Internet Research Agency (IRA) and dubbed by the press the Kremlin’s ‘troll factory’. Officially closed in 2016, it remains active and in late 2017 moved to bigger offices. In 2015, the agency was reported to have a staff of 800-900, of whom as many as 90 were assigned to cover the US presidential election. Although agency head and Kremlin caterer Yevgeny Prigozhin is known as Putin’s ‘personal chef’, the organisation has no official links to the Kremlin, which continues to deny any electoral meddling.

Stirring up controversy through provocative posts and street protests

Tweets from Russia-linked Twitter accounts included re-tweets of posts by Donald Trump (470 000 times), as well as re-tweets of posts from Wikileaks and related accounts about leaked e-mails from the Clinton campaign (197 000). Nevertheless, most tweets were not directly linked to the election campaign, but promoted right-wing views on race relations, Muslim extremism, migration and gun control that were likely to appeal to Trump supporters. Among the most successful troll accounts on Twitter were those of Trump supporter Jenna Abrams (over 70 000 followers) and of Islamophobe SouthLoneStar (17 000 followers), self-described as a ‘proud Texan and American patriot’. Kremlin trolls also set up numerous discussion groups on Facebook, Twitter and Instagram, such as Heart of Texas, Being Patriotic and Army of Jesus.

However, some trolls also took the opposite side of the debate, in groups such as Blacktivist, BlackMattersUS, LGBT United and United Muslims of America. The aim, in the words of US Senator Richard Burr, appears to have been ‘to foment conflict… and tear apart our society’. Presumably, the calculation was that a more heated debate would favour Donald Trump, the more outspoken candidate.

Kremlin trolls brought conflict not only to social media but also to the streets, by orchestrating various protests. In May 2016, an anti-Muslim rally in Houston organised by Heart of Texas clashed with a counter-protest led by United Muslims of America. Two months later, a Blue Lives Matter rally honoured officers killed by a black protestor, on the same day as a gathering commemorating a black man shot dead by police. Being Patriotic attempted to organise pro-Trump rallies in 17 cities on 20 August 2016; how many of these actually went ahead is not clear. More successful was a November 2016 BlackMattersUS anti-Trump rally, staged in New York a few days after Donald Trump’s election victory and attended by thousands. These events were mostly organised from Russia without any local presence; for some of its rallies, BlackMattersUS posed as a black rights movement in order to enlist the support of American activists.

Amplifying the message

On Twitter, internet analysts have described a three-step propagation technique in which messages are launched by fake ‘shepherd’ accounts purporting to come from influential organisations and individuals. One example was @TEN_GOP, claiming to represent the Republican Party in Tennessee. In the second step of the process, ‘sheepdog’ troll accounts re-tweet messages, adding content of their own; finally, these in turn are disseminated by thousands of automated ‘bot’ accounts. As tweets spread across the internet, they attract interest from genuine users, acquiring a momentum of their own and eventually going viral.
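
To make the structure of this cascade easier to picture, the minimal Python sketch below simulates the three tiers described above. It is purely illustrative: the account names, tier sizes and retweet behaviour are hypothetical assumptions, not figures drawn from the congressional data.

```python
# Illustrative simulation of the 'shepherd -> sheepdog -> bot' cascade described
# above. All account names, counts and behaviours are hypothetical.
import random


def run_cascade(seed_text: str, n_sheepdogs: int = 20, n_bots: int = 2000) -> int:
    """Return the total number of copies of the seed message in circulation."""
    # Step 1: a fake 'shepherd' account posing as an influential organisation
    # launches the original message.
    messages = [("@FAKE_STATE_PARTY", seed_text)]

    # Step 2: 'sheepdog' troll accounts re-tweet it, each adding content of its own.
    messages += [
        (f"@troll_{i}", f"{seed_text} (you won't believe this!) #{i}")
        for i in range(n_sheepdogs)
    ]

    # Step 3: automated 'bot' accounts pick any existing copy at random and
    # re-tweet it verbatim, inflating apparent engagement until genuine users
    # start to notice and share the message themselves.
    for j in range(n_bots):
        _author, text = random.choice(messages)
        messages.append((f"@bot_{j}", text))

    return len(messages)


if __name__ == "__main__":
    total = run_cascade("Breaking: shocking claims about the election")
    print(f"Copies of the seed message in circulation: {total}")
```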

The use of social media advertising

On Facebook, Kremlin trolls paid a total of US$100 000 for 3 000 of their posts to appear in users’ news feeds as ‘sponsored content’. One advertisement featured a picture that users were invited to ‘like’ in order to help Jesus defeat Satan (and with him, Hillary Clinton) in an arm-wrestling match. Other examples include memes promoted as advertisements (one showing Texas Rangers waiting to intercept illegal immigrants, captioned ‘Always guided by God’) and over 1 000 YouTube videos. Facebook advertising of this kind is highly effective, as it can be targeted at a particular audience based on user data. Facebook admits that one quarter of Russian ads were geographically tailored. Corporate insiders claim that swing states such as Michigan and Wisconsin, both won by Donald Trump by just a few thousand votes, were among the targeted locations.

What is the impact of Russian social media activity?

After initial scepticism, social media companies have woken up to the scale of the problem. Data released by them show that the most successful troll groups, such as Blacktivist (388 000 followers) on Facebook and Heart of Texas (254 000) on Twitter, reached substantial audiences. @TEN_GOP had 115 000 followers and attracted comments from high-profile individuals such as former National Security Advisor Michael Flynn.

On Facebook, trolls reached 126 million users between January 2015 and August 2017: 11.4 million people as a result of seeing advertisements, 29 million people from their news feeds, and the remaining 88 million from shared posts. On Twitter, election-related tweets from Russia reached users’ news feeds 455 million times. Some 1.4 million Twitter users interacted with such tweets, for example by re-tweeting, quoting or liking them.

Although these are very large numbers, their implications should not be exaggerated. On Facebook, Russia-produced messages represented only 0.004 % of users’ news feeds; most users are likely to have simply overlooked them. On Twitter, Russian accounts represented only 1 % of election-related tweets.

Countering troll activity

Social media companies’ response

US senators have criticised social media companies for not taking the problem seriously enough. For example, Facebook accepted payments in roubles from purportedly American groups registered at Russian addresses or with Russian phone numbers, even though the company’s own rules require advertisers to be authentic. Twitter did not close down the popular @TEN_GOP account until 11 months after it was exposed as fake.

However, social media companies are finally acting, in cooperation with the FBI’s Foreign Influence Task Force, set up in 2017. As well as closing down suspect accounts, Facebook has committed to recruiting 1 000 extra staff for its advertising review department, and it has changed the ‘trending topics’ algorithm, making it harder for fake news to find its way into news feeds. The company does not block disputed content, as it is unwilling to act as an ‘arbiter of the truth’; however, since December 2017, stories that have been challenged as fake are displayed alongside alternative versions of the facts in the ‘Related Articles’ section. Facebook has also cut the number of news stories that users see in their feeds. For its part, Twitter has promised to make election-related advertising recognisable to users, and to publish data on advertisers and targeted groups.

US administration/Congress response

President Donald Trump remains defiantly dismissive of the role played by Kremlin trolls. However, the US Congress is taking the problem more seriously: there are no fewer than three separate ongoing investigations into Russian meddling in the 2016 US presidential election, by the Senate and House Intelligence Committees and the Senate Judiciary Committee. In late 2017, each of the three inquiries held hearings focusing on social media aspects, at which Google, Facebook and Twitter representatives were questioned.

The US Congress is debating a bipartisan Honest Ads Act, which would apply the same transparency standards to political advertising on social media as already apply to broadcast and print media. However, its passage is not guaranteed, not least due to possible resistance to restrictions on electoral campaigns.

Even if Kremlin trolls did not decisively influence the outcome of the 2016 US elections, there is no reason for complacency: Russian trolls are accused of being behind a successful January 2018 Twitter campaign calling for the release of a memo intended to discredit the ongoing investigation into collusion between Russia and the Trump campaign, while CIA Director Mike Pompeo is warning of Russian interference in the 2018 elections.
