The wrong question is being asked about Russia’s influence operations
Special Counsel Robert Mueller’s indictment of thirteen Russians and three Russian entities for interfering in the 2016 presidential election reignited debate over whether Russia’s main goal was to put Donald Trump in the White House.
Most of the coverage of Russian meddling involves their attempt to effect the outcome of the 2016 US election. I have seen all of the Russian ads and I can say very definitively that swaying the election was *NOT* the main goal.
— Rob Goldman (@robjective) February 17, 2018
Whether Russia’s efforts were centered on one candidate is the wrong question. The more immediate and important challenge is to understand how the Russian influence operation worked, and how to stop it from working again.
Much of the Russian activity during 2016 was election-focused. Thanks to many researchers, we know how Russian social media accounts and propaganda outlets boosted Trump, systematically attacked Hillary Clinton, and promoted Bernie Sanders.
The operation used a full-spectrum approach. Russian cyber hackers leaked Clinton campaign emails; Russian propaganda outlets amplified the leaks; Russian trolls amplified the propaganda outlets. Fake troll accounts engaged with real Americans in a grassroots campaign to organize pro-Trump rallies in Florida and other states.
By September 2016, the Kremlin’s pro-Trump bias was obvious. In October, Clinton campaign emails stolen by Russian hackers began leaking on a daily basis. On election day, troll accounts amplified claims of vote-rigging in Clinton’s favor — apparently to de-legitimize an expected Clinton victory.
Yet the Russian campaign was always bigger than the election. One day after Trump’s shock victory, a Facebook page run by the so-called “troll farm” in St. Petersburg, Russia, invited New Yorkers to protest against him. According to the Mueller indictment, a separate ad called for a pro-Trump rally at the same time and place.
It was not the first time the troll farm tried to mobilize Americans against one another, and it would not be the last. In May 2016, one troll farm Facebook page, “Heart of Texas,” called for a protest against an Islamic center in Houston. Another, “United Muslims of America,” called for a counter-protest.
According to images of the two events, several dozen people attended — Americans fooled by Russian operators 5,000 miles away.
Few troll posts had such an impact on the ground, but many pushed both sides of inflammatory questions. Accounts run from the troll farm praised Muslim women who wear the hijab, and accused Muslims of wanting to stone any woman who did not. They argued for, and against, LGBT groups; they posted about the Dakota Access pipeline and the Charlottesville riots; they took both sides in the “take a knee” controversy.
From the start, Russia’s operation was about widening the divisions between Americans: supporters of Trump and Bernie Sanders, LGBT groups and religious conservatives, minority and anti-immigrant groups, the police and the Black Lives Matter movement.
Other troll accounts fooled American communities so completely that those communities complained when the accounts were finally shut down. The “alt-right” proved particularly vulnerable: far-right accounts such as @TEN_GOP and @Jenn_Abrams amassed tens of thousands of followers and were retweeted by leading figures on the American far right, even members of the Trump campaign.
Some alt-right users continue to label the investigation into Russia’s actions as “fake news” and a “witch hunt.” This is a disturbing echo of the Russian government’s own talking points, and marks a potential channel for further Russian disinformation aimed against America. Again, this is not to say the echo is commanded or coordinated, but the loose alignment between messaging remains a vulnerability that drives Americans further apart.
Accounts which espoused Black Lives Matter were also effective. One such account, @Crystal1Johnson, amassed 56,000 followers and was twice retweeted by Twitter co-founder Jack Dorsey. According to a leaked list of troll farm accounts, the “Blacktivist” Facebook page boasted 388,000 followers; “Black Matters” counted 225,000. The troll farm’s “United Muslims of America” page numbered 330,000 followers.
Those accounts remained active throughout 2017. It was not until the fall, nine months after the election, that they were exposed and shut down.
The troll farm was a classic infiltration and agitation operation. Its accounts masqueraded as genuine Americans, used hyper-partisan rhetoric to win acceptance in the target communities, and then posed as representatives of those communities to engage with a broader public.
One account alone, @Jenn_Abrams, masqueraded as an American so successfully that it was cited by Fox News, BuzzFeed, USA Today, CNN, Sky, the Huffington Post, Daily Telegraph, New York Times, and Washington Post.
Yet not all American communities fell for Russia’s operation; the most rancorous and embittered were the most vulnerable. The greatest vulnerability of all was the hyper-partisan divide, which provided the fertile soil in which the troll farm’s seeds took root.
There is no point arguing over whether the Kremlin’s main goal was to elect Trump or divide Americans: its operation tried both. It is far more important to address the vulnerabilities which the campaign exposed — above all, the corrosive atmosphere of tribal enmity, which the Russian operators weaponized so skillfully.
Addressing that vulnerability requires political leadership: the courage to abandon the rhetoric of demonization and division, reach across the partisan divide, and accept that the greatest threat to American democracy comes from abroad, not the other party.
Ben Nimmo is Senior Fellow for Information Defense at the Atlantic Council’s Digital Forensic Research Lab (@DFRLab).