#TrollTracker: Russian Traces in Facebook Takedown

Part Three — How the trolls tried to cover their tracks

On July 31, Facebook announced the removal of 32 pages and accounts from its platform for coordinated inauthentic behavior.

Facebook first shared eight pages with @DFRLab 24 hours before the takedown, and our initial findings were published within that timeframe.

The pattern of behavior by the accounts and pages in question makes one thing abundantly clear: they sought to promote division and set Americans against one another. Their approach, tactics, language, and content were, in some instances, very similar to those of accounts run by the Russian “troll farm,” the Internet Research Agency, between 2014 and 2017.

The malign influence operation showed increasing sophistication. Three aspects of our follow-up to the initial findings stand out: converting online engagement into real-world action, shifting tactics to cover tracks, and cross-posting of content from bad actors across different platforms and accounts.

@DFRLab intends to make every aspect of our research broadly available. The effort is part of our #ElectionWatch work and a broader initiative to provide independent and credible research about the role of social media in elections, as well as democracy more generally.

This post investigates how the troll accounts tried to hide their origins, and compares their behavior with that of the earlier troll farm and of other known trolls and malicious actors online.

Those masking techniques included technical measures, such as the systematic use of virtual private networks (VPNs); real-world measures, such as presenting excuses for not showing up to rallies which they had organized; and adjustments to content, such as a very low proportion of original, authored posts.


Technical Measures

Facebook’s statement on the removal of the “inauthentic” accounts pointed out the technical steps the accounts had taken to hide their origins.

It’s clear that whoever set up these accounts went to much greater lengths to obscure their true identities than the Russian-based Internet Research Agency (IRA) has in the past. (…) Security is not something that’s ever done. We face determined, well-funded adversaries who will never give up and are constantly changing tactics. It’s an arms race and we need to constantly improve too.

(…)

These bad actors have been more careful to cover their tracks, in part due to the actions we’ve taken to prevent abuse over the past year. For example they used VPNs and internet phone services, and paid third parties to run ads on their behalf.

The company also hinted at a further measure: separating different accounts and pages from one another, to make it harder for researchers to track from one to the next. This tactic was revealed by the one time it apparently failed, when a known Russian troll account was used to administer one of the pages, then withdrawn just seven minutes later — suggesting that an operator had realized their error and hurried to undo it.

The mistake was enough to lead Facebook to the cluster of accounts.

One of the IRA accounts we disabled in 2017 shared a Facebook Event hosted by the “Resisters” Page. This Page also previously had an IRA account as one of its admins for only seven minutes. These discoveries helped us uncover the other inauthentic accounts we disabled today.

These steps were a definite improvement on the earlier behavior of the troll farm. One of the clearest indicators that it attempted to interfere in the 2016 U.S. election was the fact that some of its accounts purchased political ads on American issues which were paid for in Russian roubles.

Excerpt from the U.S. indictment of the troll farm’s operatives, detailing the purchase of political ads using Russian accounts and credit cards. (Source: U.S. Department of Justice)

The earlier troll efforts also used the same accounts to open different, apparently unrelated pages, and ran multiple accounts from the same computers.

Excerpt from the U.S. indictment of conspirators behind the hacking of Democratic Party emails in 2016, showing the crossover between accounts and computers. (Source: U.S. Department of Justice)

Another giveaway was that some troll-farm accounts were registered using Russian phone numbers. One of the most notorious, Twitter troll @TEN_GOP, was exposed in this way in 2017, when online researcher @AltCyberCommand attempted to log into the account, and captured a screenshot of the verification message from Twitter showing that @TEN_GOP’s recovery phone number began with Russian dialing code +7.

https://twitter.com/AltCyberCommand/status/858961467793526784

The use of virtual private networks (VPNs), services which route a user’s traffic through a remote server and thereby conceal the original internet (IP) address, was not new. According to the U.S. Department of Justice, which indicted leading members of the troll farm in February 2018, the perpetrators:

(…) purchased space on computer servers located inside the United States in order to set up virtual private networks (“VPNs”). Defendants and their co-conspirators connected from Russia to the U.S.-based infrastructure by way of these VPNs and conducted activity inside the United States — including accessing online social media accounts, opening new accounts, and communicating with real U.S. persons — while masking the Russian origin and control of the activity.

The latest accounts which Facebook took down appear to have been more systematic in their use of VPNs, according to the platform’s statement.

While IP addresses are easy to spoof, the IRA accounts we disabled last year sometimes used Russian IP addresses. We haven’t seen those here.

This increased operational security should not be seen as especially devious or advanced. As @DFRLab previously reported, American trolls also recommend using VPNs and internet phone numbers (especially Google Voice) to circumvent Twitter bans and set up new accounts.

Conversation on Gab between @Microchip and @Slane on the utility of Google Voice, Twilio, and Skype numbers. Archived on March 24 and April 6, 2018. (Source: Gab / Microchip)

The greater security, compared with the troll farm’s earlier efforts, suggested that the latest operatives had learned from the troll farm’s mistakes, whether as members of its own team or as unrelated actors who learned by watching it.

It also highlights how amateurish the original troll farm was.

Finally, it shows how blind the social media platforms were to the evidence of foreign election interference in 2016 and 2017 — and, by contrast, how much more aware they are of the dangers now.

The Dog Ate My Demonstration

Both the original troll farm and the latest accounts organized street protests in America, in an apparent attempt to steer U.S. citizens into confrontation.

One of the clues which exposed the original troll farm’s foreign nature was that its members failed to turn up to their own rallies. This was noted as early as May 2016, when the troll farm promoted simultaneous rallies for and against an Islamic cultural center in Houston, Texas.

The anti-Islam rally was organized by troll Facebook page “Heart of Texas.” According to Houston Chronicle reporter John Glenn, who covered the protests, “the Heart of Texas group never showed.” The Facebook page encouraged protesters to bring weapons. Some complied.

The demonstrators themselves complained about this. According to Senator Richard Burr, a Republican of North Carolina who sits on the Senate Select Committee on Intelligence, one anti-Islam protester commented, “Heart of Texas promoted this event but we didn’t see ONE of them.”

With hindsight, this absence was seen as one of the giveaways of foreign interference. As Glenn wrote in a follow-up post once the troll farm was exposed, “I couldn’t find the rally organizers. No ‘Heart of Texas.’ I thought that was odd, and mentioned it in the article: What kind of group is a no-show at its own event? Now I know why. Apparently, the rally’s organizers were in Saint Petersburg, Russia, at the time.”

Perhaps in response, the latest Facebook pages to be removed presented a series of excuses to explain their absence from events which they had organized. Some were at least plausible, such as this one, which claimed a sudden illness on the eve of a protest.

Post by inauthentic Facebook page “Resisters” on June 30, 2017, ahead of a July 1 rally. (Source: Facebook / @resisterz)

As any schoolchild will attest, sudden illness is not the sort of thing which can be feigned too often. Other excuses were flimsy, such as this post, which claimed that the “Resisters” group would not be able to attend its own protest because its team would be at someone else’s. The responses were not complimentary.

Pre-emptive post by Resisters, which had organized a protest in honor of NFL player Colin Kaepernick for September 30, 2017. (Source: Facebook / @resisterz)

A parallel line of effort saw the group reaching out to genuine Americans on the ground, apparently to mask the organizers’ own absence.

Post by Resisters on July 27, 2017, showing the interaction with genuine Americans. We have anonymized the responses to protect the privacy of those involved. (Source: Facebook / @resisterz)

This effort culminated in an attempt to hire an on-the-ground representative whose tasks would include “hosting community events.”

Job ad posted by Resisters on July 14, 2017. Note the tasks include “hosting community events.” (Source: Facebook / @resisterz)

According to his own tweets, the Resisters page also invited U.S.-based activist Brendan Orsinger to administer it, an invitation he accepted.

Tweet by Brendan Orsinger (@ToBeSelfEvident) on his relationship with the Resisters page. Archived on August 10, 2018. (Source: Twitter / @ToBeSelfEvident)

This approach mirrored that of the original troll farm, which progressively engaged with genuine Americans to organize election-related demonstrations, especially in Florida.

Excerpt from the troll farm indictment, detailing its interaction with local American activists who agreed to serve as “local coordinators.” (Source: U.S. Department of Justice)

These efforts were not new, but appear to have been carried out more systematically, and with a greater awareness of the danger of not showing up to the rallies which the pages organized.

As such, they represented an evolution and refinement of the technique, rather than a new departure, mirroring the progression seen in the use of technical masking.

Language Clues

@DFRLab previously reported that one of the key clues to earlier Russian troll efforts was the non-native use of English by accounts which claimed to be American.

Perhaps in an attempt to reduce their linguistic footprint, the latest Facebook pages to be removed posted a low proportion of original authored content, and a very high proportion of images taken from around the internet.

The page “Mindful Being,” for example, posted almost nothing but motivational images.

Typical posts by “Mindful Being”; note the absence of text. (Source: Facebook / @mindfulbeingz)

It posted only two substantial texts. Both were taken from other sources: one acknowledged, the other plagiarized from a post published on a website called gaia.com in 2015.

Above: Post on numerology from website gaia.com, posted on August 4, 2015. Below: Identical wording in a post from @mindfulbeingz dated May 31, 2018. (Source: gaia.com / Facebook / @mindfulbeingz)
Left: Post by @mindfulbeingz on May 18, 2018. Right: Article on website yournewswire.com on the same date. (Source: Facebook / @mindfulbeingz / yournewswire.com)

Similarly, the page “Ancestral Wisdom” largely posted images without further comment.

Typical posts by “Ancestral Wisdom”; note the absence of text. (Source: Facebook / @theancestralwisdom)

This page posted only one substantial text. As @DFRLab previously reported, it was largely plagiarized from a longer, undated article on the website ancient-code.com.

Comparison of the articles posted by @theancestralwisdom and ancient-code.com, with identical wording highlighted. (Source: Facebook / @theancestralwisdom / ancient-code.com)

There are, of course, many reasons for posting images without comment; taken alone, this fact would be unexceptionable. However, posts by this group of pages which did contain original content also contained a high proportion of grammatical errors.

For example, the job ad posted by Resisters included two noteworthy grammatical errors and one curious turn of phrase in just 33 words.

The Resisters job ad, with the text and linguistic features highlighted in the inset. (Source: Facebook / @resisterz)

The ad said that the movement “came together on the wake” of Trump’s election; the idiomatic phrase is “in the wake.”

It said, “We recognize that Trump regime is illegitimate,” where the correct idiom would be “we recognize that the Trump regime is illegitimate.” Inaccuracy with the words “a” and “the” is typical of speakers of Russian and other Slavic languages, which do not use grammatical articles.

It also said, “Every effort we make is to contribute to that goal.” This is grammatically accurate, but a curious and unidiomatic turn of phrase; words such as “should” or “must” would be more characteristic of native speakers.

Similarly, this post from the Resisters page featured two apparently non-native turns of phrase in just 17 words.

Post by the Resisters page, with linguistic telltales highlighted. (Source: Facebook / @resisterz)

The post said it is American to “steal a land and then tell who can and who cannot live here.” The reference to “a land” appears to conflate the archaic term for a country with the accusation of stealing land, without the grammatical article — as seen on the T-shirt in the image. It does not read like a post written by a native speaker.

The phrase “tell who can and who cannot live here” confuses the words “tell” and “say.” This is a common mistake among non-native speakers of English.

@DFRLab’s analysis of further linguistic errors characteristic of earlier Russian troll operations can be read here.

As we have noted, there are many legitimate reasons for posting a high proportion of images without supporting text. However, in the context of known inauthentic accounts whose use of English betrayed their non-native origins, one reason could well be to mask those origins.

Not Missing The Dissing

One final point deserves mention, and that is the accounts’ attitude towards Russia. One of the ways in which earlier troll accounts betrayed themselves was their systematic promotion of Russian foreign policy narratives — chronicled, for example, here and here.

If — as the evidence suggests — the latest accounts were indeed run from Russia, the one comment any of them made directly about Russia appeared designed to mask the fact.

Post by Resisters on September 21, 2017. (Source: Facebook / @resisterz)

As this was the only post to mention Russia, there is insufficient evidence to draw a firm conclusion.

Conclusion

Facebook pointed out that whoever was behind these accounts had gone to “much greater lengths” than the original troll farm did to hide their origins and identity. It also underlined that such tactical changes are part of the “arms race” against online disinformation.

The troll accounts certainly took steps to limit the visible links between them, and mask their locations, IP addresses, phone numbers, and native language. All of these were weaknesses of the original troll farm.

The remedial steps were largely effective: Facebook itself acknowledged that it could not attribute the latest accounts as confidently as it had been able to do with the original troll farm.

The steps were not especially clever or innovative. Browsing through proxies, registering accounts with online phone numbers, and using local third parties to organize events and payments are all methods which have been seen before in various contexts, both Russian and American. As much as anything, the early troll farm’s failure to use them shows its amateurishness.

Their systematic use does point to the increasing professionalism of online trolls and disinformation efforts. Regardless of where the latest trolls were based, they were clearly more skilled than earlier groups. At the same time, Facebook’s action demonstrates that they were not skilled enough.

The latest takedown underscores that the threat to Western democracies from online disinformation groups has not gone away: it has evolved. Further research and resilience will be needed as the “arms race” between information and disinformation goes on.


Ben Nimmo is Senior Fellow for Information Defense at the Atlantic Council’s Digital Forensic Research Lab (@DFRLab).

@DFRLab is a non-partisan team dedicated to exposing disinformation in all its forms. Follow along for more from the #DigitalSherlocks.


DISCLOSURE: @DFRLab announced that we are partnering with Facebook to expand our #ElectionWatch program to identify, expose, and explain disinformation during elections around the world.

The effort is part of a broader initiative to provide independent and credible research about the role of social media in elections, as well as democracy more generally.

For more information click here.