Spam Comments Flood Serbian President’s Facebook Post

Suspicious accounts flooded the Serbian President’s official Facebook page with a pro-Kosovo Army hashtag

(Source: @KaranKanishk/DFRLab via Flickr)

In December 2018, a series of accounts launched a coordinated spam attack on Serbian President Aleksandar Vučić’s official Facebook page. The accounts targeted a post by Vučić in which he expressed his opposition to Kosovo’s decision to form its own army independent of Serbia.

The spam attack hijacked the online discussion of a sensitive policy issue. Since Kosovo declared independence from Serbia in 2008, relations between the two governments have been tense.

Since its declaration of independence, Kosovo’s security force has been governed by NATO standards, including a requirement that it be trained by NATO. But after the Kosovar Parliament decided to proceed with amendments to the constitution to establish an independent army, the hashtag #KosovoArmy surged in popularity online.

The attack demonstrated how social networking platforms can serve as virtual arenas for geopolitical conflict, especially with regard to sensitive policy issues.

Spammers or Trolls?

It is worth distinguishing the spamming behavior of the accounts in question from trolling. Spamming and trolling share some characteristics, with the most notable being that they both aim to disrupt or derail online conversation. Spamming differs from trolling, however, in its sheer volume. It often involves hijacking ongoing discussions or topics by posting a deluge of similar — and sometimes identical — posts from either a single account or multiple accounts. Spam comments also do not necessarily relate to the topic at hand: while some further a unified narrative, others attempt instead to saturate the online discussion with nonsensical messaging.

Trolling, in contrast, is often targeted. It is specifically designed to provoke selected users by engaging them in direct discussion and presenting controversial positions. In its sheer volume and uniformity of content, the #KosovoArmy attack resembled a spam campaign rather than a troll operation.

#KosovoArmy Floods Vučić’s Facebook Post

On December 14, 2018, Vučić’s Facebook page witnessed a massive surge in comments on its posts.

The massive surge in comments on December 14, 2018, on Serbian President Vučić’s Facebook posts. (Source: @KaranKanishk/DFRLab)

The surge coincided with the official Facebook page publishing its post expressing Vučić’s opposition to an independent Kosovo Army.

A post to Vučić’s Facebook page that linked to a summary opinion of his opposition to an independent Kosovo army, translated from Serbian: “Serbia will not give up its tricolor, its dignity, and pride.” (Source: vucicaleksandar/archive)

Many of the comments featured the same hashtag, #KosovoArmy, or a variation, #KosovaArmy. While the accounts published the comments within minutes of each other, this operation did not appear to be bot- or app-driven, as many of the comments had minor inconsistencies in text. In contrast, when a bot or other application drives coordinated inauthentic posting, the posts tend to be identical. Nevertheless, the high number of likes on the comments and the similarities in content suggested a degree of coordination.

#KosovoArmy spam comments on Serbian President’s post. (Source: Facebook)

Many of the accounts using the hashtag had their location set to Kosovo in their profiles.

Profiles with Kosovo set as their location. (Source: Facebook)

Furthermore, many of the posts included emojis of the Albanian and U.S. flags; both countries have expressed support for Kosovo’s decision to establish its own army. The spam comments may have originated from sources standing in solidarity with the Kosovo government’s decision, but they also may have originated from sources hoping to inflame tensions between Kosovo and Serbia. The DFRLab considers the comments to be spam, in part, because they lacked a target, an essential component of trolling behavior.

Signs of Coordination: Liking Patterns

Some of the Facebook pages that liked the comments displayed signs of coordinated inauthentic activity. For example, all of the top “Most Relevant” comments had likes from the same network of otherwise inactive Facebook pages.

Network of Facebook pages liking comments of pro-military Kosovar accounts. Colors indicate the same pages liking different comments. (Source: @KaranKanishk/DFRLab via Facebook)

The Facebook pages showing this behavior had been inactive for about a year. That they had liked the same comments, often in the same order, however, suggested a degree of coordination.
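The check described above, that the same set of pages liked the same comments, often in the same order, can be sketched in code. The snippet below is a minimal illustration, not the DFRLab’s actual tooling; the data structure, function names, and threshold are all hypothetical, assuming liker page IDs are available per comment in the order the likes were recorded.

```python
def liker_overlap(likes_a, likes_b):
    """Jaccard similarity between two comments' sets of liker page IDs."""
    set_a, set_b = set(likes_a), set(likes_b)
    if not set_a and not set_b:
        return 0.0
    return len(set_a & set_b) / len(set_a | set_b)

def same_order(likes_a, likes_b):
    """True if the pages shared by both comments liked them in the same relative order."""
    shared = set(likes_a) & set(likes_b)
    order_a = [p for p in likes_a if p in shared]
    order_b = [p for p in likes_b if p in shared]
    return order_a == order_b

def flag_coordinated(comment_likes, threshold=0.8):
    """Return comment pairs whose liker networks overlap heavily.

    comment_likes: dict mapping comment ID -> ordered list of liker page IDs
                   (hypothetical input format).
    Returns tuples of (comment_a, comment_b, similarity, same_order_flag).
    """
    flagged = []
    ids = sorted(comment_likes)
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            sim = liker_overlap(comment_likes[a], comment_likes[b])
            if sim >= threshold:
                flagged.append((a, b, sim, same_order(comment_likes[a], comment_likes[b])))
    return flagged
```

On the pattern observed here, otherwise inactive pages liking the same comments in the same sequence, such a check would surface comment pairs with both high overlap and an order match, a combination unlikely to arise from organic engagement.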

Pages that liked different posts related to the Kosovo Army. (Source: FinnishEmbassyPristina/archive)

The sheer number of comments on Vučić’s post demonstrated that, in large volumes, coordinated attempts can saturate discussions on official government communications. In doing so, these attempts bury authentic debate and the exchange of ideas among members of the public.


Kanishk Karan is a Research Assistant with the Digital Forensic Research Lab.

Follow along for more in-depth analysis from our #DigitalSherlocks.