Read this in German here.
In this month's research brief, we turn to the bogeyman of recent elections: the coordinated spread of misleading content, often called disinformation or misinformation. We dive into some examples and show the bad news, but mostly the good news, about how such content is spreading in the 2021 German election.
Key takeaways
- The extensive sharing of YouTube videos on Facebook, which we touched upon in our first research brief, also appears here when we look at coordinated sharing.
- Many of the Facebook groups we examined share administrators and members, and they post the same content. This lets members like the same content in multiple places, exaggerating its popularity and engagement rates. The user accounts may not be inauthentic, but the popularity of the content is.
- There have been attempts on social media to push election manipulation narratives. Neither of the two attempts we examined has succeeded. However, the repeated use of this tactic means we must remain vigilant.
- While we have not seen a viral piece of manipulative content, the sheer amount of problematic content raises the worry of 'death by a thousand cuts': a weariness toward, and confusion about, election news.
A look back
The SPD emerges from under the radar
Remember our first research brief, in which we noted that Olaf Scholz and the SPD were apparently not seen as a threat, given how little negative campaigning was directed against them (as seen here)?
It seems, based on the current polls, that being off everyone's radar has its benefits.
The latest polls displaying the SPD's increase in public approval compared to the 2017 election
Despite the candidate's sobriety, Scholz's popularity ratings are rising, and users have been talking about him (red stands for Scholz) more and more on social media ever since the three-way TV debate, the 'Triell', on 29 August.
Posts about the candidates on Facebook before and after the TV Triell
This is an interesting turn of events, as his professionalized, staged, and hence very static social media campaign, free of direct interaction with the community, does not differ much from Laschet's or Baerbock's.
Yet opposition attention on social media focused on the mistakes and scandals of the other candidates, allowing Scholz's own mistakes and (quite complex) political baggage to go unremarked upon. Instead, Scholz's face is everywhere in the SPD campaign: Scholz will sort it out, just as he has quietly done as a member of the governing coalition these past years...much like (as the SPD wants you to believe) current Chancellor Merkel.
Will the other parties respond and turn their sights on the SPD? While we have not yet noticed an increase in anti-SPD or anti-Scholz hashtags, we will be on the lookout in the coming weeks as this increasingly close race enters the home stretch.
Stories from our research process
False leads on fake accounts
When looking for suspicious sharing on Twitter, we were initially quite interested to see many tweets that were 100% matches! Surely, we thought, this must be a coordinated campaign, posting the same message from many, possibly fake accounts.
Instead, we often found users who post the same message every day...for months on end. One account posted a daily reminder that the Green party would bring about our demise; another posted several links a day about prominent past scandals involving CDU candidate Armin Laschet. These were not isolated cases: numerous accounts did the same for many of the parties.
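For the technically curious, the matching step itself is simple. Below is a minimal Python sketch of the idea, not our actual tooling; the field names ("text", "author") and the threshold are illustrative assumptions:

```python
from collections import Counter, defaultdict

def repeated_texts(tweets, min_posts=20):
    """Group tweets by identical text; report post counts and distinct authors.

    Many posts of one text by few distinct authors points to one persistent
    account rather than a coordinated multi-account campaign.
    """
    post_counts = Counter()
    authors = defaultdict(set)
    for tweet in tweets:
        post_counts[tweet["text"]] += 1
        authors[tweet["text"]].add(tweet["author"])
    return [
        (text, count, len(authors[text]))
        for text, count in post_counts.most_common()
        if count >= min_posts
    ]
```

Counting distinct authors alongside raw post counts is what exposes the twist: many texts with dozens of posts, but only one author behind them.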
So are these accounts bots? Some certainly resemble bots, given their retweet rates, but there are many online tools for scheduling and automating tweets, and influencers and everyday people alike use them.
So not all is as it seems on social media: a finding which should surprise nobody.
Deep dive
Coordination in content sharing
We have yet to see a “big lie” in this year's German elections. There has been a lot of potentially manipulative content, but it has been notable for its volume, not its popularity or reach. The majority of this content builds on easily observed truths and plays on stereotypes, with tiny lies and exaggerations embedded within, hard to disentangle.
These characteristics make finding 'fake news' a hard task. Likewise, finding fake users is hard, as we have all increasingly taken to social media during the pandemic. So, in today's dive, let's not think too much about fake versus real, bots versus humans; most social media is a mixture of all of these. Let us instead explore social media behaviour around the sharing of problematic content, which will show us:
- Coordinated sharing of political content occurs in many circles: it is not just bad actors.
- Many mid-sized AfD Facebook groups share administrators and also members – and share the same content. This raises some concerns about authenticity but is not in itself manipulative.
- However, the sheer number of online groups supporting the AfD amplifies the advantages of coordination: it allows them to boost a piece of content so that it seems more popular, or to inflate engagement rates in AfD-affiliated groups on social media.
- Despite this power, attempts to push more controversial narratives have not moved outside the AfD’s network of supporters.
It takes a village: Comment, like, share
A primary mode of spreading one's message is to coordinate with others to share content at around the same time. This push catches the attention of recommendation algorithms, which can then amplify the message by distributing the content more broadly: 'gaming' the algorithm, so to speak.
To get a message to spread well, actors in a network should have ties to a diverse group of people, enabling them to reach larger and novel networks. Here, you can see an example of a 'normal' sharing network of a popular video, in this case a CDU-critical video about the climate crisis recently posted by German YouTuber Rezo. Rezo's diverse reach activated several distinct sharing networks, some of which then reached into other network clusters.
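For readers who want to picture how such network visuals come about, here is a rough Python sketch of one common construction: link any two groups that shared the same URL within a short time window. The field names and data layout are our illustrative assumptions, not the exact pipeline behind our figures.

```python
from itertools import combinations
import networkx as nx

WINDOW_SECONDS = 240  # the four-minute window used in the visuals below

def build_coshare_graph(shares):
    """shares: dicts with keys "group", "url", "timestamp" (epoch seconds).

    Two groups are connected if they shared the same URL within the window;
    the connected components roughly correspond to the coloured clusters.
    """
    by_url = {}
    for share in shares:
        by_url.setdefault(share["url"], []).append(share)
    graph = nx.Graph()
    for events in by_url.values():
        for a, b in combinations(events, 2):
            if (a["group"] != b["group"]
                    and abs(a["timestamp"] - b["timestamp"]) <= WINDOW_SECONDS):
                graph.add_edge(a["group"], b["group"])
    return graph
```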
The birth of a viral video: Shares within a four minute window
Let us also look at a website with opinions from the right wing of the political spectrum: reitschuster.de. Here, we again see a loose sharing network linked to two tighter networks. Looking further at who is sharing, we see that the yellow cluster consists of AfD groups, with pro-Orbán Hungarian groups and groups opposing Corona measures smattered throughout. Despite reitschuster.de's claim to be “without ideology”, which could mislead new visitors to the site, its fanbase has a quite clear ideological leaning.
Reitschuster article shares
If an actor lacks a diverse network, the content will remain in the group's echo chamber. For instance, in the next image, the seven most viewed videos about the AfD, some positive, some negative, are being shared in a coordinated fashion. The AfD bubble is particularly immune to other groups coming in to share; the other bubble, however, does have some AfD presence, as the videos shared there include documentary-style content that both AfD and non-AfD persons would find of interest.
Bubbles!
Bubbles can be broken, though: through a group's followers, its content can be pushed into larger circles via forms of engagement other than sharing (likes, comments, etc.).
If the share of people within a certain group of users who engage with a message is higher than would be expected (what Facebook calls 'overperforming'), the algorithms infer that the content could also resonate with other people and show it to more users. If the message does indeed resonate, even if only by provoking outrage and a reaction, the algorithm concludes it was right and spreads the message further, setting a snowball effect in motion.
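As a rough illustration of the 'overperforming' idea, consider the toy score below. This is our own simplified version for illustration, not Facebook's (or CrowdTangle's) actual formula.

```python
def overperforming_score(post_engagement: int, recent_engagements: list[int]) -> float:
    """Ratio of a post's engagement to the page's recent average.

    A score well above 1.0 suggests the post resonates unusually strongly,
    which ranking algorithms tend to reward with wider distribution.
    """
    if not recent_engagements:
        raise ValueError("need a baseline of past posts")
    baseline = sum(recent_engagements) / len(recent_engagements)
    return post_engagement / baseline if baseline else float("inf")

# Example: 900 interactions on a page whose recent posts average 300 -> 3.0
print(overperforming_score(900, [250, 300, 350]))
```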
Three different types of coordinated behaviour
- Automated bots that are programmed to share, re-tweet, or even engage with content (e.g., like, comment)
- Political organizations and persons that are linked together and jointly decide in advance to push some content
- Active political organizations and persons who are passionately engaged in an issue and are quickly alerted to new content within their area of interest, which acts as a signal to engage with the content. This uncoordinated coordination is sometimes called stigmergy: the coordination of multiple actors through signals in their shared environment.
From theory to practice: Sharing the anti-Baerbock message
In our last research brief, we looked at comments on YouTube videos returned in searches for the different candidates. As we saw in issue one, YouTube videos are often shared on Facebook. So let us look at some of these sharing patterns on Facebook: specifically, at sharing within Facebook groups and pages, and at whether there is coordination in the efforts to push certain content.
Baerbock is the favourite target of many social media actors, so let's focus on her. Within the top 100 most viewed YouTube videos, we identified 16 that target Baerbock to varying degrees, from simply questioning her political experience and career to more conspiratorial videos about her behaviour.
When we trace how the anti-Baerbock videos were shared among Facebook users in different clusters, some patterns become visible. A key indicator of coordination is the sharing of a video link within a short period of time: typically 4 minutes or less.
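In code, that indicator might look like the following sliding-window sketch. It is a simplified stand-in for our analysis; the data layout and the five-group threshold are illustrative assumptions.

```python
from collections import deque

WINDOW_SECONDS = 240  # "4 minutes or less"

def coordinated_bursts(shares_of_one_link, min_groups=5):
    """shares_of_one_link: (timestamp, group) pairs for a single video link.

    Yields (window_start, window_end, groups) whenever at least
    `min_groups` distinct groups share the link within the window.
    """
    window, bursts = deque(), []
    for ts, group in sorted(shares_of_one_link):
        window.append((ts, group))
        while ts - window[0][0] > WINDOW_SECONDS:
            window.popleft()
        groups = {g for _, g in window}
        if len(groups) >= min_groups:
            bursts.append((window[0][0], ts, sorted(groups)))
    return bursts
```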
Sharing of anti-Baerbock videos
Each coloured cluster shares the videos within a small time window. In the main blue cluster, with the tightest sharing, the groups' names suggest they are closely aligned: from 'AfD-Stammtisch' via 'AfD-FanCLUB' to 'Unterstützer der AFD Nr. X' (followed by different numbers).
The visual also shows the gateways to new clusters that were mentioned above, reaching violet and yellow dots that form their own networks to share the content. If we were to extend the sharing time to 30 minutes, and then several hours, we would see these smaller clusters grow and then connect to yet other clusters.
The fact that some of the groups have near-identical names, differing only by a number (e.g., 'Unterstützer der AFD Nr. 3', in English 'Supporters of the AfD No. 3'), warrants a further look. Examining the groups' structure reveals that they share several of the same administrators, including accounts which, while bearing human names, are devoted entirely to political content, and persons identified (truthfully or not) as working for the AfD. Not really a grassroots initiative, then. Even the groups with more varied names, such as 'AfD-Treffpunkt 🖤❤️💛', 'AfD-Stammtisch' and 'AfD 51% - das ist unser Ziel !!!', share these same administrators.
We cannot establish whether these accounts belong to real people, so we must give them the benefit of the doubt. However, while the people may not be inauthentic, the behaviour could be (per Facebook's own definitions), and the popularity of the content they are promoting certainly is.
Each of these groups has on average a thousand to several thousand members. However, many persons are members of several of the groups. When the same content is posted in each group, as is happening, an individual can 'like' the content more than once. One thousand people in 10 groups can like a shared video 10,000 times.
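A toy calculation makes this inflation concrete. The data layout is hypothetical; the numbers mirror the example above.

```python
def raw_vs_unique_likes(likes):
    """likes: (user, group) pairs for one cross-posted video."""
    return len(likes), len({user for user, _ in likes})

# 1,000 users, each a member of the same 10 groups, each liking every copy:
likes = [(user, group) for user in range(1_000) for group in range(10)]
print(raw_vs_unique_likes(likes))  # (10000, 1000): 10,000 likes, 1,000 people
```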
The community spreading anti-Baerbock videos thus seems driven by the further right-wing of German politics, especially by AfD supporters. This is consistent with our findings from our first research brief, which showed that AfD supporters are also driving anti-Green hashtags on Twitter.
While some of the videos on the list above appear outright crazy and obviously false, if such messages were disguised better, coming across as legitimate news (as reitschuster.de tries to do), pieces of misinformation could easily infiltrate larger parts of the electorate. This is at its most dangerous when it interacts with traditional media.
A failed false narrative push
Indeed, we identified a failed attempt to do just this: push and spin a mainstream news story to spread a false narrative. While doing our YouTube sharing research, we were also examining the messages posted in Facebook groups for similarities. Unsurprisingly, Alice Weidel's comments are copied and/or shared most often, surpassed in frequency only by the financial spam scams that haunt all public Facebook groups.
This led us to one post from 9 July that was reposted particularly often. It alleges that Chancellor Angela Merkel's hosting of the judges of the constitutional court for dinner interfered with the AfD's lawsuit against her for alleged illegal interference in the 2020 Thuringia state elections. The post links to a Bild article (“Merkel lädt Verfassungsrichter ins Kanzleramt”, in English “Merkel invites constitutional court judges to the Chancellery”).
When we look at sharing beyond Weidel's accounts, we see the article did indeed reach further and was shared in close succession by many AfD-supporting and/or AfD-affiliated Facebook groups and pages – including some of the aforementioned accounts with shared administrators.
AfD-only sharing of election interference content
Let's take a look at the sharing behaviour within a one-minute window.
This sharing network is unlike most we see. Even when we widen the allowed time between shares, which implies less coordination but still shared interests and network effects, the shape of this network changes little. The tight-knit AfD sharing circle was never joined by other clusters.
As we reported previously, the attempt by some AfD supporters to push an election fraud narrative around the Saxony-Anhalt state election in June this year largely failed to travel beyond the circle of groups and persons already primed to believe the tale. This 'unrelated' push of a similar narrative one month later, albeit about interference rather than fraud, should be seen as not unrelated at all.
While both were unsuccessful, we should remain vigilant for further attempts when the next vote happens in less than three weeks.