
Inauthentic Behaviour on TikTok - Concerning Accounts Supporting the AfD and Rassemblement National


A review of political discourse on TikTok suggests that the party Alternative für Deutschland (AfD) is supported by a significant number of highly active accounts that only serve to spread AfD content to millions of viewers. These accounts have no clear affiliations and do not provide indications of authenticity. No other German parliamentary party appears to have a similar ecosystem of murky accounts. We see a similar situation in relation to the French party Rassemblement National (RN). 

The proliferation and use of such accounts are happening in broad daylight. These accounts are easy to find. We are concerned that TikTok is not acting in line with its own community standards, the commitments it made under the Code of Practice on Disinformation, its legal obligations under the Digital Services Act (DSA), or the guidelines for the EP elections adopted by the European Commission under the DSA. 

We recommend the company carry out an urgent review of all such accounts in France and Germany and across the EU27 member states ahead of the EP elections. We have made these findings and our recommendations available to TikTok. 


There are a significant number of TikTok accounts that support the party AfD and its candidates, but their affiliation is unclear. One cannot ascertain whether they are managed by the party directly, indirectly or by third parties. They have vague self-descriptions and sometimes use the party logo or show the AfD website. On substance, they do the same as official party and candidate accounts – spreading party content. 

These accounts include: 

Wirsinddasvolkde - 94K followers 

afd_dr.aliceweidel - 82K followers 

_alice_weidel_fanpage_ - 76K followers 

bastion.afd - 50K followers 

chaosonworld - 47K followers 

AfD.TikTok - 40K followers 

afd_bestofclips - 36K followers 

maxkrah_doktorrechts - 26K followers 

This is only a selection of the bigger accounts that are clearly used for AfD campaigning. There are dozens of similar accounts across TikTok, some of which are pure doppelgänger accounts for certain candidates. Some accounts do not even hide their character (for example, bastion.afd_backup). 

In March, TikTok restricted the account of the AfD’s lead candidate for the EP elections, Maximilian Krah, stating that it had repeatedly violated community standards. There is, however, another outlet for Mr. Krah’s campaign: a doppelgänger account with 26K followers (see above). We did not find any other German parliamentary party with such an ecosystem of murky accounts. 

We see a similar situation in relation to the French Rassemblement National and Jordan Bardella, its lead candidate for the EP elections. These are some of the biggest support accounts with unclear affiliations:  

Jordanbadellaoff - 69K followers 

rassemblementnational2.0 - 53K followers  

rassemblementnational_fr - 16K followers 

Team_jordan_bardella - 14K followers 

Again, these accounts have similar features. They hardly follow any other account, and their affiliation is unclear. The account rassemblementnational_fr claims to be the “non-official account of the Rassemblement National presided by @jordanbardella”. 


It is problematic for the authenticity of online debates when one party manages numerous other accounts that are not clearly labelled as party accounts or when accounts of unclear affiliation only serve to spread content by one party. The practice undercuts political accountability (the party can always argue “it is not our account”) and it allows for coordinated behaviour across such accounts to bring algorithmic prominence to party content. We assess that this practice violates various aspects of self-regulation and law. 

a) Self-Regulation: TikTok Community Standards 

TikTok established the category of Government, Politician and Political Party (GPPP) accounts. These accounts are excluded from some functions available to other users, such as monetization and the right to post ads. This approach is not transparent, because users cannot see which accounts are labelled as GPPP. Furthermore, many of the AfD- and RN-supporting accounts are unlikely to be classified as GPPP. Users may be misled into considering them accounts run by ordinary users, or the reverse: official accounts that represent the party. TikTok thus consciously permits a significant grey zone in its policy on political content. 

TikTok’s guidelines on electoral integrity stress that the platform will fight disinformation and keep the platform available only for authentic content. Beyond elections, TikTok promises that users should be able to “feel confident that they can access information that is reliable, discover original content, and engage with people who are authentic”. 

These accounts do not conform to the standard of authenticity. Users cannot ascertain whether the accounts are managed by the AfD/RN or not. Their behaviour clearly points to systematic party support rather than authentic users, but, on the other hand, allows plausible deniability for the party. 

TikTok also promises not to allow artificial increases in engagement on the platform. However, a network of murky accounts is an invitation to artificial engagement through cross-posting. 

b) Soft Law Commitments: Code of Practice on Disinformation 

TikTok is a signatory of the Code of Practice on Disinformation. Under the Code, the company committed itself to “adopt, reinforce and implement clear policies regarding impermissible manipulative behaviours and practices on their services” (Paragraph 14.1). The unrestricted spread of murky accounts supporting the AfD and RN suggests that this commitment is not being fulfilled. 

c) Digital Services Act 

According to Article 34, Very Large Online Platforms (VLOPs) are obliged to carry out risk assessments of their platforms to identify “any actual or foreseeable negative effects on civic discourse and electoral processes”. Such “assessments shall also analyse whether and how the risks (...) are influenced by intentional manipulation of their service, including by inauthentic use or automated exploitation of the service.” According to Article 35, companies must mitigate such risks. The mentioned accounts have significant followerships, and many post videos that garner massive viewership (millions of views) in the run-up to the June EP elections. The proliferation of numerous TikTok accounts with unclear affiliation, designed to spread AfD and RN content, does not suggest that TikTok has assessed this risk or is mitigating it. 

The European Commission’s “Guidelines for providers of Very Large Online Platforms and Very Large Online Search Engines on the mitigation of systemic risks for electoral processes”, adopted under the DSA, stress the need for platforms to enforce rules against inauthentic accounts, whether created manually or automatically, and against problems such as fake engagement, non-transparent promotion of influencers, and the coordination of inauthentic content creation and behaviour. Again, the network of the mentioned accounts does not appear to conform to these recommendations. 

Our recommendations

  1. TikTok should urgently apply its community standards, commitments against disinformation and legal obligations to accounts with an unclear affiliation that spread content for the AfD and the RN. It should search its platform systematically for such accounts, looking at all politicians of these parties and accounts that are geared to these parties. It should do similar research for other political parties. Our findings are based on a spot-check rather than a comprehensive analysis of all such accounts. TikTok needs to examine its own platform in greater detail.
  2. TikTok should urgently undertake similar investigations in all 27 EU member states and apply the mentioned standards and obligations to ensure that the campaigning for the EP elections is authentic and marked by integrity. We undertook brief checks that suggest that the problem is not limited to Germany and France.