As the European Commission is working on the new European Democracy Action Plan (EDAP) and the Digital Service Act (DSA), DRI gathered experts from civil society, academia, and policymakers on 29 September to discuss Germany’s experience in regulating technology companies. DRI's Helena Schwertheim shares some of the highlights from the conversation, held under the Chatham House Rule as part of a project financed by the German Federal Foreign Office.
The EDAP and the DSA will update how online platforms are regulated in the EU. They will tackle topics such as online disinformation, hate speech, illegal content, online political financing, and the transparency and accountability of tech platforms.
While all eyes are on Brussels, some EU member states such as Germany have already experimented with platform regulation.
Lessons learnt from Germany’s hate speech regulation
Since coming into force in 2017, the Network Enforcement Act (Netzwerkdurchsetzungsgesetz, or NetzDG) has shown that platforms can indeed be held accountable: German policymakers found that platforms were willing to apply the law. This is an encouraging finding, given that Brussels' ambitious DSA and EDAP packages will likely require online platforms to comply with new laws and regulations. Another lesson from Germany's NetzDG experience is that more can be done. The 2017 regulation is already outdated: notice-and-takedown regimes are limited in both efficacy and efficiency, and new technologies make it possible to move beyond them.
In addition, the NetzDG empowers platforms' end-users, citizens themselves, in several ways. Under the regulation, users can hold online platforms accountable through user-friendly reporting mechanisms. The NetzDG also empowers victims of hate speech by entitling them to information and data from platforms about their case, so that they can refer it to the courts and identify the perpetrator. Users are further empowered by the law's requirement that each online platform have a legal representative in Germany itself, rather than in a distant or inaccessible country. Given this success, any meaningful regulation should put people, or users, at its centre.
The NetzDG has had some success in Germany, a country where checks and balances protect freedom of expression. However, it has also created momentum for "copycat" laws in states with limited rule of law, where such laws instead threaten to shrink public spaces online and offline, undermining democratic principles and rights. To avoid this, a European counter-model should be considered: one that ensures platform transparency and safeguards for freedom of expression (such as complaint systems), encourages diversity and user choice, and promotes more reliable information from authoritative sources.
Brussels and policymakers elsewhere should also draw on other critiques of the NetzDG. While platforms report no over-blocking of non-harmful content under the regulation, this is self-reported data that lacks the independent scrutiny needed to confirm whether it is accurate. Moreover, German national law takes no precedence over the platforms' community standards, which are themselves enforced rather arbitrarily. The EU should consider requiring even and more predictable enforcement of such standards.
Other lessons learnt: transparency and access to data
An overarching theme of the workshop’s discussion was meaningful transparency and access to online platforms’ data for public interest research. Experts on disinformation, online political advertising, and algorithmic transparency shared recommendations on what this could mean:
Regulating disinformation: platforms' curation decisions (ranking some content higher than other content, or recommending certain content) do not directly infringe freedom of speech, but they do affect the availability and quality of the information people see. To discuss content curation meaningfully, however, an existing information asymmetry must be overcome: platforms hold data on their users and on society, but public interest researchers have no access to it. This not only makes independent research to inform regulation impossible, it also impedes the enforcement of regulation and the sanctioning of non-compliance. Algorithm audits need to be put in place to address this core problem of online discourse.
Likewise, online political advertising needs more public interest scrutiny, as self-regulation has not worked in this area. Despite some efforts to set up advertising archives and libraries, the available information remains incomplete, differs from country to country, and is not user-friendly. This undermines counter-speech and the collective accountability seen in offline political advertising. Regulation should make advertising archives mandatory and prescribe the required level of detail and user-friendliness. Disclaimers on online political advertisements should be expanded and made mandatory across platforms. Parties and candidates should likewise be obliged to publish all paid advertising on their websites, including information on targeting. An independent body should oversee these measures.
Lastly, on providing data access for research: platforms do grant limited access to selected researchers, but the process is time-intensive. In some cases, once the data is acquired, its accuracy cannot be verified, and its explanatory power is limited because some parameters are unavailable. A more robust data access framework is needed, with clear steps and requirements for access.
We discussed these findings with a high-level official from the European Commission. The two Acts are expected to be unveiled on 2 December.
We thank our panellists from the German Federal Foreign Office, the German Federal Ministry of Justice and Consumer Protection, the Jacques Delors Centre, the Free University of Berlin, the Stiftung Neue Verantwortung, AlgorithmWatch, and Reporters Without Borders for their contributions.