The DSA now applies in full: What can we expect?

The EU’s landmark Digital Services Act (DSA) has applied to very large online platforms and search engines (VLOPs/VLOSEs), such as Google, Facebook, Instagram, TikTok and X, since August 2023.

But on 17 February 2024, the DSA’s full slate of rules started applying to all search engines, online marketplaces, niche social media platforms and video-sharing apps operating within the EU. This development brings many changes for these service providers, for their millions of users, and for the researchers and civil society organisations studying the impact of platform practices on democracy. 

What have we learned in the six months since the DSA began applying to VLOPs/VLOSEs? What changes took effect this week? And what’s on the horizon for DSA enforcement in this crucial election year? Here’s an overview.  

What have we learned over the past six months? 

The European Commission defines a VLOP/VLOSE as any online platform or search engine with over 45 million monthly active users within the EU. To date, the Commission has identified 22 such online service providers.

While some of these providers, such as three pornography sites, were only recently designated as VLOPs/VLOSEs by the European Commission, others, such as X, Facebook, TikTok, Instagram and Snapchat, already received that designation in early 2023. That means we already have some insights into these platforms’ operations, as well as how the European Commission is beginning to enforce certain provisions of the DSA. 

In October 2023, all VLOPs/VLOSEs had to submit their first reports on content moderation to the European Commission. According to an analysis of these reports conducted by the Slovak Council for Media Services: 

  • By and large, service providers treat government orders as the top priority in their content moderation efforts, while user-driven notice-and-action mechanisms differ greatly between services and appear to receive less attention. VLOPs/VLOSEs report receiving very few government orders to act against suspected illegal content, and report having acted on only a very small number of user-submitted notices. This suggests a lack of clarity about companies’ reporting obligations, difficulty on regulators’ part in processing and issuing content-removal orders, and potential attempts to discourage users from submitting notices themselves.
  • Meanwhile, only about half of VLOPs/VLOSEs provided information on their methods for detecting harmful or illegal content. Most content moderation decisions are automated, and many services dedicate human moderators to only a few EU languages. Platforms also appear to favour limiting the visibility of problematic content rather than removing it altogether. This suggests that the technical processes feeding into content moderation remain a black box; that current measures fail to effectively punish or prevent the publication of illicit or harmful content; and that some EU citizens aren’t afforded adequate safeguards in their mother tongues.  

As the primary regulator of VLOPs/VLOSEs, the European Commission appears to be taking its enforcement role seriously. To date, the institution has sent 33 formal requests for information to a range of service providers, including Meta, TikTok and Google, covering a range of DSA obligations. The Commission has also opened formal proceedings against X, examining the platform’s handling of illicit content, whether its blue-checkmark policy deceives users, and whether the social network is acting in good faith to provide researchers with access to platform data. And the European Commission recently opened public consultations on its draft guidelines for VLOPs/VLOSEs on mitigating systemic risks to electoral processes. 

Despite this progress, the European Commission has explicitly appealed to civil society organisations and researchers to help it generate the evidence it needs to enforce the DSA as it pertains to VLOPs/VLOSEs. This is a welcome appeal, but one with a number of logistical hurdles. 

The position of civil society organisations vis-à-vis platforms is akin to that of David and Goliath: these organisations may have the expertise to hold platforms accountable, but their resources pale in comparison. There are also questions about how exactly civil society organisations can gather evidence. While some social media platforms have offered researchers access to some content posted on their sites, that access is in many cases difficult and incomplete. 

What changes take full effect this week? 

As the EU’s (and the world’s) largest online service providers acclimatise to their duties under the DSA, and the European Commission and civil society actors work out initial growing pains in enforcement, the full spectrum of the DSA’s rules came into force on 17 February 2024.

Now, virtually all online service providers in the EU – not just the bloc’s most prominent players – face new reporting duties, and a new regulatory and enforcement architecture will be erected to ensure consistent application of the DSA from member state level upwards.  

For their part, VLOPs/VLOSEs now have to assess the systemic risks of their services for society, take steps to mitigate these risks, and open themselves up to annual third-party audits. 

In addition to VLOPs/VLOSEs, small- and medium-sized online service providers now also fall under the DSA’s remit. They won’t face as many reporting obligations as their larger counterparts, but they will still be required to: 

  • Ensure their terms and conditions clearly explain content moderation and recommendation practices; 
  • Provide clear avenues for users to report possibly illegal content, and to contest platforms’ content moderation choices; 
  • Clearly label advertising, and make sure minors’ data isn’t being used for ad profiling.  

Perhaps most importantly, each member state must name its Digital Services Coordinator (DSC): a national regulator or agency responsible for ensuring that small- and medium-sized online service providers in its jurisdiction adhere to the DSA’s rules.  

As Julian Jaursch from Stiftung Neue Verantwortung outlines, DSCs will be crucial to DSA enforcement. Here’s a rundown of some of their most important duties: 

  • Field user complaints about online services: by taking action themselves, in the case of services headquartered in their jurisdiction; by passing complaints to a counterpart in another member state, in the case of services headquartered outside their jurisdiction; by forwarding complaints directly to the European Commission, if they pertain to VLOPs/VLOSEs; or by bringing in another national regulator, depending on the specifics of a complaint (a simplified sketch of this routing logic follows this list).
  • Analyse and aggregate data on all complaints filed, providing evidence of the successes and pitfalls of DSA enforcement.
  • Collect research applications for access to platform data, including VLOP/VLOSE data, vetting requests themselves in the case of platforms headquartered within their jurisdiction, or passing requests along to the DSC responsible for a platform headquartered in another jurisdiction.
  • Vet applications from civil society organisations, journalists and other watchdogs to become “trusted flaggers”: organisations whose notices about content on any given platform are handled in an expedited manner compared to those of other organisations or individual users.  
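For readers who think in flowcharts, the complaint-routing duty above boils down to a short decision tree. The sketch below is purely illustrative: the names (Complaint, route_complaint) are invented for this post, and the logic simplifies the rules as described above, leaving out the fourth branch of referrals to other national regulators.

```python
# Illustrative sketch only: a toy model of the DSA complaint routing
# described in this post. These names and types are hypothetical and
# do not correspond to any real system or API.

from dataclasses import dataclass

@dataclass
class Complaint:
    service: str          # the online service the complaint concerns
    service_hq: str       # member state where the service is established
    is_vlop_vlose: bool   # is the service a designated VLOP/VLOSE?

def route_complaint(complaint: Complaint, dsc_state: str) -> str:
    """Return which body handles a complaint filed with one member state's DSC."""
    if complaint.is_vlop_vlose:
        # Complaints about designated VLOPs/VLOSEs go to the European Commission.
        return "European Commission"
    if complaint.service_hq == dsc_state:
        # The DSC of the member state of establishment acts itself.
        return f"DSC of {dsc_state}"
    # Otherwise the complaint passes to the DSC where the service is established.
    return f"DSC of {complaint.service_hq}"

# Example: a complaint filed with Ireland's DSC about a (hypothetical) service
# established in France is forwarded to the French DSC.
print(route_complaint(Complaint("ExampleApp", "France", False), "Ireland"))
```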

What can we expect looking ahead to a super-election year? 

From the United States to Indonesia, from India to the European Parliament, more than 2 billion people – nearly one-quarter of the world’s population – will head to the polls in 2024.

With rapid technological developments in generative AI, headline-grabbing global conflicts dividing society, and democracy in decline, the potential that mis- and disinformation, hate speech, and foreign information manipulation will impact the integrity of these elections is a clear and present danger. In January, the World Economic Forum went so far as to name mis- and disinformation as the top short-term risk facing societies around the globe.

That’s also why turbocharging DSA enforcement this year is so crucial. Here’s what we’ll be keeping a close eye on now that the DSA has taken full effect.

DSA enforcement at the member-state level, and the architecture of the DSCs  

A number of member states have already passed legislation naming the national authority that will serve as their DSC under the DSA; others are still in the process of doing so. This means the DSA takes full effect without all member state regulators up and running, leaving the potential for hiccups in the first months of full-scale enforcement. 

What’s more, member states have been left with a lot of discretion in choosing which national authority takes on the role of DSC: some have chosen existing media regulators, others data protection authorities, all with varying degrees of competence in working with online service providers, as well as varying levels of the human and technical resources needed for the job.  

In some member states, questions remain about the independence of regulators vis-à-vis government and industry interests. Add to that differences in member state rules about what constitutes hate speech or illegal online content, and it becomes clear that the European Commission will have a mammoth task in coordinating and organising member state DSCs; and watchdog organisations will need to sharpen their oversight of how DSCs operate across the bloc in order to ensure worthwhile enforcement. 

Resources and data access: The Achilles’ heel of DSA enforcement 

Civil society organisations and independent researchers play a critical role in providing a check on official bodies’ enforcement of the DSA. That role is codified throughout various articles of the DSA. Even so, as full enforcement begins, it remains to be seen how these roles will be realised in practice.  

For one, EU lawmakers are still finalising the rulebook on vetted researchers’ access to data from VLOPs/VLOSEs, and other official frameworks will be needed to provide guidance to DSCs about how they should process applications for civil society and other watchdog organisations to receive “trusted flagger” status.  

Even if guidance on the data access issue is expedited in time for the European Parliament elections in June, it’s unclear how these organisations will cobble together the resources needed to sustainably take on this work. The European Commission is considering issuing service contracts to well-known organisations to engage in DSA-related research into systemic risks. It’s also indicated that clarifying guidance for data access remains a top priority ahead of this year’s most closely watched European elections. But at the current juncture, question marks hover over this potential Achilles’ heel of successful DSA enforcement. 

Platforms innovate on systemic risk mitigation – while failing to honour commitments 

Even before the DSA took full effect, many platforms signalled that they were already taking steps to address acute risks to election integrity. Both Google and Microsoft have joined Adobe, Intel and other tech companies in the Coalition for Content Provenance and Authenticity (C2PA), an industry cohort providing standards for tracing the origins of synthetic media spread online. Meta has announced plans to launch new technology across Facebook, Instagram and Threads this year capable of identifying and cataloguing AI-generated media produced by other firms’ AI tools. OpenAI, of ChatGPT fame, is slated to introduce a new set of tools to detect disinformation. For its part, TikTok announced plans to set up app-based “election centres” to combat misinformation and deepfakes ahead of the EU elections. Meanwhile, nearly all of these firms have joined a Biden Administration consortium dedicated to safe practices in AI development.  

That being said, Meta recently joined TikTok in suing the European Commission over the size of the levy that funds enforcement of the DSA. And a recent analysis by the European Fact-Checking Standards Network found that a number of VLOPs/VLOSEs have failed to uphold information integrity standards they voluntarily committed to under the Code of Practice on Disinformation, which is slated to become a binding Code of Conduct under the DSA.