Digital Services Act – a way out of the disinformation crisis?

//Tahireh Panahi//

In times of polycrisis, the EU is also experiencing a disinformation crisis: state actors, troll armies, politicians, but also private individuals deliberately spread untrue factual claims on social media. The consequences include a loss of trust in democracy and the media, polarisation, and the destabilisation of society. As disinformation is closely linked to hate speech, marginalised groups are particularly affected and can be permanently traumatised or even threatened in their existence.1 Initial cases already show that the upcoming European elections will be significantly affected by disinformation campaigns.2

A key factor in the spread of disinformation is the technical design of social media, which allows false information to be distributed quickly, en masse, directly and, above all, negligently. Service providers contribute to this problem by maintaining an attention economy, non-transparent algorithms and negligent or even arbitrary content moderation. On top of this, there are new forms of AI-generated information manipulation, such as deepfakes.

However, legislators face a dilemma when it comes to regulating disinformation: on the one hand, disinformation can impair the free opinion-forming process, which is the basis for the fundamental communication rights and for democracy (Art. 11 para. 1 CFR, Art. 10 para. 1 ECHR; Art. 2 TEU). On the other hand, measures against disinformation can affect the very same legally protected interests.

The EU has risen to this challenge and has made major changes to platform regulation, which also cover the topic of disinformation. The Digital Services Act (DSA)3, the central regulation in this area, is being labelled by EU representatives with very grand expressions, e.g. “a new global golden standard for tech-regulation”.4 Since August 2023, the DSA has already applied to “Very Large Online Platforms” and “Very Large Online Search Engines” (VLOP/VLOSE). From 17 February 2024, the DSA will also apply to all other intermediary services that offer their services in the EU.

Digital Services Act

The DSA is an EU regulation intended to harmonise the requirements for platforms in the EU. According to Recital 9, it aims to ensure a safe, predictable and trusted online environment and to tackle disinformation. The DSA will be supplemented by other regulations relating to disinformation, e.g. the AI Act, the Media Freedom Act, the Regulation on the transparency and targeting of political advertising and various Codes of Conduct. The trend shown by these regulations is towards a standardisation of the legal framework for disinformation in the EU, while leaving some space for specific regulation by the Member States.

The addressees of the DSA are intermediary services, Art. 3 lit. g DSA, namely mere conduit, caching and hosting services (the latter include online platforms).5 There are also special rules for “Very Large Online Platforms/Search Engines” (VLOP/VLOSE), which are identified on the basis that their number of average monthly active recipients in the Union reaches at least 45 million and which are designated as such by the Commission (Art. 33 para. 1 DSA).

Private messaging services are excluded from the scope of the DSA (Recital 14). However, the public communication functions of “hybrid media”6, such as Telegram, may fall under the DSA, as Recital 15 provides for a function-based application of the Regulation.

Measures against disinformation

At first glance, it is noticeable that the DSA does not contain a legal definition of disinformation, nor does the regulation specify which concrete types of disinformation are unlawful.

Nevertheless, the DSA contains measures against disinformation, ranging from repressive to preventive measures. Some could even be called “constructive measures”, as they address the improvement of the technical design of services. Some of the most relevant provisions are presented below as examples.

Repressive measures

Notice and action mechanism, Art. 16 DSA

Art. 16 DSA obliges hosting services to put notice and action mechanisms in place, which must fulfil various requirements in terms of user-friendliness. It should be noted that these mechanisms do not apply to disinformation per se, but to all content that is illegal in the EU. However, the Union and the Member States can declare (certain forms of) disinformation illegal by legislation. Only then will the notice and action mechanisms apply to disinformation.
Furthermore, there is no explicit obligation to delete notified illegal content, only an obligation to “process” notices (Art. 16 para. 6 DSA). According to Recital 54, this leaves scope for various reactions (explicitly mentioned: removal, disabling access, restriction of visibility and demonetisation). There is also no numerical deadline for this action. According to Art. 16 para. 3 DSA, notices made under Art. 16 DSA can lead to an exception to the liability privilege of Art. 6 para. 1 DSA.

Measures and protection against misuse, Art. 23 para. 1 DSA

According to Art. 23 para. 1 DSA, online platforms shall, after a prior warning, suspend the provision of their services for a reasonable period of time to recipients who frequently provide manifestly illegal content. This obligation, too, relates to illegal content, not to disinformation as such. The provision leaves platforms a certain degree of freedom to define their own criteria for suspension, but requires transparency in return. Arbitrary membership moderation is thus restricted. Although the suspension of services can constitute a severe interference with the fundamental communication rights under Art. 11 para. 1 CFR and Art. 10 para. 1 ECHR (especially in the case of VLOPs, as the user is deprived of the ability to communicate on structurally significant platforms), such interference is likely to be justified in view of the numerous transparency and proportionality requirements as well as the procedural rights that Art. 20 and 23 DSA guarantee.

Crisis response mechanism, Art. 36 DSA

In the event of a crisis, the Commission can oblige VLOPs to take the actions described in Art. 36 para. 1 lit. a – c DSA. The legal definition of a crisis as “extraordinary circumstances [that] lead to a serious threat to public security or public health” given in Art. 36 para. 2 DSA remains vague and leaves considerable discretion to the Commission. It is conceivable that mass disinformation could also constitute a crisis in this sense. According to Art. 36 para. 3 lit. c DSA, the crisis response mechanism is limited to a period not exceeding three months.

Preventive measures

The DSA contains a large number of preventive measures, in particular transparency obligations, which can at least indirectly counter the dissemination of disinformation. These include transparency reports (e.g. Artt. 15, 24 DSA) and the transparency of recommender systems (Art. 27 DSA). In addition, according to Art. 14 DSA, general terms and conditions must be transparent and contain information about any restrictions on content.

Constructive measures

Pursuant to Art. 34, 35 DSA, VLOPs are required to analyse, assess and mitigate systemic risks, including disinformation (Recital 84). These requirements refer to the basic characteristics of the platforms and thus to the organisation of the online discourse, as VLOPs must consider the design and functioning of their services, including their algorithmic systems, and the use made of them (Art. 34 para. 1 DSA). In a second step, VLOPs must take proportionate and effective risk mitigation measures in accordance with Art. 35 para. 1 lit. a – k DSA, e.g. adapting their terms and conditions as well as the design, features and functioning of their services. A critical point is that the assessment of systemic risks in fact requires large-scale monitoring, which contradicts the prohibition of general monitoring obligations in Art. 8 DSA.

Conclusion

The DSA contains a number of repressive and preventive measures to counteract disinformation. As disinformation is not illegal in general, the effectiveness of the provisions on notice and action mechanisms and on membership-suspension obligations depends very much on the regulatory commitment of the Member States to identify illegal forms of disinformation. However, the following must be considered: repressive measures can not only lead to encroachments on fundamental communication rights, but can also push users who spread unlawful forms of disinformation into “invisible areas” of social media (e.g. private messenger services) and trigger a backfire effect that increases the polarisation of society. Such measures must therefore remain ultima ratio. This is also why the proportionality requirements and procedural rights for users laid down in the DSA must be a priority in the legal supervision of services.

On the other hand, the risk assessment obligations offer a more targeted approach to disinformation, one that could lead to changes in the technical design and decision-making of VLOPs and thus improve the online discourse environment.

A final critical point is that the DSA contains numerous vague formulations, which on the one hand lead to legal uncertainty for service providers and on the other hand confer an overly broad discretion on the supervisory authorities.

Tahireh Panahi is a research associate at the University of Kassel, Germany.

1 UNHRC, Report of the Special Rapporteur on minority issues, 3.3.2021.

3 Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market for Digital Services and amending Directive 2000/31/EC (Digital Services Act), https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32022R2065.

5 Just as in the previous E-Commerce Directive.

6 Services that combine public and private communications in one interface.