Disinformation, as is often said, is not a new phenomenon. What is new are the techniques used, the speed, and the reach of its spread, since it is now shared mainly (but not only) on online social platforms. The starting point of the EU response to disinformation is respect for fundamental rights, such as freedom of expression. This is why disinformation as such is not considered illegal, and it is also why the first natural EU policy step was a self-regulatory one.
The Code of Practice on Disinformation was adopted in 2018 by several signatories, including the main online social media platforms and some organisations from the advertising industry. The Code proved to be a useful first tool, but its assessment showed several weaknesses. A Strengthened Code was drafted and adopted as a follow-up in June 2022.
Under the EU Better Regulation Toolbox, self-regulation is “where business or industry sectors formulate codes of conduct or operating constraints on their own initiative for which they are responsible for enforcing. However, pure self-regulation is uncommon, and, at the EU level, it generally involves the Commission in instigating or facilitating the drawing up of the voluntary agreement”. Indeed, as is the case for both versions of the Code, the European Commission played a fundamental role in facilitating their drafting and providing policy guidance.
The drafting process of the Strengthened Code took place at a particular historical moment, which saw, on the one hand, the spread of disinformation concerning a pandemic (COVID-19) and a war (in Ukraine) and, on the other, the latest EU regulatory developments, such as the Digital Services Act, the Digital Markets Act, and the European Media Freedom Act.
Not only does this new Code address the weaknesses of the first edition, but it also envisions enlarging the set of stakeholders subscribing to it. As such, the drafting process involved a large number of players. To coordinate this work, an honest broker was appointed. Considering that, as mentioned, any response to disinformation needs to respect fundamental rights, it is no coincidence that the role of honest broker was given to a professor of constitutional law, Prof. Oreste Pollicino.
To understand the challenges of such an exercise, it is worth looking more closely at the stakeholders involved and the issues at stake.
The first Code was signed (and owned) mainly by very large online platforms and by advertising agencies. The Strengthened Code also includes among its signatories (and owners) independent fact-checking organisations, civil society organisations, other tech companies, and online platforms. Consequently, stakeholders with different interests, missions, and roles in the disinformation debate were sitting at the same table to draft a Code whose main purpose is to serve as a self-regulatory standards tool to tackle disinformation. As disinformation is also shared through other means, such as traditional media, some scholars have argued that media players are the missing actor. There is a point in this argument, but there is also a point in not including them in this kind of tool, which relates to the (non-)liability of the players on whose services disinformation is shared.
Signatories of the Code considered their different roles and missions and subscribed only to those measures that were relevant to them.
The involvement of different players is in line with the broader approach to disinformation, which is by its nature a multistakeholder one. It is also a multidisciplinary one, and the work and approach of scholars and policymakers confirm the importance of such a direction.
By way of example, without neuroscience telling us how the brain reacts differently to an image than to a text, and that people share more irrationally when in fear or anger, we would miss part of the understanding of the disinformation phenomenon. Without digital literacy experts providing sound pedagogical guidelines, building societal resilience would not be possible. Engineers and tech experts are fundamental for understanding and anticipating future developments. Legal and human rights experts explaining the value and legal framework of fundamental rights are key for public and private policy responses. Political and social scientists can provide an analysis of the broader picture (e.g., polarisation). Without economists, it would be difficult to understand the business models of the players involved and the economic incentives behind disinformation. This non-exhaustive list shows why addressing disinformation requires a multifaceted community of experts and stakeholders.
Whatever position one might take on the Code and its contents, there is probably agreement on the uniqueness of this exercise. Some of its distinctive features will be analysed below.
The Code is a self-regulatory tool in EU terms, which, as mentioned above, means involvement of the European Commission. In addition, the Code of Practice might turn into a Code of Conduct under the DSA (Article 35) when the latter enters into force. The involvement of the EU legislator and the prospect of co-regulation could make this self-regulatory tool and its implementation quite a strong policy commitment. It could also serve as an opportunity for signatories affected by the DSA to start getting prepared and equipped. The coming months will show whether this proves to be the case.
Disinformation is a symptom of broader diseases, such as the questioning of democratic principles, societal polarisation and, more specifically, social, political, and economic instability. However, it is clear that online platforms, and particularly the exploitation of their services to cause harm, are part of the disinformation journey. The main social platforms active in the EU are committed to the Code, even though, in some cases, they compete in the same market. As such, the Code is an interesting exercise of the private sector collaborating and converging on principles in light of a public societal interest.
In addition, the debate under the Code is a precious opportunity to dive deep into how platforms function and to understand which tactics, techniques, and practices affect which platform, given each platform's structure and functioning.
The drafting of the Code included players with so many different missions and roles in the disinformation debate that it required very strong negotiation skills as well as a thorough knowledge of the sector. This contributed to the uniqueness of the exercise, as it put in the same room, and with the same pen, the platforms on which disinformation is spread, the advertisers who can gain from it, the fact-checkers who act ex post once disinformation is shared (but also collaborate ex ante with platforms), and the civil society organisations who advocate for human and civil rights and act overall as watchdogs for democracy. The whole process was facilitated by the European Commission, which contributed policy principles, and was chaired by an academic.
The new Code of Practice is the result of all that. The coming months will show how strong the signatories' commitment is and, thanks to structural indicators, what impact the Code has on the disinformation phenomenon. Having so many signatories, who do not all commit to the same measures and who pursue different missions, could entail having too many cooks in the kitchen, thus diluting efforts and responsibilities. However, considering the ultimate goal of the Code, one might also expect higher motivation in the interest of democratic principles. Moreover, the Code can represent a precious opportunity to prepare for the DSA, which is not something to underestimate, considering the impact the latter will have on resources (including human ones) and business organisation.
*Paula Gori, European Digital Media Observatory, European University Institute.