The European Commission has published the first annual self-assessments conducted by signatories to its Code of Practice on Disinformation.
The Code of Practice on Disinformation was launched a year ago as part of the EU’s Action Plan against Disinformation, with signatories including Facebook, Google, Mozilla and Twitter; Microsoft adopted the code later, in 2019. Online providers signing up to the code agreed to self-regulate to combat the creation and dissemination of disinformation on their platforms, and to complete regular self-assessments to ensure their adherence to EU standards on disinformation.
In their self-assessments, signatories to the Code of Practice on Disinformation reported improved transparency across their platforms, as well as closer dialogue and engagement between signatories about their policies against disinformation. However, the Commission noted that results were variable, both in the concrete action taken by participants and in the depth of their self-assessment reports.
In a joint statement, Commissioner for Justice, Consumers and Gender Equality Věra Jourová, Commissioner for the Security Union Julian King and Commissioner for the Digital Economy and Society Mariya Gabriel said: “We welcome the publication of these self-assessments by the signatories to the Code of Practice on the implementation of their commitments. In particular, we commend the commitment of the online platforms to become more transparent about their policies and to establish closer cooperation with researchers, fact-checkers and Member States. However, progress varies a lot between signatories and the reports provide little insight on the actual impact of the self-regulatory measures taken over the past year as well as mechanisms for independent scrutiny.
“While the 2019 European Parliament elections in May were clearly not free from disinformation, the actions and the monthly reporting ahead of the elections contributed to limiting the space for interference and improving the integrity of services, to disrupting economic incentives for disinformation, and to ensuring greater transparency of political and issue-based advertising. Still, large-scale automated propaganda and disinformation persist and there is more work to be done under all areas of the code. We cannot accept this as a new normal.
“While the efforts of online platforms and fact-checkers can reduce harmful virality through platforms’ services, there is still an urgent need for online platforms to establish a meaningful cooperation with a wider range of trusted and independent organisations. Access to data provided so far still does not correspond to the needs of independent researchers. Finally, despite the important commitments made by all signatories, we regret that no additional platforms or corporate actors from the advertising sector have subscribed to the code.”
The European Commission is conducting its own comprehensive assessment of how the code’s signatories have implemented and enforced their disinformation policies. The Commission’s full report is due for publication by early 2020.