CERRE’s Academic Director, Alexandre de Streel (Professor at the University of Namur), has co-authored a new study for the European Parliament’s IMCO Committee on online platforms’ moderation of illegal content. The study reviews and assesses the EU regulatory framework on content moderation and the practices of key online platforms.
The authors make recommendations to improve the EU legal framework within the context of the forthcoming Digital Services Act.
Extract
The EU regulatory framework on content moderation is increasingly complex and has been differentiated over the years according to the category of online platform and the type of content, reflecting a risk-based approach. The e-Commerce Directive of 2000 contains the baseline regime applicable to all categories of platforms and all types of content.
This baseline regulatory regime was complemented in 2018 by the revised Audio-Visual Media Services Directive, which imposes additional obligations on one category of online platforms, the Video-Sharing Platforms. These platforms must take appropriate and proportionate measures to protect the general public from illegal content (terrorist content, child sexual abuse material, racism and xenophobia or other hate speech) and to protect minors from harmful content. Those measures must be appropriate in light of the nature of the content, the category of persons to be protected and the rights and legitimate interests at stake, and proportionate taking into account the size of the platform and the nature of the service provided.
Those rules are strengthened by stricter rules for four types of content whose illegality has been harmonised at EU level by the Counter-Terrorism Directive, the Child Sexual Abuse and Exploitation Directive, the Counter-Racism Framework Decision, and the Copyright in the Digital Single Market Directive.
Those stricter rules imposed by EU hard law are all complemented by self-regulatory initiatives agreed by the main online platforms, often at the initiative of the European Commission.
In addition to this multi-layered EU regulatory framework, several Member States have adopted national rules on online content moderation, in particular for hate speech and online disinformation. The legal compatibility of those national initiatives with the EU legal framework is not always clear, and the multiplication of national laws seriously risks undermining the Digital Single Market.
Policy Recommendations for the Digital Services Act
The revised EU regulatory framework for online content moderation, which will result from the forthcoming Digital Services Act, could be based on the following objectives and principles:
- sufficient and effective safeguards to protect fundamental rights;
- a strengthening of the Digital Single Market;
- a level playing field between offline and online activities;
- technological neutrality;
- incentives for all stakeholders to minimise the risk of errors of over- and under-removal of content;
- proportionality to the potential negative impact of the content and to the size of the platform; and
- coherence with existing content-specific EU legislation.
The baseline regulatory regime applicable to all types of content and all categories of platforms could strengthen, in an appropriate and proportionate manner, the responsibility of online platforms to ensure a safer Internet. To do so, it could include a set of fully harmonised rules on procedural accountability to allow public oversight of the way in which platforms moderate content.
Those rules could include: (i) common EU principles to improve and harmonise the ‘notice-and-takedown’ procedure so as to facilitate reporting by users; (ii) encouragement for platforms to take, where appropriate, proportionate and specific proactive measures, including with automated means; and (iii) strengthened cooperation with public enforcement authorities.
Those new rules could be based on the measures recommended by the European Commission in its 2018 Recommendation on measures to effectively tackle illegal content online, as well as on the measures imposed on Video-Sharing Platforms by the revised 2018 Audio-Visual Media Services Directive.
This baseline regulatory regime could be complemented with stricter rules imposing more obligations where the risk of online harm is higher. Stricter rules are already imposed according to the type of content: more obligations apply to the moderation of online content with the highest potential negative impact on society, such as terrorist content, child sexual abuse material, racist and xenophobic hate speech and some copyright violations.
Stricter rules could also be imposed according to the size of the platform: more obligations could be imposed on platforms whose number of users is above a certain threshold, which could be designated as Public Space Content-Sharing Platforms (PSCSPs). As is often the case in EU law, enforcement is the weak spot; the forthcoming Digital Services Act should therefore ensure that any online content moderation rule is enforced effectively.
The ‘country of origin’ principle should be maintained; hence, online platforms should in principle be supervised by the authorities of the country where they are established. However, the authorities of the country of establishment may not have sufficient means and incentives to supervise the largest platforms; hence, an EU authority could be set up to supervise the PSCSPs. In addition, enforcement could be improved with, on the one hand, better coordination between national authorities by relying on the Consumer Protection Cooperation Network and, on the other hand, better information disclosure in the context of court proceedings.
Given the massive explosion of online content, public authorities may not be sufficiently well geared to ensure the enforcement of content moderation rules and may need to be complemented with private bodies. Those could be the platforms themselves, self-regulatory bodies or co-regulatory bodies.
Next to specific obligations regarding the moderation of illegal content online, complementary broader measures are also necessary, such as more transparency on the way moderation is carried out and support for journalists, Civil Society Organisations and NGOs, which contribute to the fight against illegal content.