
STUDY: Online platforms’ moderation of illegal content

29 June 2020

CERRE’s Academic Director, Alexandre de Streel (Professor at the University of Namur), has co-authored a new study for the European Parliament’s IMCO Committee on online platforms’ moderation of illegal content. The study reviews and assesses the EU regulatory framework on content moderation and the practices of key online platforms.

The authors make recommendations to improve the EU legal framework within the context of the forthcoming Digital Services Act.

Extract

The EU regulatory framework on content moderation is increasingly complex and has been differentiated over the years according to the category of the online platform and the type of content, reflecting a risk-based approach. The e-Commerce Directive of 2000 contains the baseline regime applicable to all categories of platforms and all types of content.

This baseline regulatory regime was complemented in 2018 by the revised Audio-Visual Media Services Directive, which imposes additional obligations on one category of online platforms, Video Sharing Platforms. They should take appropriate and proportionate measures to protect the general public from illegal content (terrorist content, child sexual abuse material, racism and xenophobia or other hate speech), and to protect minors from harmful content. Those measures must be appropriate in light of the nature of the content, the category of persons to be protected and the rights and legitimate interests at stake, and be proportionate taking into account the size of the platforms and the nature of the service provided.

Those rules are strengthened by stricter rules for four types of content for which illegality has been harmonised at the EU level with the Counter-Terrorism Directive, the Child Sexual Abuse and Exploitation Directive, the Counter-Racism Framework Decision, and the Copyright in Digital Single Market Directive.

Those stricter rules imposed by EU hard-law are all complemented by self-regulatory initiatives agreed by the main online platforms, often at the initiative of the European Commission.

In addition to this multi-layered EU regulatory framework, several Member States have adopted national rules on online content moderation, in particular for hate speech and online disinformation. The legal compatibility of those national initiatives with the EU legal framework is not always clear and the multiplication of national laws seriously risks undermining the Digital Single Market.

Policy Recommendations for the Digital Services Act

The revised EU regulatory framework for online content moderation, which will result from the forthcoming Digital Services Act, could be based on the following objectives and principles:

  • sufficient and effective safeguards to protect fundamental rights;
  • a strengthening of the Digital Single Market;
  • a level playing field between offline and online activities;
  • technological neutrality;
  • incentives for all stakeholders to minimise the risk of over- and under-removal of content;
  • proportionality to the potential negative impact of the content and to the size of the platforms; and
  • coherence with existing content-specific EU legislation.

The baseline regulatory regime applicable to all types of content and all categories of platforms could strengthen in an appropriate and proportionate manner the responsibility of the online platforms to ensure a safer Internet. To do that, it could include a set of fully harmonised rules on procedural accountability to allow public oversight of the way in which platforms moderate content.

Those rules could include: (i) common EU principles to improve and harmonise the ‘notice-and-takedown’ procedure to facilitate reporting by users; (ii) the encouragement for the platforms to take, where appropriate, proportionate, specific proactive measures including with automated means; and (iii) the strengthening of the cooperation with public enforcement authorities.

Those new rules could be based on the measures recommended by the European Commission in its 2018 Recommendation on measures to effectively tackle illegal online content, as well as on the measures imposed on Video Sharing Platforms by the revised 2018 Audio-Visual Media Services Directive.

This baseline regulatory regime could be complemented with stricter rules imposing more obligations when the risk of online harm is higher. Stricter rules are already imposed according to the type of content: more obligations are imposed for the moderation of online content with the highest potential negative impact on society, such as terrorist content, child sexual abuse material, racist and xenophobic hate speech and some copyright violations.

Stricter rules could also be imposed according to the size of the platform: more obligations could be imposed on platforms whose number of users is above a certain threshold, which could be designated as Public Space Content-Sharing Platforms (PSCSPs). As is often the case in EU law, enforcement is the weak spot; the forthcoming Digital Services Act should therefore ensure that any online content moderation rule is enforced effectively.

The ‘country of origin’ principle should be maintained; hence, online platforms should in principle be supervised by the authorities of the country where they are established. However, the authorities of the country of establishment may not have sufficient means and incentives to supervise the largest platforms; hence, an EU authority could be set up to supervise the PSCSPs. In addition, enforcement could be improved with, on the one hand, better coordination between national authorities by relying on the Consumer Protection Cooperation Network and, on the other hand, better information disclosure in the context of court proceedings.

Given the massive explosion of online content, public authorities may not be sufficiently well geared to ensure the enforcement of content moderation rules and may need to be complemented with private bodies. Those could be the platforms themselves, self-regulatory bodies or co-regulatory bodies.

Next to specific obligations regarding the moderation of illegal content online, complementary broader measures are also necessary, such as more transparency on the way moderation is done and support for journalists, Civil Society Organisations or NGOs, which contribute to the fight against illegal content.

Document(s)
Study requested by the IMCO committee: Online Platforms' Moderation of Illegal Content Online
Author(s)
Alexandre de Streel
CERRE Academic Co-Director
Professor of EU Law, University of Namur

Alexandre de Streel is Academic Co-Director at CERRE and a professor of European law at the University of Namur and the Research Centre for Information, Law and Society (CRIDS/NADI). He is a Hauser Global Fellow at New York University (NYU) Law School, a visiting professor at the European University Institute, SciencesPo Paris and the Barcelona Graduate School of Economics, and an assessor at the Belgian Competition Authority.

His main areas of research are regulation and competition policy in the digital economy, as well as the legal issues raised by the development of artificial intelligence. Recently, he advised the European Commission and the European Parliament on the regulation of online platforms.

Previously, Alexandre worked for the Belgian Deputy Prime Minister, the Belgian Permanent Representation to the European Union and the European Commission (DG CONNECT). He holds a Ph.D. in Law from the European University Institute and a Master’s Degree in Economics from the University of Louvain.

