#Tech, Media, Telecom

STUDY: Online platforms’ moderation of illegal content

29 June 2020

CERRE’s Academic Director, Alexandre de Streel (Professor at the University of Namur), has co-authored a new study for the European Parliament’s IMCO Committee on online platforms’ moderation of illegal content. The study reviews and assesses the EU regulatory framework on content moderation and the practices of key online platforms.

The authors make recommendations to improve the EU legal framework within the context of the forthcoming Digital Services Act.

Extract

The EU regulatory framework on content moderation is increasingly complex and has been differentiated over the years according to the category of online platform and the type of content, reflecting a risk-based approach. The e-Commerce Directive of 2000 contains the baseline regime applicable to all categories of platforms and all types of content.

This baseline regulatory regime was complemented in 2018 by the revised Audio-Visual Media Services Directive, which imposes additional obligations on one category of online platforms: video-sharing platforms. They should take appropriate and proportionate measures to protect the general public from illegal content (terrorist content, child sexual abuse material, racism and xenophobia or other hate speech) and to protect minors from harmful content. Those measures must be appropriate in the light of the nature of the content, the category of persons to be protected and the rights and legitimate interests at stake, and proportionate, taking into account the size of the platform and the nature of the service provided.

Those rules are strengthened by stricter rules for four types of content whose illegality has been harmonised at EU level by the Counter-Terrorism Directive, the Child Sexual Abuse and Exploitation Directive, the Counter-Racism Framework Decision, and the Copyright in the Digital Single Market Directive.

Those stricter rules imposed by EU hard law are all complemented by self-regulatory initiatives agreed by the main online platforms, often at the initiative of the European Commission.

In addition to this multi-layered EU regulatory framework, several Member States have adopted national rules on online content moderation, in particular for hate speech and online disinformation. The legal compatibility of those national initiatives with the EU legal framework is not always clear and the multiplication of national laws seriously risks undermining the Digital Single Market.

Policy Recommendations for the Digital Services Act

The revised EU regulatory framework for online content moderation, which will result from the forthcoming Digital Services Act, could be based on the following objectives and principles:

  • sufficient and effective safeguards to protect fundamental rights;
  • a strengthening of the Digital Single Market;
  • a level playing field between offline and online activities;
  • technological neutrality;
  • incentives for all stakeholders to minimise the risk of errors of over- and under-removal of content;
  • proportionality to the potential negative impact of the content and to the size of the platform; and
  • coherence with existing content-specific EU legislation.

The baseline regulatory regime applicable to all types of content and all categories of platforms could strengthen, in an appropriate and proportionate manner, the responsibility of online platforms to ensure a safer Internet. To do so, it could include a set of fully harmonised rules on procedural accountability to allow public oversight of the way in which platforms moderate content.

Those rules could include: (i) common EU principles to improve and harmonise the ‘notice-and-takedown’ procedure and facilitate reporting by users; (ii) encouragement for platforms to take, where appropriate, proportionate and specific proactive measures, including with automated means; and (iii) strengthened cooperation with public enforcement authorities.

Those new rules could be based on the measures recommended by the European Commission in its 2018 Recommendation on measures to effectively tackle illegal online content, as well as on the measures imposed on video-sharing platforms by the revised 2018 Audio-Visual Media Services Directive.

This baseline regulatory regime could be complemented with stricter rules imposing more obligations when the risk of online harm is higher. Stricter rules are already imposed according to the type of content: more obligations apply to the moderation of online content with the highest potential negative impact on society, such as terrorist content, child sexual abuse material, racist and xenophobic hate speech and some copyright violations.

Stricter rules could also be imposed according to the size of the platform: more obligations could be imposed on platforms whose number of users is above a certain threshold, which could be designated as Public Space Content-Sharing Platforms (PSCSPs).

As is often the case in EU law, enforcement is the weak spot; therefore, the forthcoming Digital Services Act should ensure that any online content moderation rule is enforced effectively.

The ‘country of origin’ principle should be maintained; hence, online platforms should in principle be supervised by the authorities of the country where they are established. However, the authorities of the country of establishment may not have sufficient means and incentives to supervise the largest platforms; hence, an EU authority could be set up to supervise the PSCSPs. In addition, enforcement could be improved with, on the one hand, better coordination between national authorities by relying on the Consumer Protection Cooperation Network and, on the other hand, better information disclosure in the context of court proceedings.

Given the massive explosion of online content, public authorities may not be sufficiently well equipped to ensure the enforcement of content moderation rules and may need to be complemented by private bodies. Those could be the platforms themselves, self-regulatory bodies or co-regulatory bodies.

Next to specific obligations regarding the moderation of illegal content online, complementary broader measures are also necessary, such as more transparency on the way moderation is done and support for journalists, civil society organisations and NGOs that contribute to the fight against illegal content.

Document(s)
Study requested by the IMCO committee: Online Platforms' Moderation of Illegal Content Online
Author(s)
Alexandre de Streel
Academic Director, CERRE and University of Namur

Alexandre de Streel is the Academic Director of the digital research programme at the Brussels think tank Centre on Regulation in Europe (CERRE), professor of European law at the University of Namur and visiting professor at the College of Europe (Bruges) and Sciences Po Paris. He sits on the scientific committees of the Knight-Georgetown Institute (US), the European University Institute’s Centre for a Digital Society (Italy) and the Mannheim Centre for Competition and Innovation (Germany).

His main research areas are regulation and competition policy in the digital economy (telecommunications, platforms and data), as well as the legal issues raised by the development of artificial intelligence. He regularly advises the European Union and international organisations on digital regulation.

Previously, Alexandre held visiting positions at New York University Law School, the European University Institute in Florence, Panthéon-Assas (Singapore campus), Barcelona Graduate School of Economics and the University of Louvain. He also worked for the Belgian Deputy Prime Minister, the Belgian Permanent Representation to the European Union, and the European Commission. He has also been the chair of the expert group on the online platform economy, advising the European Commission.


