Publications
#Tech, Media, Telecom

Harmful Online Choice Architecture

May 29, 2024
Document(s)
Download the report here

The design of online user interfaces plays a pivotal role in shaping consumer choices. Terms such as ‘manipulative’, ‘malicious’, and ‘deceptive’ have been used to describe how design can distort a consumer’s ability to make autonomous and informed choices. While these descriptors are often used as illustrative rather than required criteria, their use underscores the importance of addressing deceptive practices and harmful choice architecture.

EU Legislative Landscape: 

Various legal texts in the EU consumer acquis address the issue of choice architecture, each with a different scope and approach. The Consumer Rights Directive, the Data Act, and the Digital Services Act are among the texts that refer to safeguarding consumer autonomy in the digital sphere. Provisions within the GDPR, the Unfair Commercial Practices Directive (UCPD), and the Digital Markets Act (DMA) also contribute to the overarching framework governing how choices are framed online.

The European Commission is close to publishing its digital fairness fitness check of EU consumer law, which could become a stepping stone for a new “Digital Fairness Act” to be presented early in the next Commission’s mandate. A key issue in this review is the treatment of what are colloquially termed “dark patterns” (also called “harmful user interface design” or “online choice architecture”) and whether they qualify as unfair commercial practices.

The CERRE Report: 

In this new report, CERRE Research Fellows Christoph Busch and Amelia Fletcher delve into the topic of choice architecture, with a specific focus on deceptive patterns and their associated harm.  

The report critically evaluates existing frameworks, identifies emerging challenges, and proposes targeted interventions, so that policymakers and stakeholders can collaboratively navigate the evolving landscape of digital commerce, fostering trust and empowering consumers in the digital age.

Through their analysis of the current regulatory framework and examination of industry best practices, the authors propose actionable measures to address deceptive user interface design.

The ten principles identified by the authors are:

  • Principle 1: Do not restrict regulation to only addressing ‘intentional’ harmful effects.
  • Principle 2: Regulation should be clear about the ‘mechanism of effect on users’, but not be restricted only to ‘deceptive’ online choice architecture.
  • Principle 3: Regulation should be clear about the nature of the harm involved, and who it pertains to.
  • Principle 4: Recognise intrinsic limits to informed and autonomous decision-making.
  • Principle 5: Recognise that context is important for assessing online choice architecture – it can be beneficial, as well as harmful, and can be used positively.
  • Principle 6: Exercise of rights should be easy and not undermined by online choice architecture.
  • Principle 7: Ensure that regulation addresses online choice architecture across multiple user path elements.
  • Principle 8: Consider special rules for automated personalised choice architecture.
  • Principle 9: Behavioural testing should be encouraged, or even required in specific circumstances, and regulators should be able to access test results.
  • Principle 10: Mitigate risks of regulatory overlap or inconsistency.
Author(s)
Christoph Busch
Research Fellow, University of Osnabrück

Christoph Busch is Professor of Law and Director of the European Legal Studies Institute at the University of Osnabrück, Germany. He is a Fellow and Council Member of the European Law Institute (ELI) and an Affiliated Fellow at the Information Society Project at Yale University. His research focuses on consumer law, platform governance and algorithmic regulation.

Amelia Fletcher
Research Fellow, University of East Anglia

Amelia Fletcher CBE is a Professor of Competition Policy at the Centre for Competition Policy, University of East Anglia and co-editor of the Journal of Competition Law and Economics. She also acts as an expert witness.

She has been a Non-Executive Director at the UK Competition and Markets Authority (2016-2023), Financial Conduct Authority (2013-20) and Payment Systems Regulator (2014-20), and a member of Ofgem’s Enforcement Decision Panel (2014-2022). She has also been a member of DG Comp’s Economic Advisory Group on Competition Policy, and was a member of the Digital Competition Expert Panel, commissioned by the UK Treasury and led by Jason Furman, which reported in March 2019.

She was previously Chief Economist at the Office of Fair Trading (2001-2013), where she also spent time leading the OFT’s Mergers and Competition Policy teams. Before joining the OFT, she was an economic consultant at Frontier Economics (1999-2001) and London Economics (1993-1999).

She has written and presented widely on competition and consumer policy. In her ongoing research, Amelia has a particular interest in the implications for competition and consumer policy of behavioural economics and online markets.

Amelia has a DPhil and MPhil in economics from Nuffield College, Oxford.


