
“Dark Patterns” Targeted by EU Institutions: A New Era of Digital Fairness

Published on November 18, 2024

In the fast-evolving digital landscape, businesses and consumers alike are constantly engaging with online platforms and services. While the internet has enabled unprecedented access to products, information, and social networks, it has also opened the door to manipulation and deceit, often in the form of "dark patterns." These are design practices on websites and apps that trick users into making decisions they might not make freely, from sharing excessive personal data to purchasing products or services they don’t need, or paying a higher price than intended.

For years, the European Union has been a global frontrunner in advocating for digital rights and consumer protections, with regulatory frameworks such as the GDPR (General Data Protection Regulation), the Unfair Commercial Practices Directive (UCPD), and the EU Consumer Rights Directive (CRD). These regulations apply to dark patterns in certain circumstances.

However, the recent introduction of the Digital Markets Act (DMA), Digital Services Act (DSA), and the AI Act marks a critical turning point. For the first time, these regulations expressly prohibit dark patterns in the EU.

Today, several major tech companies, including X (formerly Twitter), Meta, Shein, Temu, TikTok, and AliExpress, are facing legal action under the DSA, with accusations that include the use of dark patterns. This article delves into the EU’s stance on dark patterns, the significance of new regulations, and what these developments mean for businesses and users alike.

What Is the EU’s Stance on Dark Patterns?

The EU’s stance on dark patterns is clear: deceptive and manipulative design practices have no place in the digital economy. The EU has long considered dark patterns illicit under existing laws such as GDPR and consumer protection laws, but these new legislative frameworks—DMA, DSA, and the AI Act—explicitly prohibit dark patterns, underscoring the EU’s commitment to fair, transparent digital interactions.

Dark patterns, as defined by Dr. Harry Brignull who coined the term and authored Deceptive Patterns, exploit users’ cognitive biases and often involve manipulative or deceptive tactics to gain user consent for data sharing, prompt purchases, or other interactions that serve the interests of the platform over the interests of the user. Recognizing the prevalence and harmful impact of these patterns, the EU has expanded its legislative toolkit to tackle them directly.

Dark Patterns Have Been Illicit for a Long Time Under EU Law

Though recent EU legislation has sharpened the focus on dark patterns, the EU has long viewed these practices as illicit, especially under GDPR and consumer protection laws. GDPR, implemented in 2018, was a landmark regulation that set out clear standards for data privacy and user consent. Key GDPR provisions like consent requirements, the right to be informed, and the right to withdraw consent directly address some of the more harmful aspects of dark patterns. Under GDPR:

  • Informed Consent: Businesses must ensure that consent is freely given, specific, informed, and unambiguous, without any form of manipulation.
  • Right to Withdraw Consent: Users should be able to withdraw consent as easily as they give it, without being subjected to unnecessary hurdles, a key tactic in dark patterns known as the “roach motel.”
  • Fairness: The overarching principle of the GDPR is that any processing of personal data must be done fairly. Thus, any dark pattern that pushes users to share more personal data than they intended is a clear breach of the GDPR.
  • Transparency: GDPR mandates transparency about data collection and processing practices, which conflicts with the deceptive nature of many dark patterns.

Typically, Articles 4, 5, 7, 12, and 25 of the GDPR apply to dark patterns in the context of data privacy.
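
To make these consent requirements concrete, here is a minimal sketch, in Python, of how a product team might encode them as automated pre-release checks. The names (`ConsentRequest`, `is_valid_consent`) and the simplified rules are illustrative assumptions for this article, not legal advice or an official GDPR compliance tool.

```python
from dataclasses import dataclass

@dataclass
class ConsentRequest:
    """Hypothetical model of one consent prompt shown to a user."""
    purpose: str                # what the data will be used for
    pre_ticked: bool            # was the checkbox checked by default?
    bundled_with_service: bool  # is consent a condition of using an unrelated service?
    grant_steps: int            # clicks needed to give consent
    withdraw_steps: int         # clicks needed to withdraw consent

def is_valid_consent(req: ConsentRequest) -> list[str]:
    """Return a list of objections; an empty list means no obvious issue."""
    problems = []
    if req.pre_ticked:
        # Consent requires a clear affirmative act (Art. 4(11), Art. 7 GDPR).
        problems.append("No pre-ticked boxes: consent must be an affirmative act.")
    if req.bundled_with_service:
        # Consent tied to unrelated service access is not "freely given" (Art. 7(4)).
        problems.append("Consent bundled with service access is not freely given.")
    if req.withdraw_steps > req.grant_steps:
        # Withdrawing consent must be as easy as giving it (Art. 7(3)) --
        # the asymmetry is the "roach motel" pattern described above.
        problems.append("Withdrawing consent must be as easy as giving it.")
    return problems
```

A pre-ticked box combined with a five-step withdrawal flow, for instance, would be flagged twice, while a symmetric opt-in/opt-out flow passes with no objections.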

In addition, the EU’s Consumer Rights Directive, alongside other consumer laws, prohibits unfair commercial practices and deceptive advertising, which encompass dark patterns that coerce users into purchases or obscure the terms of service. In particular, the Unfair Commercial Practices Directive prohibits “misleading” and “aggressive” commercial practices and contains an Annex that blacklists no fewer than 31 practices, at least half of which apply to dark patterns.

The Digital Markets Act (DMA) and Digital Services Act (DSA) – A New Level of Accountability

The DMA and DSA, both adopted in 2022, represent a pivotal shift in the EU’s digital strategy. These acts specifically target large online platforms and gatekeepers to create a more competitive and user-centric digital market. Importantly, they explicitly prohibit dark patterns, making the EU one of the first regions to directly ban these manipulative tactics by name.

The Digital Markets Act (DMA)

The DMA, effective from November 2022, focuses on fair competition and limits the power of digital gatekeepers, which are large online platforms that can control access to markets and exert significant influence over users. While the DMA primarily addresses anti-competitive practices, it also tackles dark patterns that gatekeepers may employ to maintain their dominance. For instance:

  • No Tricking Users Into Using Proprietary Services: Gatekeepers cannot use dark patterns to make it difficult for users to switch to competing services or to lock them into their ecosystem.
  • No Circumvention: More generally, gatekeepers are prohibited from using dark patterns (“behavioral techniques or interface design”) to circumvent any of their obligations under the DMA.
  • Increased Transparency: Platforms must be transparent about their data collection practices and provide clear, accessible information to users.

By directly addressing these concerns, the DMA protects users from manipulation and enhances consumer choice, allowing for a healthier, competitive digital market.

The Digital Services Act (DSA)

The DSA, which took effect in 2023, is a comprehensive framework that aims to regulate online content, enhance user safety, and safeguard fundamental rights. While the DSA covers a wide range of online issues, one of its central goals is to curb the use of dark patterns by obligating online platforms to design fair, transparent interfaces.

Under the DSA:

  • Ban on Deceptive Design: For the first time, dark patterns are explicitly prohibited by law. Article 25 of the DSA expressly prohibits designing, organizing, or operating online interfaces in a way that deceives or manipulates users, or that otherwise materially distorts or impairs their ability to make free and informed decisions. Platforms must avoid deceptive tactics in their user interfaces, especially when obtaining consent and engaging in commercial practices.
  • Transparency in Terms of Service: Platforms must provide clear, understandable terms of service, helping users make informed choices without being manipulated.
  • Enhanced User Control: Platforms must give users more control over content algorithms and the ability to customize settings without facing dark patterns.

The DSA’s proactive stance on dark patterns marks a significant advancement in digital user protection, as it offers a clear mandate against coercive design practices. Major platforms failing to comply with these new standards face substantial fines, potentially reaching up to 6% of global revenue.

The AI Act and Its Prohibition of Dark Patterns

The EU’s AI Act further underscores the region’s commitment to digital transparency and fairness. The AI Act regulates artificial intelligence applications across different industries, with a strong focus on protecting user rights and preventing abuses of AI technology. As part of its provisions, the AI Act specifically prohibits dark patterns within AI-driven interfaces, where user data might be used for AI training or personalized content generation.

AI systems are prohibited from using techniques that:

  1. Act below a person’s level of conscious awareness (e.g., subliminal methods) or are intentionally manipulative or deceptive.
  2. Have the purpose or effect of significantly distorting a person's ability to make a well-informed decision.
  3. Result in the person taking an action or making a choice that they otherwise wouldn’t have made.

For the prohibition to apply, the AI system's influence must be serious enough that it causes or is likely to cause significant harm to the person making the decision, to another individual, or to a group of people.

In simpler terms, the AI Act restricts any AI system that operates subtly or deceptively in ways that impair people's judgment and lead them to take actions they wouldn’t normally choose—especially if this could lead to serious harm.

Why a Ban on Dark Patterns in AI Is Critical

AI-driven interfaces often leverage vast amounts of personal data and can be highly personalized, creating new opportunities for dark patterns to thrive. For example, a platform powered by AI could personalize prompts or interface elements to nudge users into making certain choices, such as signing up for premium services or providing more data.

With the AI Act:

  • Prohibition on Manipulative Design in AI Applications: AI applications, particularly those with high user engagement, are restricted from employing dark patterns that could subtly manipulate users or undermine their agency.
  • Focus on Ethical AI Development: The act prioritizes transparency, requiring that users are informed when they are interacting with AI and understand how their data is used.

The AI Act’s prohibition on dark patterns in AI applications emphasizes the EU’s dedication to maintaining ethical standards in this rapidly advancing technology field.

Legal Actions Targeting Major Platforms for Dark Patterns Under the DSA

The EU has already begun taking enforcement action under the DSA, opening formal proceedings against several high-profile digital platforms. Major players like X (formerly Twitter), Meta, Shein, Temu, TikTok, and AliExpress are under scrutiny for allegedly breaching DSA requirements, including accusations of using dark patterns. Let’s take a closer look at these cases:

X (formerly Twitter)

On the 12th of July 2024, the European Commission notified X of its preliminary finding that the company is in violation of the Digital Services Act (DSA) in several critical areas, specifically related to dark patterns, advertising transparency, and data access for researchers.

Key Points of Non-Compliance according to the Commission:

  1. Misleading Verified Accounts:
    • X's interface for "verified accounts" with the "Blue checkmark" misleads users.
      • These blue check marks are supposed to certify trustworthy sources of information. But “anyone can subscribe to obtain such a ‘verified’ status,” which undermines users' ability to trust the authenticity of accounts and content.
      • The Commission argues that malicious actors exploit this feature.
  2. Lack of Advertising Transparency:
    • X fails to maintain a searchable and reliable advertisement repository. Design features and access barriers prevent the repository from serving its transparency purpose, hindering supervision and research into online advertising risks.
  3. Restricted Data Access for Researchers:
    • X prohibits eligible researchers from independently accessing public data, such as through scraping, and imposes disproportionate fees for API access. This discourages researchers from conducting essential studies.

If the Commission's findings are confirmed, a non-compliance decision could be issued, citing breaches of Articles 25, 39, and 40(12) of the DSA. X could face fines up to 6% of its total worldwide annual turnover.

Meta

Meta, which owns Facebook, Instagram, and WhatsApp, has a long history of employing complex interfaces that often encourage users to overshare information. On the 16th of May 2024, the European Commission launched formal proceedings to investigate whether Meta has violated the Digital Services Act (DSA) concerning the protection of minors on Facebook and Instagram.

Key concerns include:

  1. Behavioral Addictions and Rabbit-Hole Effects: The potential for Meta's systems and algorithms to foster addictive behaviors in children and create "rabbit-hole" effects.
  2. Age-Verification Tools: The adequacy and effectiveness of Meta's age-assurance and verification methods to prevent minors from accessing inappropriate content.
  3. Privacy, Safety, and Security Measures: Whether Meta has implemented proper measures to ensure the privacy, safety, and security of minors, especially regarding default privacy settings and recommender systems.

The investigation will assess Meta's compliance with Articles 28, 34, and 35 of the DSA. The Commission will gather further evidence and may take enforcement actions.

Shein and Temu

Fast-fashion giants Shein and Temu have been accused of using dark patterns to influence users' purchase decisions and data sharing. Following a complaint submitted to the Commission by consumer organisations in May 2024, the European Commission issued formal requests for information to online marketplaces Temu and Shein under the Digital Services Act (DSA) on the 28th of June 2024. The Commission seeks detailed information on their compliance with DSA obligations, particularly regarding:

  • the "Notice and Action mechanism" (which allows users to report illegal products)
  • the design of online interfaces (ensuring they do not deceive or manipulate users through "dark patterns")
  • the protection of minors,
  • the transparency of recommender systems,
  • trader traceability,
  • and overall compliance by design.

TikTok

The popular social media app TikTok has faced allegations of using dark patterns to retain user engagement, such as implementing design choices that encourage continuous scrolling and data sharing. With its young user base, TikTok is under pressure to comply with DSA standards and provide a fair, transparent experience that respects user autonomy.

On April 22, 2024, the European Commission initiated formal proceedings against TikTok due to the platform's failure to provide a risk assessment report for the launch of TikTok Lite. As a result, the Commission warned TikTok about the potential suspension of its TikTok Lite Rewards programme in the European Union. Two days later, on April 24, TikTok voluntarily suspended the Rewards programme across the EU.

On the 5th of August 2024, the Commission made TikTok's commitments to permanently withdraw the TikTok Lite Rewards programme from the EU legally binding.

The commitments include:

  1. A commitment to permanently withdraw the TikTok Lite Rewards programme from the EU.
  2. A commitment not to launch any other programme that would circumvent this withdrawal.

With that decision, these commitments are now legally binding. Any breach would constitute a violation of the DSA and could result in fines.

Consequently, the Commission has closed the formal proceedings against TikTok.

AliExpress

AliExpress, the online retail platform, is also facing scrutiny for allegedly employing dark patterns in its checkout process, with hidden fees, complex return policies, and opaque data-sharing practices. If found to be in violation of DSA requirements, AliExpress could face significant fines and be required to redesign its user interface.

AliExpress is among the Very Large Online Platforms being investigated by the Commission under the DSA, and the first online marketplace to face a formal investigation. The Commission suspects the marketplace of breaching DSA rules in areas linked to the management and mitigation of risks; content moderation and its internal complaint-handling mechanism; the transparency of advertising and recommender systems; the traceability of traders; and data access for researchers.

The Commission will also look into transparency and safety concerns related to influencers’ use of AliExpress. The platform offers an affiliate program aimed at social media influencers who can earn a commission through links to goods being sold on the platform. The Commission said it suspects some of this activity is leading to the sale of non-compliant — and potentially dangerous or otherwise risky — products.

What These Changes Mean for Businesses Operating in the EU

For businesses operating in the EU, the prohibitions on dark patterns under the DMA, DSA, and AI Act mean that compliance is no longer optional. Fines under the DSA alone can reach up to 6% of a group's global turnover, and up to 7% under the AI Act. The EU’s strong stance on these issues underscores the need for companies to adopt ethical, transparent design practices that prioritize user welfare. Here are some key takeaways for businesses:

  1. Audit and Revamp User Interfaces: Companies should audit their interfaces to identify potential dark patterns and redesign them for transparency and user-centered navigation.
  2. Prioritize Clear Consent Mechanisms: Consent mechanisms must be straightforward and unambiguous, especially regarding data collection and tracking.
  3. Enhance User Control: Businesses should provide users with greater control over personalization settings, cookie preferences, and data-sharing options.
  4. Prepare for Rigorous Enforcement: Given the EU’s proactive stance, companies should be ready to address regulatory inquiries and demonstrate compliance with DSA and DMA requirements.
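
As a starting point for the interface audit in point 1, a team could run simple heuristics over its UI copy to surface candidates for human review. The pattern names below follow common dark-pattern taxonomies; the regular expressions and the `audit_copy` helper are illustrative assumptions for this article, far too crude to stand in for real legal or design review.

```python
import re

# Illustrative heuristics only: real audits require human and legal review.
DARK_PATTERN_HEURISTICS = {
    "confirmshaming": re.compile(r"no thanks, i (like|prefer|enjoy)", re.I),
    "false urgency": re.compile(r"(only \d+ left|offer ends in)", re.I),
    "hidden costs": re.compile(r"fees? (added|applied) at checkout", re.I),
}

def audit_copy(ui_strings):
    """Return {pattern_name: [matching strings]} for every heuristic that fires."""
    findings = {}
    for name, pattern in DARK_PATTERN_HEURISTICS.items():
        hits = [s for s in ui_strings if pattern.search(s)]
        if hits:
            findings[name] = hits
    return findings
```

For example, decline-button copy like “No thanks, I like paying full price” would be flagged as confirmshaming, and “Only 3 left in stock!” as false urgency; anything flagged would then go to a designer and counsel for an actual assessment.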

Final Thoughts: EU’s Strong Stance on Dark Patterns Ushers in a New Digital Standard

The EU’s decisive action against dark patterns sets a powerful precedent for digital fairness and transparency worldwide. With the DMA, DSA, and AI Act explicitly prohibiting these manipulative design practices, online platforms and businesses are now accountable for creating ethical user experiences that prioritize transparency and respect user autonomy.

For consumers, these regulations promise a safer, more transparent digital environment where they can engage freely without fear of manipulation. For businesses, this shift underscores the importance of designing user interfaces that foster trust, build customer loyalty, and adhere to ethical standards.

As we move into a new era of digital accountability, companies will benefit from aligning with these regulations, as fair design is not just a legal requirement but also a core tenet of good business. With the EU leading the charge, a digital ecosystem that respects user choice and promotes fairness is closer than ever.

"Lorem ipsum dolor sit amet, consectetur adipiscing elit. Suspendisse varius enim in eros elementum tristique. Duis cursus, mi quis viverra ornare, eros dolor interdum nulla, ut commodo diam libero vitae erat."

Name Surname

Position, Company name