What is a “dark pattern” or deceptive design?
This term was coined in 2010 by Harry Brignull, a UX designer with a PhD in cognitive science, to describe “tricks used in websites and apps that make you do things that you didn’t mean to, like buying or signing up for something”. Importantly, some researchers, including Brignull and Dr. Jennifer King, now propose using the term “deceptive patterns” to avoid the negative connotation attached to the word “dark”. We fully endorse this precaution. To the extent that legislators have been using the term “dark pattern”, however, we will keep using it where legal precision requires.
Since 2010, there has been extensive academic research defining these manipulative and deceptive interfaces, producing no fewer than 16 different taxonomies, identifying the applicable legal grounds, assessing their prevalence, and measuring the harms they cause both to individuals and to competition and the market economy.
Regulators around the world have also been very active in recent years, producing reports and enforcement guidelines. In 2022 alone, the European Commission published an extensive behavioral study on dark patterns, showing that 97% of the most popular e-commerce sites in the EU contain at least one dark pattern. The FTC published a Staff Report, Bringing Dark Patterns to Light, clearly signaling strong enforcement to the market, while the OECD compiled over a decade of research into a synthesis report on dark commercial patterns, highlighting the structural risks dark patterns pose to competition and the digital market economy itself.
What are deceptive patterns in practice? If you’ve been on social networks or bought something online today, we bet you’ve already come across several dark patterns. The big fat green “Accept All” button on a cookie banner that you clicked without even thinking; the “free trial” that turned out to be impossible to cancel and very much paid-for; the prompt to activate geolocation “to find your friends”; the countdown or low-stock message; the 5 clicks required just to attempt adjusting your privacy settings, only to land on a wall of jargon… which made you give up.
Dark Patterns are a digital plague.
As explained by Arunesh Mathur et al., deceptive design “modifies the choice architecture presented to users”. It does so either by modifying the set of choices available to users, making some options (usually the most privacy-protective or the cheapest) more burdensome or less appealing than others, or by manipulating the information flow to users, making protective information harder to find or to understand, or even providing false information.
How do dark patterns work?
Basically, our brain is a real “decision machine”, from the simplest decisions to the most sophisticated. Daniel Kahneman, Nobel laureate in economics, identified 2 main “systems” through which our brain makes decisions: “System 1”, which is fast and intuitive and requires little cognitive effort, and “System 2”, which is slower and more analytical but requires high cognitive effort. System 1 relies on heuristics, meaning the brain settles for an acceptable solution rather than the best one, and on “cognitive biases”, which act as mental shortcuts for decision-making. In the examples above, the deceptive design led users to make decisions with System 1.

Cognitive biases are systematic, irrational patterns that affect our judgment and the way we make decisions. They can lead to more effective actions or faster decisions in some contexts, but they also reduce the amount of information considered and simplify the processes of judgment and decision-making.
Overall, around 180 biases have been described to date, categorized into 4 main types:
- Information overload: biases that arise from too much information;
- Lack of meaning: the brain needs to make sense of what it perceives;
- Need to act fast: biases that help us save time and resources; and
- Figuring out what needs to be remembered for later: the biases relating to memory limits.
The trouble is that because cognitive biases follow predictable patterns of decision-making, they can be manipulated. That is precisely what deceptive designs do: they exploit some of our cognitive biases to make us act in a specific, predictable way, sometimes against our own interests.
How does this concern you?
First, you should know that dark patterns are everywhere. Significant research efforts have been made to measure their prevalence, in privacy, in consumer matters, and beyond. Among the many resources, here are some of the most relevant:
- In 2018, the Norwegian Consumer Protection Council analyzed the account creation and management settings of Facebook, Google, and Windows 10 and identified dozens of “privacy intrusive” interfaces that lead users to share more personal data than they would have consciously shared otherwise.
- In 2022, a European Commission study on dark patterns found that 97% of 75 popular e-commerce websites and apps in the EU contained at least one dark pattern.
- The Chilean consumer protection authority found that 64% of 103 Chilean e-commerce websites that were examined featured at least one dark pattern.
- Di Geronimo et al. found that 95% of a sample of 240 popular apps contained at least one dark pattern.
- Gunawan et al. found that all 105 of the most popular online services in the Google Play Store that featured both an app and website format contained at least one dark pattern.
- Even more astounding, Radesky et al. found that 80% of popular children’s apps contained at least one manipulative design feature.
Second, we know that deceptive design generates two main types of harm: individual harms and structural harms.
Individual harms: Unsurprisingly, there is evidence showing that dark patterns harm an individual’s autonomy by tricking users into deciding against their preferences, denying their choice, or making their choice more difficult. In addition, researchers and regulators have established that dark patterns:
- Harm your welfare by making you lose money: they make you buy more than intended or pay more for what you wanted to buy. “Drip pricing”, for instance, has been shown to make users spend 21% more than they otherwise would;
- Significantly harm your privacy: they lead you to share more personal data than intended. For example, a survey of Australian consumers showed that 1 in 4 shared more personal data specifically because of dark patterns; and
- Create psychological detriment and time loss: they generate frustration and shame and increase our cognitive burden. The European Commission even found that they increase heart rate, anxiety, and alertness.
Structural harms: Did you know that almost 50% of users exposed to scarcity and social proof dark patterns on hotel booking sites distrust these sites? Researchers showed that these deceptive patterns decrease consumer trust in the brands using them.
Dark patterns have also been found to alter competition:
- By preventing consumers from comparing prices, locking consumers into recurring subscriptions, and hampering switching;
- By enabling firms to increase their sales or the volume of personal data they collect without offering better services or products in return, giving them an undue competitive advantage; and
- By allowing dominant firms to neutralize a competitive threat or extend their dominance into a neighboring market.
By impeding consumers’ ability to select the best companies on the merits of their offerings, or to switch to a competitor because they are trapped in subscriptions they cannot cancel (e.g., Amazon Prime, against which the Federal Trade Commission recently filed a lawsuit), dark patterns not only hurt consumers but also distort the competitive process. And if they can alter our commercial and social behavior, we must also wonder what effects dark patterns could have on our democratic systems.
Quite logically, significant fines and enforcement actions have already been taken, and more are coming. Several existing laws already apply to dark patterns: consumer laws prohibiting unfair commercial practices, such as Section 5 of the Federal Trade Commission Act or the Unfair Commercial Practices Directive; data protection regulations like the GDPR, which impose principles of fairness, transparency, data minimization, and informed consent; and competition law prohibiting abuse of dominance.
Because dark patterns are everywhere and cause severe harms, regulators around the globe are also adopting laws that expressly prohibit them:
- For all businesses: the California Consumer Privacy Act (CCPA) was amended in 2021 to ban dark patterns that prevent you from changing your mind and refusing the sale of your personal data, or that make it more difficult to do so.
- Online platforms: the Digital Services Act (DSA) expressly forbids designing or operating platforms in a way that prevents users from making informed and free choices. Did you know that violations of the DSA can be fined up to 6% of the global turnover of the group concerned?
- Gatekeepers: The Digital Markets Act prohibits gatekeepers from using dark patterns to circumvent their obligations.
What penalties do dark patterns incur? For example, Epic Games recently settled FTC complaints for a total of $520 million, in part over dark patterns that charged players for unwanted in-game purchases. The FTC’s signal to the market is clear: get rid of any dark patterns, quickly. In the EU, the risks are fines of up to 4% of worldwide turnover under the GDPR, 6% under the DSA (which applies only where the GDPR or the Unfair Commercial Practices Directive does not), and 10% under competition law (abuse of dominance, when applicable).
At the national level, the Italian Data Protection Authority very recently fined an Italian company €300,000 for its use of dark patterns. This sanction is equivalent to 2% of the company’s turnover, one of the highest percentages the authority has applied when calculating sanctions so far.
Are there any alternatives to dark patterns?
Fair Patterns has been pioneering ethical design and fighting against blind signing since 2018, creating for example the first inclusive and sustainable privacy notice, recognized as Most Innovative Privacy Project by the IAPP in 2022. We created our R&D program specifically to advance research against deceptive design and to find remediation measures. We analyzed the 16 existing taxonomies and identified 4 main waves: the first to describe and define the problem, the second to assess its prevalence, the third to identify the applicable legal grounds, and the fourth to assess the types and severity of harms.
This is a truly impressive body of academic work, but we noticed that all these taxonomies are problem-oriented. To fill this gap, we proposed a solution-oriented taxonomy in which each dark pattern has its countermeasure as a fair pattern: an interface that empowers users to make their own informed and free choices. Our taxonomy also aims to be:
- Easy to understand: our categories are easily memorized and used by designers, developers, marketers, lawyers, regulators…
- Easy to act upon: we created fair patterns to combat each dark pattern.
- Future-proof: it covers dark patterns yet to be invented, because fair patterns are rooted in the cognitive biases that dark patterns exploit, one way or another.
We’re happy to share that one of our scientific articles was selected by the Annual Privacy Forum in June 2023: From Dark to Fair Patterns: a usable taxonomy to contribute solving the issue with counter measures.
At this point, you might be wondering about the benefits of using fair patterns. Research shows that a third of users exposed to dark patterns such as scarcity and social-proof claims on hotel booking sites expressed contempt and disgust, and nearly 50% distrusted the sites as a result. Clearly, a business strategy based on deceptive design is doomed to fail, simply because users will lose trust in these businesses and go elsewhere, not to mention the risks of fines and reputational damage. Researchers have also shown that any short-term gains an online business derives from dark patterns are likely to be lost in the long term.
Transparency and fairness by design, on the other hand, have been shown to foster trust and long-term customer loyalty. Let’s end mass deception and manipulation online. Transparency and fairness are the new black!