Dark patterns deceive or manipulate users, making them act without realizing it, or against their own interests.

We are an award-winning legal innovation studio, fighting against dark patterns since 2018. We believe each individual should be able to make their own, informed and free choices online.

So we created Fairpatterns.

WHAT ARE DARK PATTERNS?
The term was coined in 2010 by Harry Brignull, a UX designer with a PhD in cognitive science, to describe “tricks used in websites and apps that make you do things that you didn’t mean to”.

Does this look familiar?
Here’s everything you need to know about dark patterns

We believe
the only person who should be allowed
to make choices for you
is you.

If you believe
you’re not one
to fall into these traps,
think again

Dark patterns and cognitive biases:
What you need to know

Our brain is a veritable decision machine, handling everything from the simplest to the most sophisticated choices. Daniel Kahneman, Nobel laureate in economics, identified two main systems through which our brain makes decisions: “System 1”, fast, intuitive and requiring little cognitive effort, and “System 2”, slower and analytical, but demanding high cognitive effort. System 1 relies on heuristics and cognitive biases, which act as mental shortcuts for making decisions.

Cognitive biases are implicit, systematic, irrational patterns that affect our judgment and the way we make decisions. In some contexts, they lead to faster decisions or more effective actions: they reduce the amount of information to be considered and simplify the processes of judgment and decision-making.

Overall, 180 biases have been described to date, categorized into four main types:

  • Information overload: biases that arise from too much information
  • Lack of meaning: biases that arise out of the brain’s need to make sense
  • Need to act fast: biases that arise to gain time and resources
  • Figuring out what needs to be remembered for later: biases that arise in relation to memory limits.

The trouble is that, because they are predictable patterns of decision-making, cognitive biases can be exploited to benefit businesses at the expense of users. That is precisely the case for dark patterns: they exploit some of our cognitive biases to make us act in a certain, predictable way – sometimes against our own interest.

Sounds like a conspiracy theory? Unfortunately, there is solid scientific evidence that dark patterns exploit our biases.

Here are all the reasons
why you should care

Don’t know where to start?
Take a look at our resources!

Our new podcast about Dark Patterns

Fighting dark patterns - regain your free will online

We want to empower all users and stakeholders to fight against dark patterns. Raising awareness is key, which is why we’re creating a dedicated podcast!

The first episode was released on April 12, with our experts and guest stars, covering the following topics:

What are dark patterns?
How to spot a dark pattern?
Why are dark patterns so effective? (How our brain works: Systems 1 and 2, cognitive biases)
What can you do to avoid dark patterns?

Happy listening!

A few words about us

We are an award-winning legal innovation studio, fighting against dark patterns since 2018.

Our unique portfolio of over 80 legal design projects around the world, together with in-depth research in our R&D Lab, has enabled us to create “fair patterns”: interfaces that empower users to make informed and free choices.

Frequently asked questions

In the US, the California Consumer Privacy Act (CCPA) was amended in 2021 to ban dark patterns that “subvert or impair a consumer’s choice to opt out of schemes where their personal data is sold”. In addition, the FTC released a staff report in 2022 clearly identifying four types of dark patterns: design elements that induce false beliefs, design elements that hide or delay disclosure of material information, design elements that lead to unauthorized charges, and design elements that obscure or subvert privacy choices.
In the EU, according to Article 25(1) of the Digital Services Act, which applies to online platforms, a dark pattern is a digital interface that “misleads, manipulates, impairs or substantially limits the ability of users to make free and informed decisions”, due to its “design, organisation or the way it is operated”. Beyond these express prohibitions, a large number of laws also apply to dark patterns: as unfair commercial practices under consumer law, as practices violating the GDPR or other privacy protection laws, or under other sector-specific laws.

Several academic studies have shown that our ability to identify dark patterns is limited, precisely because they exploit our cognitive biases, i.e. they make us act without realizing it. For example, and by definition, it is very difficult to identify default settings, and even if we check them, the “status quo” bias will prompt us to leave them as they are. Adding to that intrinsic complexity, some scholars have shown that even when we do spot a dark pattern, for example one that makes us feel ashamed or stupid if we don’t “accept” an offer to buy or to share our personal data, it is difficult to resist it. We believe this is a totally unacceptable distortion of the digital economy, which should provide the best outcomes for users and consumers. That’s why we created this platform, which aims to raise awareness among all stakeholders and users at large, and to provide means for action. First, we created generic versions of dark patterns, because this raises awareness more effectively than showing specific, branded examples – one could think that the problem lies with a given company, and might not spot a similar dark pattern on another site or app. Secondly, we created generic versions of fair patterns, based on a number of academic articles showing that users modify their behavior when they’re given a fair chance to do so. Thirdly, we created training specifically designed to help designers, developers, digital marketers, and anyone interested spot dark patterns and avoid creating any. We’re also creating podcasts on dark patterns, to raise awareness as widely as possible – coming very soon!

Dark patterns create individual and structural harms, as well as legal risks for the businesses using them.
On the individual level, dark patterns (i) harm users’ autonomy, by tricking them into behaving in a certain way without realizing it, or against their own interests; (ii) harm users’ welfare, by making them buy more than intended, or at a higher price than intended; (iii) significantly harm users’ privacy, by tricking them into sharing more personal data than intended, or by making it more difficult to protect their privacy; and (iv) cause psychological detriment and time loss, as evidenced by a 2022 behavioral study of the European Commission.
On a structural level, dark patterns also create risks for the digital economy and, ultimately, for our democracies. First, and very logically, dark patterns destroy users’ trust in the brands that use them: one study showed that 50% of users exposed to scarcity and social-proof messages on hotel booking websites expressed distrust towards such sites. Second, dark patterns distort competition by enabling firms to gain a significant competitive advantage (e.g. more personal data collected, or a turnover boost) without improving the quality of their services or products. Ultimately, the OECD is concerned that firms could end up competing on the “efficiency” of their dark patterns, instead of competing on quality or service, and thus cease to produce the best outcomes for consumers. At the end of the day, dark patterns could simply call into question whether the market economy is still the model that produces the best outcomes.
Given that dark patterns alter our autonomy and affect our behavior, we also wonder how they could endanger democratic systems.

First, there are numerous existing legal grounds around the world that prohibit dark patterns: as unfair commercial practices, as breaches of the GDPR or equivalent data protection legislation outside the EU, or as abuses of dominance under competition law. The effectiveness of these existing frameworks can perhaps be called into question, given the high prevalence of dark patterns worldwide. Legislators around the world have now understood the seriousness of dark patterns and, since 2020, have been passing specific acts expressly prohibiting them as such. California is considered the first state to expressly prohibit dark patterns, in the California Privacy Rights Act (CPRA), defined as “a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making or choice”. The California Consumer Privacy Act was then amended in 2021 to also ban dark patterns, and the FTC released a staff report on dark patterns in 2022, clearly signaling enhanced enforcement action. In the EU, the Digital Services Act, which applies to online platforms, expressly prohibits dark patterns, i.e. an interface that “misleads, manipulates, impairs or substantially limits the ability of users to make free and informed decisions”, due to its “design, organisation or the way it is operated”. Have a look at our data visualization of dark patterns legislation around the world right here.

If you see a dark pattern, it’s very important to take screenshots and to take action. At a minimum, we encourage you to name and shame the company in question on social networks, showing the screenshots and using #darkpatterns. You can also send the information to halls of shame such as the one run by Harry Brignull (https://www.deceptive.design/about-us) or the Dark Patterns Tipline run by a multidisciplinary team at Stanford: https://darkpatternstipline.org/report.
We also encourage you to report dark patterns to regulators: the data protection authority of the country where you are based if the dark pattern is related to privacy, your national consumer protection authority if you’re a consumer, and your national competition authority if you are a business.
You might also want to consider class actions in your jurisdiction if you believe that a given dark pattern affects the same type of consumers in a similar way.
We’re building a dedicated page to help you easily identify regulators, halls of shame and other NGOs, so as to empower all users to take action – coming soon!

No client or employer can force you to create something illicit. In our experience, most companies don’t actually realize (i) that what they’re asking for is a dark pattern and (ii) that it is against many laws. So the first step is to create awareness: explain in simple terms what a dark pattern is, i.e. an interface that deceives or manipulates users to make them act without realizing it, or against their interests. The second step we recommend is to explain the harms dark patterns cause, starting with the one that will resonate most with companies: they destroy users’ trust in the brand. A study showed that 50% of users expressed disgust and distrust after having been exposed to dark patterns on hotel booking sites. In addition, dark patterns are a very short-term strategy: lack of user trust will over time diminish the value of the brand, and the potential turnover boost achieved by, say, a roach motel (a subscription that is very easy to enter but very difficult or impossible to cancel) will inevitably cause (i) consumer outrage on your hotline, (ii) cancellations once users realize they’ve been tricked, i.e. loss of turnover, and (iii) a lower company valuation because of users’ distrust and the risk of fines.
Last but certainly not least, companies using dark patterns can be fined up to 4% of their GLOBAL turnover under the GDPR, 6% under the Digital Services Act, and 10% under Article 102 of the EU Treaty, which prohibits abuse of dominance, depending on which ground applies. In the US, Epic Games had to pay a record $520 million to settle a lawsuit for having used, in its most famous game, Fortnite, dark patterns that tricked users, including teenagers, into making unintended purchases, along with privacy-invasive default settings. In addition to a $275 million penalty, Epic Games has to pay $245 million in refunds to its customers, the largest refund ever ordered in the video game industry. Obviously, Epic Games was also ordered to remove all dark patterns from its sites. Food for thought.

Harry Brignull and some scholars, such as Dr. Jennifer King, have started to use the term “deceptive pattern” to avoid the negative connotations that may be associated with the word “dark”. We fully support this precaution. However, given that legislators have started to enact legislation referring to “dark patterns”, we continue to use this term for legal precision.

Advertising and digital marketing seek to influence consumers and convince them to buy a given service or product. There are clear limits to such influence: advertising needs to be clearly identified as such, and advertising and digital marketing cannot be misleading. When looking at an ad, users still retain the autonomy to decide whether or not to buy the product. Dark patterns go way beyond influencing users: they deceive, manipulate, coerce or exploit users to make them act without realizing it, against their preferences or against their interests. There are several scientific and legal definitions of dark patterns, but they all rely on a fundamental trait: dark patterns modify the choice architecture presented to users and necessarily alter users’ autonomy, i.e. their ability to make decisions based on their own preferences and system of values. This has nothing to do with ads or marketing, where users still have the ability to decide whether they want to buy or not. If anyone tells you it’s indispensable to use dark patterns to market your products or services, that’s (i) untrue, (ii) evidence of their lack of professional skill and (iii) illicit. Respecting your customers means competing on the intrinsic quality of your products or level of service, not manipulating them.

Do you have another question?
Contact us!

Our clients