A collection of research and articles about dark patterns, user behaviour and law
The widespread use of manipulative designs, or dark patterns, in everyday applications and their impact on users is raising concerns among policymakers and scholars. These designs employ techniques that nudge users into making decisions they might not choose if fully informed, causing various types of harm. The integration of these mechanisms with other platform features makes it difficult for users to recognize the manipulation. Understanding the effects of manipulative designs is crucial for developing protective countermeasures, but researchers face significant methodological challenges, not least because users often do not perceive the manipulation they are subjected to. This paper reflects on these challenges through three case studies, highlighting key issues and providing methodological insights for the empirical study of manipulative designs.
Nowadays, websites commonly use two concepts to influence user behavior: deceptive patterns and nudges. In the literature, these concepts are distinguished by their goals and effects—deceptive patterns manipulate users, while nudges encourage better decision-making. However, from a technical perspective, it is unclear if they differ in their implementation. This paper presents a methodology developed to determine whether it is possible to automatically differentiate between deceptive patterns and nudges when crawling a web page. Our findings suggest that there is no need to distinguish between the two concepts, as they are implemented using the same techniques.
Navigating the web has become increasingly difficult for users due to manipulative UI design patterns known as "dark patterns," which lead users to act against their best interests. These tactics are prevalent, yet users remain largely unaware of them. Existing detection methods, including machine learning algorithms, struggle to generalize across all dark patterns due to their varied definitions and implementations. This paper proposes crowdsourcing as a solution to detect and flag dark patterns. Crowdsourcing leverages users' collective experiences to identify these manipulative designs more effectively. The authors introduce Neighborhood Watch, a Chrome extension that allows users to tag dark patterns on websites and view tags submitted by others. This system promotes more conscientious browsing and reduces susceptibility to dark patterns. Despite some limitations, the study concludes that crowdsourcing can effectively protect users from manipulative interfaces.
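The crowdsourcing approach described above turns on a simple aggregation idea: surface an interface element once enough independent users have tagged it. The sketch below illustrates one possible aggregation rule; the threshold, data model, and function names are assumptions for illustration, not Neighborhood Watch's actual implementation.

```python
from collections import Counter

# Illustrative threshold: flag an element once 3 independent users agree.
# (Hypothetical value; a real system might weigh reporters differently.)
MIN_REPORTS = 3

def flagged_elements(tags):
    """tags: list of (page_url, css_selector, pattern_type) tuples,
    one per user report. Returns the elements to highlight for everyone."""
    counts = Counter(tags)
    return {key for key, n in counts.items() if n >= MIN_REPORTS}

reports = [
    ("shop.example", "#countdown", "urgency"),
    ("shop.example", "#countdown", "urgency"),
    ("shop.example", "#countdown", "urgency"),
    ("shop.example", ".popup", "nagging"),
]
print(flagged_elements(reports))
# → {('shop.example', '#countdown', 'urgency')}
```

A threshold like this also hedges against the noise inherent in crowdsourced labels: a single mistaken or malicious tag never reaches other users.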
Dark patterns in user interfaces have attracted global attention from various disciplines. This study highlights a meta-level issue: demographic biases in dark pattern research. It examines the origins of published research and participant demographics, revealing a bias favoring English-speaking North America and Europe. Addressing these biases is crucial for ensuring inclusivity and rigor in the field.
Commercial health apps have become more accessible and popular, serving purposes such as enhancing health literacy, enabling continuous health tracking, and facilitating community engagement. However, concerns have arisen about the privacy, commodification, and exploitation of data generated by these apps. Less is known about deceptive design patterns and coercive practices from the users' perspective. This study uses pregnancy tracking apps as a case study and presents preliminary findings on user experiences. We argue that health apps require a nuanced consideration of deceptive design practices because (1) these patterns can uniquely intersect with users' vulnerabilities in the health context, and (2) the implications can extend beyond financial losses and privacy invasion, impacting users' health and well-being.
Privacy dark patterns are design tactics used by online services to reduce users' online privacy. These patterns either facilitate institutional data collection or increase others' access to personal data. This study examines how social networking sites popular with teens—Snapchat, TikTok, Instagram, Twitter, and Discord—use these tactics to steer users into reducing their social privacy. We analyzed recordings of account registrations, settings configurations, and logins/logouts for each SNS. Our content analysis identified two major dark pattern types—Obstruction and Obfuscation—and seven subtypes. We discuss why social media companies promote social sharing through design and the challenges of regulating these privacy dark patterns.
The use of persuasive designs to influence user behavior is now ubiquitous in digital contexts, giving rise to ethically questionable practices known as 'dark patterns.' While various taxonomies of dark patterns exist, there is a lack of frameworks that address how these designs are embedded not only in user interfaces but also in the functionality and strategy of digital systems. This paper proposes a framework for a Layered Analysis of Persuasive Designs, grounded in Garrett’s five-layer model of user experience (UX) design and Fogg’s Behavior Design Model. The framework identifies a toolkit of 48 design elements that can be used to operationalize problematic persuasion in digital contexts, highlighting the autonomy impact of each element. This framework aims to assist designers and policymakers in identifying and evaluating (potential) dark patterns within digital systems from an autonomy perspective.
The issue of Dark Patterns, or "Deceptive Design," is gaining recognition in literature. However, their widespread presence across various domains complicates interdisciplinary communication and collaboration. Existing taxonomies of these patterns often overlap and address them at different levels of abstraction, hindering cross-domain discourse. This is problematic given the growing evidence of the adverse effects of such designs on users. Additionally, the fine line between manipulative dark patterns and intuitive, protective, and defensive interface designs further complicates the issue. Current taxonomies primarily define patterns but struggle to distinguish between manipulative and benevolent implementations in specific contexts. This work proposes a method to differentiate between these applications by analyzing previously identified patterns for their properties, consequences, and contexts of application. This paper presents our progress toward creating a taxonomy-independent evaluation process for identifying and describing Dark Patterns.
Regulatory responses to dark patterns often depend on expert evaluations of design interfaces to determine if users are being manipulated or deceived. This article unpacks expert assessments of dark patterns used to solicit user consent and argues that regulatory actions should explicitly address whose expertise is being consulted. It concludes by discussing the value of deliberative mechanisms in broadening the range of both experts and expertise modes for identifying, evaluating, and regulating dark patterns.
Dark patterns refer to design practices that trick or manipulate users into making certain choices. One in four internet users encounter dark patterns. This paper examines vital guidelines issued by government commissions or authorities worldwide, including those from the United States, South Korea, India, the European Union, California, Australia, the United Kingdom, Kenya, and Argentina. A comparative analysis of these guidelines highlights national standards, types of dark patterns, and adherence norms. The study reveals minimal enforcement efforts by the relevant authorities to counter dark patterns. It advocates for a global collaboration to establish universal guidelines against dark patterns, overseen by an international authority or commission.
As of January 2024, 5.35 billion people, or 66.2 percent of the world's population, are online. With attention spans reduced to 8 seconds, digital businesses struggle to acquire, engage, and retain users. Many companies use dark patterns—deceptive UI elements that trick users into actions like signing up for services or making purchases. This thesis investigates various dark patterns, such as roach motel, malicious nudging, urgency/scarcity, bait and switch, and confirm-shaming, categorized into “pressure” and “trickery” tactics. While these methods help companies meet business goals, they undermine user trust. To combat this, the thesis proposes developing a Chrome extension to detect and highlight scarcity and urgency dark patterns. This tool aims to raise user awareness and promote a more transparent internet experience.
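A detector of the kind this thesis proposes could start from a plain text heuristic: scan visible page text for urgency and scarcity phrasing and flag the matches for the user. The phrase list and matching rule below are illustrative assumptions, not the thesis's actual Chrome extension logic (which would run as a content script in JavaScript).

```python
import re

# Hypothetical phrase list; a real detector would need a far richer model
# and would also inspect countdown timers and stock counters in the DOM.
URGENCY_PATTERNS = [
    r"only \d+ left",
    r"hurry",
    r"offer ends (soon|today|in)",
    r"\d+ (people|others) (are )?(viewing|looking at)",
    r"limited time",
]

def find_urgency_cues(page_text: str) -> list[str]:
    """Return the urgency/scarcity phrases matched in the page text."""
    text = page_text.lower()
    hits = []
    for pattern in URGENCY_PATTERNS:
        hits.extend(m.group(0) for m in re.finditer(pattern, text))
    return hits

sample = "Hurry! Only 2 left in stock. 14 people are viewing this item."
print(find_urgency_cues(sample))
# → ['only 2 left', 'hurry', '14 people are viewing']
```

Keyword matching of this sort is cheap enough to run on every page load, which matters for an extension meant to raise awareness in real time rather than block content.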
Internet users are constantly bombarded with digital nudges like defaults, friction, and reinforcement. When these nudges lack transparency, are not optional, or do not benefit the user, they become 'dark patterns', categorized under the acronym FORCES (Frame, Obstruct, Ruse, Compel, Entangle, Seduce). Psychological principles like negativity bias, the curiosity gap, and fluency are exploited to make social content viral, while covert tactics such as astroturfing, meta-nudging, and inoculation are used to create false consensus. The power of these techniques is poised to grow with advances in predictive algorithms, generative AI, and virtual reality. Although digital nudges can be used altruistically to protect against manipulation, their effectiveness remains inconsistent.
As technology advances, the regulation of dark pattern practices has become crucial. These deceptive tactics manipulate users into unfavorable actions, like complicating service unsubscribes or highlighting consent buttons to obscure transparency. The European Union has responded with several legislative acts, including the GDPR, the Digital Markets Act, and the Digital Services Act. However, these regulations often overlap, creating ambiguities and redundancies. This article examines these challenges and proposes solutions, such as harmonization and centralization, to streamline the regulatory framework. The goal is to protect users from manipulative practices and ensure informed decision-making in the digital realm.
This cumulative thesis investigates the intentions behind digital interfaces, focusing on exploitative "dark patterns" that manipulate user behavior. While existing HCI research has identified various dark patterns, this work synthesizes findings to develop comprehensive frameworks and tools for better understanding and mitigating their effects. Through qualitative and quantitative studies, the thesis explores dark patterns in social networking sites (SNS), examines user perceptions and challenges in recognizing these patterns, and contributes to design theory by identifying where dark patterns manifest. The Responsible Design Triangle model is introduced, highlighting the interdependencies between design, user behavior, and guidelines, to promote ethical digital interface design.
This research investigates the dark patterns users encounter when subscribing to or unsubscribing from online services. While previous studies have described dark patterns in digital contexts, this study provides a detailed analysis of such patterns in online subscriptions. By examining ten case studies of sign-up and cancellation processes on streaming platforms and software services, the research identifies deceptive designs and asymmetric efforts through user flow data and visual artifacts. The findings highlight the prevalence and complexity of dark patterns, offering insights for stakeholders, future design standards, and policy recommendations to improve consumer protection in online subscriptions.
This paper examines the role of dark patterns within TikTok, a rapidly growing social media platform. Utilizing principles from behavioral economics and the existing literature on online choice architecture (OCA), the study investigates how TikTok employs dark patterns to engage users and explores the implications for data protection, algorithmic practices, and market dynamics. The paper uses the "walkthrough method" to conduct a case study, detailing the TikTok user experience and identifying potential dark patterns used by the app. It discusses the challenges in distinguishing dark patterns from legitimate commercial practices, especially considering the user impact. Additionally, the paper examines how dark patterns and OCA are addressed in the Latin American legal landscape and proposes next steps for the ongoing debate based on the study's findings.
Over the past two decades, the focus of design patterns has shifted from encouraging best practices to discouraging harmful ones. Dark and deceptive UX patterns that monetize engagement while perpetuating structural inequities are now prevalent. This study uses a visual case study of a childcare worker platform to critically examine these patterns. Through Care Layering, a form of critical documentation, the study highlights how UX patterns, when viewed as culturally-situated resources, reveal both limitations and opportunities in gig work platform engagement. The discussion emphasizes how Care Layering can help designers achieve greater accountability in UX design.
This research paper explores the widespread use of dark patterns in UI and UX design, uncovering the ethical implications of these manipulative practices. Dark patterns are deceptive design elements that influence user behavior for the benefit of designers or third parties. By examining various examples and their impact on user decision-making, the paper emphasizes the importance of recognizing and understanding these patterns to make informed choices in the digital landscape.
Current online contract practices often involve situations where parties do not understand their rights and obligations under these contracts. This article examines and discusses how complex online contracts complicate and sometimes impede people from making strategic, autonomous decisions. It also addresses how legal design approaches can shed light on complexity and foster tackling of dark patterns in online contracting so as to reduce transaction costs, increase legal quality, business sustainability, and competitive business advantage.
This report summarises a roundtable on ongoing and emerging consumer risks associated with dark commercial patterns online, organised as part of the 99th Session (Part 2) of the Committee on Consumer Policy (CCP) on 6 November 2020. It featured panellists from academia, consumer protection authorities, and a consumer organisation, the Norwegian Consumer Council. It begins with an overview of the main themes that emerged from the discussion, including examples and categories of dark commercial patterns and their defining attributes; evidence of their prevalence online; consumer vulnerability; and tools and approaches available to consumer protection authorities and policy makers to identify and mitigate them. It then provides details of the presentations by each of the panellists, before concluding with suggested next steps.
The white paper examines the ethical issues in popular Android and iOS apps used by adolescents, across categories including education, gaming, communication, social, and dating. Additional categories, including music and audio, entertainment, and movies & series, were covered in subsequent parts of the study, and ethical issues were also examined in other services such as Twitter, Reddit, Quora, and Google. The author examines ethical issues along four key dimensions: privacy, age-appropriateness, human-in-the-loop, and user interface. In conclusion, ethical considerations in developing and deploying apps for children and adolescents are found to be necessary and cannot be overlooked, given mobile apps' influence on these users.
The article reviews recent work on dark patterns and demonstrates that the literature does not reflect a singular concern or consistent definition, but rather a set of thematically related considerations. Drawing from scholarship in psychology, economics, ethics, philosophy, and law, the authors articulate a set of normative perspectives for analyzing dark patterns and their effects on individuals and society, and show how future research on dark patterns can go beyond subjective criticism of user interface designs and apply empirical methods grounded in normative perspectives.
The authors present automated techniques that enable experts to identify dark patterns on a large set of websites. Using these techniques, they study shopping websites, which often use dark patterns to influence users into making more purchases or disclosing more information than they would otherwise. They examine these dark patterns for deceptive practices and find 183 websites engaging in them. They also uncover 22 third-party entities that offer dark patterns as a turnkey solution. Finally, they develop a taxonomy of dark pattern characteristics that describes their underlying influence and their potential harm to user decision-making. Based on these findings, they make recommendations for stakeholders, including researchers and regulators, to study, mitigate, and minimize the use of these patterns.
With the nascent rise of the voice intelligence industry, consumer engagement is evolving. The expected shift from navigating digital environments by a “click” of a mouse or a “touch” of a screen to “voice commands” has set digital platforms for a race to become leaders in voice-based services. The European Commission's inquiry into the consumer IoT sector revealed that the development of the market for general-purpose voice assistants is spearheaded by a handful of big technology companies, highlighting the concerns over the contestability and growing concentration in these markets. This article posits that voice assistants are uniquely positioned to engage in dynamically personalized steering – hypernudging – of consumers toward market outcomes. It examines hypernudging by voice assistants through the lens of abuse of dominance prohibition enshrined in article 102 TFEU, showcasing that advanced user influencing, such as hypernudging, could become a vehicle for engaging in a more subtle anticompetitive self-preferencing.
In this paper, the authors examine the extent to which common UI dark patterns can be automatically recognized in modern software applications. They introduce AIDUI, a novel automated approach that uses computer vision and natural language processing techniques to recognize a set of visual and textual cues in application screenshots that signify the presence of ten unique UI dark patterns, allowing for their detection, classification, and localization. To evaluate this approach, they constructed CONTEXTDP, the current largest dataset of fully-localized UI dark patterns that spans 175 mobile and 83 web UI screenshots containing 301 dark pattern instances. Overall, this work demonstrates the plausibility of developing tools to aid developers in recognizing and appropriately rectifying deceptive UI patterns.
“How does the end user perceive, experience, and respond to dark patterns?” This is the research question that drives this inquiry. The paper contributes to an increased awareness of the phenomenon of dark patterns by exploring how users perceive and experience them. To this end, the authors chose a qualitative research approach, using focus groups and interviews. Their analysis shows that participants were moderately aware of these deceptive techniques, several of which were perceived as sneaky and dishonest. Participants further expressed a resigned attitude toward such techniques and primarily blamed businesses for their occurrence. Users also noted their dependency on services employing these practices, which makes it difficult to fully avoid dark patterns.
Many tech companies exploit psychological vulnerabilities to design digital interfaces that maximize the frequency and duration of user visits. Consequently, users often report feeling dissatisfied with time spent on such services. Prior work has developed typologies of damaging design patterns (or dark patterns) that contribute to financial and privacy harms, which has helped designers to resist these patterns and policymakers to regulate them. However, there is a missing collection of similar problematic patterns that lead to attentional harms. To close this gap, the authors conducted a systematic literature review for what they call ‘attention capture damaging patterns’ (ACDPs). They analyzed 43 papers to identify their characteristics, the psychological vulnerabilities they exploit, and their impact on digital wellbeing. They propose a definition of ACDPs and identify eleven common types, from Time Fog to Infinite Scroll. The typology offers technologists and policymakers a common reference to advocate, design, and regulate against attentional harms.
This article discusses the results of the authors’ two large-scale experiments in which representative samples of American consumers were exposed to dark patterns. The research also showed the susceptibility of certain groups, particularly less-educated consumers, to dark patterns, and identified the dark patterns that seem most likely to nudge consumers into making decisions that they are likely to regret or misunderstand. Hidden information, trick questions, and obstruction strategies were shown to be particularly likely to manipulate.
This article analyzes the definition of dark patterns introduced by the California Privacy Rights Act (CPRA), the first legislation explicitly regulating dark patterns in the United States. The authors discuss the factors that make defining and regulating privacy-focused dark patterns challenging, review current regulatory approaches, consider the challenges of measuring and evaluating dark patterns, and provide recommendations for policymakers. They argue that California’s model offers the opportunity for the state to take a leadership role in regulating dark patterns generally and privacy dark patterns specifically, and that the CPRA’s definition of dark patterns, which relies on outcomes and avoids targeting issues of designer intent, presents a potential model for others to follow.
This submission assesses the extent to which extant and forthcoming legislation is equipped to address the pernicious practice of dark patterns through the prism of the digital fairness review.
In light of the regulation of manipulative interfaces by the United States, questions have been raised about the advisability of national or even European regulation of the exploitation of our cognitive biases by designers of digital interfaces. The article then examines the extent to which existing legislation regulates abusive practices, i.e. dark patterns that exploit cognitive biases. Finally, the author proposes that consideration could be given to a purpose principle for capturing attention (in particular, the collection of attention for specific, explicit, and legitimate purposes, and the absence of further processing in a manner incompatible with the purposes initially intended).
The author in this paper takes an interesting stance on dark patterns and online privacy: that current online consent mechanisms do not permit data subjects to think, decide, and choose according to their internal beliefs, thereby impairing essential individual freedoms and capabilities. Cognitive limitations, information overload, information sufficiency, lack of intervenability, and lack of free choice are identified as major shortcomings of consent in privacy. Based on these findings, the author proposes a methodology to evaluate old or new design measures to improve consent and reinstate freedoms of thought, decision, and choice.
Dark patterns are common in everyday digital experiences, and they present a new challenge to emerging global privacy laws, particularly the European Union (EU) data protection framework and the General Data Protection Regulation (GDPR). The author contends that while there is an apparent lack of legal tools to deal with dark patterns, the current framework can be amended to identify and curb them, especially through a refinement of the requisites for lawfulness of consent and the reformulation of the fairness principle in data protection.
This article outlines and explores the limits of dark patterns, which it describes as a specific ethical phenomenon that supplants user value in favour of shareholder value. It analyses the corpus of practitioner-defined dark patterns and determines the ethical concerns raised in these examples. It also identifies examples that instead fall under the wide range of ethical issues practitioners frequently conflate under the umbrella term of dark patterns. At the same time, the researchers acknowledge that UX designers may be complicit in these manipulative or unreasonably persuasive techniques, and they conclude with implications for educating user experience designers and a proposal for broadening research on the ethics of user experience.
In two preregistered online experiments the authors investigated the effects of three common design nudges (default, aesthetic manipulation, obstruction) on users’ consent decisions and their perception of control over their personal data. In the first experiment (N = 228) they explored the effects of design nudges towards the privacy-unfriendly option (dark patterns); in the second, they reversed the direction of the nudges towards the privacy-friendly option, which they title “bright patterns”. Overall, the findings suggest that many current implementations of cookie consent requests do not enable meaningful choices by internet users and are thus not in line with the intention of EU policymakers. The authors also explore how policymakers could address the problem.
In recent years, regulators around the world have found a new focal point in addressing online information asymmetries: dark patterns. As the topic has inspired widespread interest spanning human-computer interaction, web measurement, data protection, consumer protection, competition law, and behavioral economics, to name a few relevant disciplines, the authors decided to focus this consumer update on it. With the help of Cristiana Santos, an expert in the conceptualization and detection of dark patterns as privacy violations, they have written a short summary of the research in the field and the regulatory concerns, along with brief critical reflections showing where more attention should be paid.
The research conducted for this study is significant because it shows that dark patterns are prevalent and increasingly used by traders of all sizes, not only large platforms. The mystery shopping exercise found that 97% of the most popular websites and apps used by EU consumers deployed at least one dark pattern; more importantly, it was rare to find a dark pattern used in isolation, as multiple patterns were often featured on a single site. Altogether, the study sheds light on the prevalence of dark patterns in the digital world and on how many internet players have a hand in this phenomenon.
The world seemingly becomes more consumer-friendly with each generation as businesses take more and more measures to ensure customers are left with a positive experience. Despite our technological advancement and understanding of human nature, it’s still common to see deception in digital products. To understand digital deception (and better define it), we surveyed 536 people to measure and discuss people’s understanding of this matter.
Default decisions are prevalent and influential in areas ranging from retirement program design and organ donation policies to consumer choice. While past research has attributed the effect of no-action defaults to effort and implied endorsement, little work has addressed reference dependence, i.e. how the default choice can serve as a reference point against which the other choices are positively or negatively evaluated. In this article, the researchers demonstrate how reference dependence can increase the effectiveness of default decisions.
The concept of dark patterns is still ill-conceptualized in the Digital Services Act, the first EU legislative act to tackle dark patterns head-on. The author identifies several concerning aspects of the DSA's prohibition of dark patterns, notably its reference to manipulation as a source of consumer harm: ‘manipulation’ is not defined in the regulation and is a new EU legal term. Many philosophers have reflected on the meaning of manipulation and how it manifests in digital environments, and the jury is still out on this question. The article assesses these and related shortcomings, as well as the IMCO Committee's proposed amendment of the Unfair Commercial Practices Directive to better target dark patterns.
The central idea of this working paper is that issues of safety and fairness can no longer be regulated using consumer choice as the primary protection. Instead, consumers need a privacy law that stops harmful business practices before they cause significant harm. Two concepts are explored in this working paper to address both current and emerging data harms: duty of care or best-interests duty and the privacy safety regime. Borrowing concepts from product intervention powers and product safety interventions, the CPRC proposes options that would allow governments and regulators to stop or limit obviously harmful uses of data as well as a process for regulators to proactively restrict and test new harmful practices as they evolve. It concludes that the law needs to require more effort on the part of businesses to assess whether and how they collect, share, and use data that results in fair outcomes for their customers.
Deceptive designs range from those that are ubiquitous and frustrating for consumers to those that are misleading and deceptive and can lead to significant consumer harm. In light of this, this report considers the common types of dark patterns, the impact of dark patterns on consumers, and the next steps businesses, governments, and consumers can take to reduce harm.
This week on Legal Design Thinking IRL, Hannele Korhonen speaks with Marie Potel-Saville. They talk about the dark patterns of the internet, how to identify them and how she’s leveraging legal design to create fair patterns for users. They discuss what dark patterns are, how they affect our everyday lives, how to spot them, why they are bad for businesses, what laws and international standards govern dark patterns, what fair patterns are, how legal design can create fair patterns, why a human-centric approach to users, UX and their interactions with your business is crucial, and how to report dark patterns.
The purpose of this open letter is to help businesses understand and comply with their existing obligations under consumer protection law when making urgency claims (for example, countdown timers, scarcity or ‘act fast’ messages) and/or price reduction claims online. The letter includes examples where common claims made by online businesses to consumers during the shopping process may breach the law, for example by misleading consumers or by putting unfair pressure on them.
This paper discusses the current CMA thinking in this important area for its ongoing programme of competition and consumer enforcement. It also explains why online choice architecture is relevant to both consumer protection policy and competition policy and outlines a novel taxonomy of practices.
HCI research has extensively studied nudging user behaviour and how corporations have often used this as an avenue to influence user information disclosure. In this paper, the researchers test the effect of norm-shaping design patterns on information-divulging behaviour. Their findings primarily identify a key mechanism by which norm-shaping designs can change beliefs and subsequent disclosure behaviours.
In this paper, the authors present a novel solution expanding the Advanced Data Protection Control (ADPC) mechanism to bridge current gaps in user data and privacy control. Their solution moves the consent control to the browser interface to give users a seamless and hassle-free experience, while at the same time offering content providers a way to be legally compliant with legislation. Through an extensive review, they evaluate previous works and identify current gaps in user data control. They then present a blueprint for future implementation and suggest features to support privacy control online for users globally.
Addressing the root causes of (un)sustainability entails fundamentally changing our ways of living. This requires going beyond the technology- and behaviour-oriented approaches common under the umbrella of sustainable development (SD). More fundamental change is required to increase the possibility of realizing ecological and psychological well-being. Here, such change is conceptualized as ‘characterological change’. Next to SD another domain is introduced: characterological development (CD). The potential role of design interventions in CD is explored in this article. Two studies were conducted, a literature study and expert interviews, covering the fields of Design for Sustainable Behaviour, Persuasive Technology, Practice-Oriented Design and Philosophy of Technology. The literature study shows that current research and interventions predominantly fall within the domain of SD, leaving character and related notions largely unaddressed.
Online services pervasively employ manipulative designs (i.e., dark patterns) to influence users to do different things. In this article, the researchers investigate whether users are aware of the presence of dark patterns and, if so, whether they can resist them. They discover, however, that awareness does not equip users with the ability to oppose such influence. They further find that respondents, especially younger ones, often recognise the “darkness” of certain designs but remain unsure of the actual harm they may suffer. Finally, they discuss a set of interventions (e.g., bright patterns, design frictions, training games, applications to expedite legal enforcement) in the light of their findings.
Online vendors often employ drip-pricing strategies where mandatory fees are displayed at a later stage in the purchase process than base prices. In this article, the researchers show through thorough analysis that disclosing fees upfront reduces both the quantity and quality of purchases. At the same time, detailed click-stream data analysed by the authors show that price shrouding makes price comparisons difficult and results in consumers spending more than they would otherwise.
This research reveals how dark patterns work, namely which vulnerabilities and biases they exploit. More broadly, it also allows readers to understand how techno-regulation (i.e. regulation through technology) can nowadays be used to influence individuals’ behaviour and autonomy through design. It examines the existing literature on dark patterns, showing how they exploit biases, heuristics and vulnerabilities, as well as the economic incentives behind them.
This paper focuses on persuasion and user-autonomy education. With the rise of persuasive features in interactive systems aimed at increasing revenue, gathering user information and maximising user engagement, users’ autonomy has been argued to be an ethical concern within persuasive UX design. Thus, the researchers test a framework to educate design students on the ethics of persuasion from the perspective of user autonomy. Findings showed that, following this training, students' critical attitudes towards persuasive design increased. Based on these findings, among others, the authors propose future directions for integrating ethics into user experience design.
The Luring Test: AI and the engineering of consumer trust - FTC
Data Privacy and Consumer Protection Practices of Automated Mental Health, Wellbeing and Mindfulness Apps - Centre for AI and Digital Ethics | University of Melbourne
In this Nobel Prize Summit, a workshop on deceptive design is hosted by the Transatlantic Consumer Dialogue, the Electronic Privacy Information Center, and the Minderoo Centre for Technology and Democracy, University of Cambridge. This workshop of experts aims to examine how lawmakers and regulators on both sides of the Atlantic are tackling deceptive design, and how lessons learned on both shores can help provide global regulatory solutions. Several notable speakers will take part in the workshop, including Harry Brignull, who first coined the term "Dark Pattern" in 2010.
The DITP's Behavioral Sciences department is working with the CNIL to objectively assess the impact of the design of cookie banners, the pop-ups that appear when a site is accessed. The challenge: to ensure that citizens' free will is respected, and that regulations on personal data protection are effective. To shed light on the effects of dark patterns on users, the studies draw on behavioral science. A study involving over 4,000 people was carried out on cookie banners, and the results of this experiment "confirm the considerable impact of banner design on the choices made by Internet users".
Critical designer, researcher and award-winning artist Caroline Sinders has created a playful website to show how companies deliberately make it difficult to unsubscribe from their services. She conducted her experiment on 16 well-known online services (including Amazon, Google, Netflix and the New York Times). A total of 20 different dark patterns were identified across all the applications tested, with impressive consequences: $330 lost and almost 1 hour spent trying to unsubscribe.
Dark patterns and sludge audits: an integrated approach - Cambridge University Press
The National Assembly and the Korean government are actively addressing the issue of “dark patterns” in the online space. Momentum to regulate such deceptive online practices is anticipated to only grow in the coming months.
Two professors from Northeastern University plan to conduct a study at Northwestern on dark patterns that are present in AI-enabled consumer experiences.
noyb filed three complaints against Fitbit in Austria, the Netherlands and Italy. The popular health and fitness company, acquired by Google in 2021, forces new users of its app to consent to data transfers outside the EU.
Discover the factors driving the integration of dark patterns in Conversational Agents (CAs). The qualitative study reveals six key drivers, shedding light on the subtle tactics that impact user autonomy in digital interactions.
In a decisive move, India has introduced guidelines to combat Dark Patterns in digital interfaces. These guidelines define and prohibit deceptive design practices that mislead users, emphasizing consumer protection and ethical design. Public feedback is encouraged until October 5, 2023, to ensure a fair digital landscape.
Threads app's meteoric rise has intrigued the internet, with millions signing up in mere hours. Yet, beneath its user-friendly facade, lurk dark design patterns that compel user actions. Discover the hidden tactics that keep you engaged and contribute to Meta's data empire.
On September 1, 2023, the DPC concluded its inquiry into TikTok's GDPR compliance, finding multiple violations. TikTok received a reprimand, a three-month compliance order, and €345 million in fines. This followed objections and a binding decision by the EDPB on August 2, 2023.
A recent study by Northeastern University found that Google and Amazon use your voice interactions with their assistants to gather data about you, potentially influencing the ads you see and how these voice assistants respond to you. Apple's Siri, on the other hand, seems to maintain a more privacy-conscious approach.
The paper delves into the realm of dark patterns within digital systems, utilizing Amazon Prime's 'Iliad Flow' as a case study. Additionally, it introduces the 'Temporal Analysis of Dark Patterns' (TADP) methodology to investigate how these deceptive design tactics influence user journeys. TADP takes into account individual dark patterns, their cumulative effects, and the implications for detection.
In a digital age marked by deceptive design practices, our paper delves into the world of 'dark patterns'—strategies that extract profit, harvest data, and limit consumer choice. We recognize that the absence of universally accepted definitions across academic, legislative, and regulatory spaces hinders meaningful action. Our work aims to establish a common language by harmonizing existing taxonomies and proposing a three-level framework with standardized definitions for 64 dark pattern types. This framework empowers us to advance research and regulatory efforts, fostering transparency and ethical design across digital domains.
This study investigates how financial technology companies employ dark patterns to influence investors' decisions. We analyzed 26 mobile apps in Norway for stocks, funds, and cryptocurrencies, aiming to identify unethical design strategies. Most apps, to varying degrees, incorporate dark patterns. Banks prioritize user data protection, while non-bank fintechs employ more deceptive practices to manipulate user behavior and interaction.
LLM-based conversational agents like ChatGPT are being used in high-stakes domains, but this exposes users to privacy risks, including data breaches and the memorization of personal information. This study focuses on user behavior and perceptions, highlighting the need for privacy-preserving techniques and improved user awareness.
A study by the European Commission revealed that dark patterns are common on popular websites, and a survey of mobile apps confirmed this trend. These deceptive practices have a significant impact on consumer decisions, often prompting them to make purchases. Consumer protection authorities, such as the DGCCRF in France, take legal action against companies using these dark patterns, but there are currently few tools to identify and sanction them.
BIT's Gambling Policy & Research Unit conducted a study on 10 gambling and betting operator websites in March-April 2022. Key findings reveal issues such as longer account closure times compared to account opening, challenges in setting deposit limits, minimum balances required for withdrawals, lack of customer feedback on gambling habits, and suboptimal default settings. Behavioral Risk Audits serve as essential tools to examine online markets, offering insights into consumer experiences and market dynamics for policymakers and industry participants.
In a study of 200 popular Japanese mobile apps, we discovered that most of them employed dark patterns, averaging 3.9 per app. Notably, we identified a new class of dark patterns called "Linguistic Dead-Ends," such as "Untranslation" and "Alphabet Soup." These findings emphasize the need for culturally sensitive design standards and further research on dark patterns in cross-cultural contexts.
In e-commerce, limited-time and limited-quantity cues are often used to influence consumer decisions. However, our study with 202 participants suggests that these cues can lead to perceptions of lower benevolence from online vendors. Websites without such cues provided a better user experience. Limited-time cues even elicited frustration. These findings have implications for dark pattern research, designers, and online vendors, emphasizing the need for ethical design in e-commerce.
The study focuses on a specific dark pattern within cookie consent dialogs on websites: the absence of a clear opt-out option alongside the opt-in choice. This pattern was investigated across five European countries by crawling 23,303 websites. The findings were striking: 13,522 websites had an accept option, but only 6,016 had a reject option on the initial layer. This suggests that more than half of the websites they analyzed employed this dark pattern, potentially violating GDPR and highlighting the need for improved ethical design practices.
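A dialog exhibits this pattern when its first layer shows an accept button but no reject button. The paper's actual crawling and detection pipeline is not described here, but the core classification step can be sketched with a hypothetical keyword heuristic over the visible button labels scraped from a dialog (the word lists are illustrative, not the study's):

```python
# Hypothetical keyword lists; the study's real detection heuristics are not
# specified in this summary, so these are illustrative assumptions.
ACCEPT_WORDS = {"accept", "agree", "allow all", "ok"}
REJECT_WORDS = {"reject", "decline", "refuse", "deny"}

def first_layer_options(button_labels):
    """Classify which consent options a cookie dialog's first layer offers.

    button_labels: visible button texts scraped from the dialog.
    Returns a dict flagging whether an accept and a reject option exist.
    A dialog with accept=True and reject=False matches the dark pattern.
    """
    labels = [label.strip().lower() for label in button_labels]
    has = lambda words: any(w in label for label in labels for w in words)
    return {"accept": has(ACCEPT_WORDS), "reject": has(REJECT_WORDS)}
```

For example, a dialog offering only "Accept all" and "Settings" buttons would be flagged as lacking a first-layer reject option, while one offering "Agree" and "Reject all" would not.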
In today's digitalized society, mobile digital games (MDGs) play a pivotal role in early childhood education. However, concerns about problem gaming and unethical design patterns in MDGs for young children have emerged. The analysis of the top five free games for ages 0-5 on the App Store (February 2023) revealed the presence of temporal, monetary, and psychological dark patterns. These include aesthetic manipulations, paywalls, and elements resembling gambling rewards. The study emphasizes the need for digital literacy and responsible game design to ensure healthier gaming experiences for young children.
In the realm of online privacy, dark patterns are design strategies that often lead users to disclose more information than they intend. The research, focusing on older adults (above 65) and young adults (18-25), reveals the effectiveness of these dark patterns. Positive framing and opt-out privacy defaults increase disclosure behavior, while negative justifications reduce privacy concerns. Notably, older adults show both increased disclosure and heightened privacy concerns when exposed to these strategies. Privacy concerns, however, do not hinder disclosure, highlighting the potential risks dark patterns pose, especially to older users. This raises important ethical questions about safeguarding digital privacy for all age groups.
In the evolving landscape of online commerce, dark patterns pose a significant challenge, blurring the line between persuasion and manipulation. This Article focuses on the European Union's legal response to these issues, highlighting the need to protect consumer autonomy. It introduces a novel framework that classifies dark patterns into six categories of autonomy violations, providing policymakers with a clear path to regulate and safeguard consumers' decision-making independence.
This research aims to uncover and categorize deceptive design practices within 3D environments commonly found in PC games. Through a survey involving 259 adult respondents, we identified six distinct categories of deceptive design patterns within a popular free-to-play 3D game: Predatory Monetization, Default to Purchase, UI Misdirection, Emotional Interpersonal Persuasion, Physical Placement, and Narrative Obligation. This work is vital as 3D and VR gaming gain momentum, and the gaming industry increasingly adopts "freemium" monetization models. It highlights the importance of understanding and addressing deceptive design in this evolving gaming landscape.
This document provides supplemental materials directly cited in the proceedings of the USENIX Security Symposium 2024 paper “The Effect of Design Patterns on (Present and Future) Cookie Consent Decisions”.
This paper investigates the impact of "loss-gain framing" as a dark pattern strategy on user data disclosure behavior in mobile settings. Understanding how framing affects users' willingness to share personal information is essential for privacy policy development and user interface design. In an online user study involving 848 participants, they tested different framings (positive, negative, neutral) of app permission requests. Surprisingly, negative framing increased disclosure rates, while positive framing reduced them, possibly due to heightened suspicion. These findings carry implications for designing interfaces that promote informed, privacy-conscious decision-making.
The study investigates compliance with privacy laws like Article 8 GDPR and CCPA on children's websites. They analyzed 2,066 educational and gaming websites, finding that only a minority address consent dialogs for children. This suggests potential non-compliance with GDPR, raising concerns about data protection for young users.
Website cookie dialogs are common for safeguarding user privacy, but many employ manipulative tactics, known as dark patterns, to make it harder for users to opt out. These tactics violate privacy laws and can lead to fines, as seen with Google and Meta in 2022. The paper "DarkDialogs" presents an automated tool that detects 10 dark patterns in cookie dialogs with high accuracy, highlighting the prevalence of such practices.
This report redefines the concept of dark patterns in user interface design, focusing on the interaction between users and applications, grounded in user expectations and the reuse of common concepts. A design is considered "dark" when it intentionally violates these expectations to benefit the application provider. Through case studies, the authors demonstrate how this concept-based analysis can help designers identify and address such issues. They propose a shift away from traditional taxonomies of dark patterns towards a more systematic, actionable approach to ethical interface design.
In the wake of the AMG Capital Management, LLC v. FTC Supreme Court decision limiting the FTC's ability to seek monetary relief, concerns about regulating online "dark patterns" persist. Dark patterns are deceptive online interfaces that trap users into unwanted actions. This Note argues for an FTC rule to define and prohibit dark patterns, benefiting consumers, regulators, and businesses by providing clarity, better enforcement tools, and protection against deceptive online practices. As online reliance grows, the FTC's regulatory approach must adapt to protect consumers and maintain a fair marketplace.
In the realm of digital interfaces, the prevalence of deceptive design, often referred to as "Dark Patterns," is on the rise. While existing research has explored user perceptions of these manipulative tactics, there is a notable dearth of studies examining the potential impact on users' mental health. This paper seeks to identify vulnerable user demographics and formulate research questions to initiate discourse on the potential adverse effects of Dark Patterns on mental well-being. By addressing this gap, the paper aims to pave the way for a more nuanced understanding of the ethical considerations surrounding deceptive design in user experience.
The study on e-commerce dark patterns reveals a significant inclination among participants (195 adults aged 19 to 53) to select products with manipulative tactics, particularly the potent limited-time message dark pattern. Age plays a role, with older individuals showing heightened vulnerability. Despite exploring video- and activity-based interventions, their effectiveness remains unclear. The onus is on companies to foster a transparent and ethical consumer environment by refraining from dark pattern usage. This study underscores the urgent need for a collective effort to mitigate the unintended consequences of manipulative design in digital commerce.
The study explores UX dark patterns, deceptive UI designs prevalent in online services. Building upon previous work addressing dark patterns from the designer's and policymaker's perspectives, the paper proposes an end-user-empowerment approach. This aims to raise user awareness of dark patterns, help them understand design intents, and empower them to counter these effects using web augmentation. The two-phase co-design study, comprising five workshops and a two-week technology probe, delves into user needs, preferences, and challenges, offering insights into their reactions to empowered awareness and actions in a realistic in-situ setting.
This study addresses the privacy risks posed by browser cookies, particularly third-party ones, through a personalized cookie banner. Semi-structured interviews identified user attitudes and requirements. The subsequent online experiment evaluated a personalized privacy assistant, a non-personalized version, and a standard website cookie banner. Results showed both versions of the novel banner significantly reduced accepted cookies and improved usability compared to the standard banner. Importantly, the personalized variant outperformed the non-personalized one, emphasizing the efficacy of tailoring cookie banners to users' privacy knowledge for informed choices and enhanced privacy protection.
In the digital landscape, advertisements play a pivotal role, but their impact on blind users navigating websites with screen readers has been a largely unexplored territory. Through interviews with 18 blind participants, our study revealed a significant challenge: blind users are often misled by ads seamlessly integrated into the web page content. Conventional ad blockers face resistance, compelling us to devise an algorithm for automatic identification of contextually deceptive ads. Our multi-modal model, incorporating both handcrafted and automatically extracted features, demonstrated remarkable effectiveness with F1 scores of 0.86 and 0.88 on test datasets and real-world websites, respectively. This research sheds light on the nuanced experiences of blind users and introduces a practical solution to enhance their online interactions by addressing the deceptive nature of visually designed ads.
This research investigates the impact of dark design patterns on user experience within internet and mobile applications. Dark patterns, manipulative strategies employed by system owners, pose risks by coercing users into sharing excessive personal information or engaging in unintended subscriptions. Through a literature review, this study aims to identify prevalent dark patterns, examine their effects on user behavior, and discuss the ethical implications in Human-Computer Interaction (HCI). Despite limitations, this research offers valuable insights into these exploitative tactics and their implications for user experience.
Dark patterns in consumer marketing exploit cognitive biases, steering individuals towards decisions that conflict with their genuine preferences. These manipulative tactics, designed by digital platforms, compromise consumers' autonomy for economic gain. This study exposes these covert strategies, advocates for regulatory reforms under the Consumer Protection Act of 2019 in India, and aims to safeguard consumers from the deleterious impact of dark patterns.
In this research, they have delved into the pervasive issue of Dark User Interface (DUI) patterns within mobile apps, particularly focusing on China's mobile ecosystem. The systematic investigation reveals the prevalence of deceptive UI designs that can mislead users into unintended actions. With a taxonomy of DUI patterns and analysis of top mobile apps, they highlight the urgent need for better regulation and user awareness to mitigate potential harm caused by these deceptive practices.
The research investigates the transformation of privacy dialogs on 911 US and EU news and media websites in the 18 months following the GDPR implementation. The researchers observed a positive trend: an increase in privacy dialogs offering clear choices to accept or reject tracking, accompanied by a decrease in manipulative nudges. This shift suggests that external interventions, such as government guidance, may prompt websites to improve GDPR compliance and make it easier for users to reject tracking.
Within the digital landscape, the regulation of exploitative practices targeting consumer behavioral biases poses a critical challenge in EU policymaking. Despite its prevalence in discussions, the concept of exploitation lacks a precise legal definition, especially concerning online choice architectures. This article seeks to rectify this ambiguity by proposing an autonomy-focused theory of exploitation. Departing from traditional welfare analysis, the aim is to align EU consumer law with the preservation of consumer autonomy rather than solely optimizing market efficiency. By establishing clearer parameters for exploitation in online contexts, this framework aims to strengthen regulatory measures in safeguarding consumers within the evolving digital marketplace.
In the digital landscape, consumer autonomy faces unprecedented challenges due to the pervasive use of dark patterns. These subtle manipulative tactics alter online choice architectures, steering users towards decisions not aligned with their preferences. Traditional information-based approaches fall short in addressing this issue. The book takes a novel approach, merging transparent information and fair digital design. Through a comparative study spanning data protection, consumer, and competition law, they aim to integrate legal rules with ethical design principles. This inclusive methodology considers non-legal insights, creating pragmatic and global regulatory paths to safeguard digital consumer autonomy effectively.
Dark patterns, deceptive techniques ingrained within interfaces, exert substantial influence, altering users' choices and autonomy online. They're notably effective within mobile applications, particularly mobile gaming, leveraging fast, heuristic decision-making (System 1, Kahneman). Beyond individual impact, these practices raise concerns about broader societal consequences, questioning our collective relationship with technology when misaligned with human interests. This communication aims to outline existing regulatory frameworks governing dark patterns, pinpointing their limitations. Additionally, it seeks sustainable regulatory solutions that account for human cognitive limits. By bridging these gaps, our goal is to establish a regulatory landscape prioritizing user autonomy and ethical use of technology in the digital domain.
This paper explores interpretable dark pattern auto-detection in e-commerce interfaces, leveraging BERT, a transformer-based language model. By training the model on a text-based dataset, the authors identified deceptive designs and employed post-hoc explanation techniques like LIME and SHAP to unveil the terms influencing each dark pattern prediction. Their findings, aimed at preventing user manipulation, offer insights into constructing more equitable internet services.
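The interpretability idea behind that pipeline can be illustrated without BERT: classify a text snippet and expose per-term scores showing which words push it toward the "dark pattern" class, much as LIME/SHAP attribute a prediction to input terms. Below is a minimal pure-Python stand-in using smoothed log-odds over word counts; the example phrases and labels are invented for illustration, not the paper's dataset:

```python
from collections import Counter
import math

# Invented toy corpus: urgency phrasing as the "dark pattern" class.
dark = ["hurry only 2 left in stock", "offer expires in 10 minutes"]
benign = ["free shipping on all orders", "read our product reviews"]

def term_scores(pos_docs, neg_docs, alpha=1.0):
    """Smoothed log-odds of each term in the positive vs negative class.

    Positive scores mark terms characteristic of dark-pattern phrasing,
    playing the role LIME/SHAP attributions play for a BERT classifier.
    """
    pos = Counter(w for d in pos_docs for w in d.split())
    neg = Counter(w for d in neg_docs for w in d.split())
    vocab = set(pos) | set(neg)
    p_total, n_total = sum(pos.values()), sum(neg.values())
    return {
        w: math.log((pos[w] + alpha) / (p_total + alpha * len(vocab)))
           - math.log((neg[w] + alpha) / (n_total + alpha * len(vocab)))
        for w in vocab
    }

def classify(text, scores):
    """Flag a snippet as a dark-pattern phrase if its total score is positive."""
    return sum(scores.get(w, 0.0) for w in text.lower().split()) > 0

scores = term_scores(dark, benign)
```

Here `scores["hurry"]` comes out positive and `scores["shipping"]` negative, so both the prediction and its per-term explanation fall out of the same model; the paper's contribution is obtaining comparable term attributions for a far more capable BERT classifier.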
This study explores the use of dark patterns in interface design, focusing on their manipulation of user actions, particularly in obtaining consent for extensive data collection. Introducing a narrative serious game with seven game-adapted dark patterns, the authors aim to enhance awareness and resistance. Through a qualitative, exploratory study, they investigated player behavior when confronted with these adapted patterns. Thematic analysis reveals insights into factors influencing pattern adaptation in gameplay, as well as motivations and driving forces shaping player behavior.
In this workshop, participants addressed the pervasive issue of "dark patterns" in digital design—deceptive, manipulative practices impacting user autonomy. Human-computer interaction scholars have laid the groundwork, defining types and harms. The focus was on actionable steps: (i) refining detection methodologies, (ii) characterizing harms, and (iii) crafting effective countermeasures. By connecting scholarship to legal and design communities, the goal was to influence legislation and foster ethical digital practices.