A collection of research and articles about dark patterns, user behaviour and law
The 14th Design Thinking Research Symposium (DTRS14), hosted by Mälardalen University, focuses on how design can drive sustainable futures by addressing global challenges like climate change, social inequality, and resource management. Aligned with the UN's Sustainable Development Goals, it emphasizes interdisciplinary collaboration and human-centered design methods such as co-design and systems thinking. The symposium showcases global insights on using design to foster sustainability and societal progress.
A recent study of 16 U.S. gig work platforms reveals extensive data collection, sharing with up to 60 third parties, and privacy dark patterns that heighten risks for workers. Findings include reversible Social Security number (SSN) hashes and off-platform nagging tactics. The platforms remediated the disclosed SSN issues, a fix confirmed by independent audits. The study calls for stronger privacy protections and offers its data for further research.
A recent study explores how web developers using GPT-4 may unintentionally create deceptive designs (DD) in website features. Involving 20 participants, the study tasked users with generating product and checkout pages for a fictitious webshop using ChatGPT, followed by adjustments to boost sales. All 20 websites ended up containing DD patterns, averaging 5 per site, with GPT-4 offering no warnings. Notably, only 4 participants expressed ethical concerns, with most viewing the designs as effective and acceptable. These findings underscore potential ethical and legal risks in relying on LLM-generated design recommendations without careful oversight.
Antitrust laws often struggle to curb Big Tech’s anticompetitive business models, where dark patterns complicate enforcement. Austrian criminal law may fill this gap by addressing potentially deceptive practices. Amazon’s Prime and BuyBox, Google’s Search and Bard, and mechanisms like Activision Blizzard’s lootboxes and Engagement Optimised Matchmaking (EOMM) algorithms may constitute commercial fraud under §§ 146, 148 StGB, as they appear designed to mislead consumers for profit.
This special issue aims to deepen research on deceptive designs—manipulative digital features that exploit cognitive biases to influence user choices without full consent. As these designs proliferate on digital platforms and are enhanced by AI and immersive technologies, regulatory efforts across the EU, US, and India seek to curb their use. This issue invites interdisciplinary studies on the behavioral impacts of deceptive designs, effects on vulnerable users, regulatory strategies, and theoretical advancements in cognitive biases. Insights from psychology, economics, HCI, and law will support a framework for understanding and mitigating these designs in evolving digital contexts.
This chapter presents research on user experience in video games, aiming to identify core parameters that address user needs. Key factors—usability, engageability, motivation, emotion, immersion, and commitment—were analyzed as essential elements of a positive gaming experience. A specific heuristic evaluation was developed, incorporating these principles along with targeted questions to enhance video game assessments. This extended evaluation, optional and tailored for gaming, provides a refined tool for analyzing user experience in video games.
As the data ecosystem becomes increasingly intertwined with daily life, every action generates value within the data economy, raising the need for authorities to protect diverse and sometimes conflicting stakeholder interests. Despite existing regulations, effective compliance remains challenging due to a lack of suitable technical and organizational tools and adequately prepared oversight bodies. This paper explores how FACT principles, FAIR principles, and Open Personal Data Stores could guide ethical pathways for the data economy. Through ethical, economic, and legal lenses, it presents findings from literature review and content analysis, offering recommendations to help stakeholders better integrate ethics into the data economy.
Social robots, designed to foster trust and companionship, often incorporate elements that can lead to user manipulation and addiction. To address these risks, this thesis proposes using Vulnerability Theory to assess Human-Robot Interaction by balancing dependency with resilience, ensuring ethical, user-centered design. The "Safety and Intimacy in Human-Robot Interaction Pipeline" serves as a tool to analyze these interactions, supporting regulations aimed at protecting users from manipulation.
This study presents the design of a field experiment examining the effects of online tracking, targeting, ad-blocking, and anti-tracking technologies on consumer behavior and economic outcomes. While the online data industry promotes the benefits of tracking for targeted advertising, privacy concerns persist regarding the extensive tracking of consumers without their knowledge. The experiment aims to analyze how these technologies influence online browsing, shopping behavior, and purchasing decisions, focusing on metrics like spending, product prices, search time, and purchase satisfaction. The study outlines the rationale, experimental design, and data collection plans for these investigations.
Protecting the rights of algorithmic consumers is crucial for the healthy growth of the digital economy. Dark patterns, which manipulate consumer decision-making through platform interfaces, disrupt traditional marketing methods. These patterns exploit cognitive biases, reducing consumer autonomy and rationality. They can be categorized as inducing, deceptive, or manipulative, each with varying impacts, such as compromising personal information and undermining consumer autonomy. The U.S. and the EU tackle dark patterns through legislation, with the U.S. focusing on specialized laws and the EU combining rules and standards. A governance approach tailored to local contexts, involving regulation, protection, and collaboration, is needed to balance platform operators' interests and safeguard consumers, fostering a healthier digital economy.
A study on dark patterns in social media reveals their influence on user engagement and impulse buying. Based on 492 participants from PSUT in Jordan, the research found that time, effort, pleasure, and social acceptance positively affect engagement, while non-routine behavior has a negative impact. Engagement, in turn, boosts impulse buying. The study emphasizes the need for ethical marketing, as dark patterns can erode trust and harm long-term relationships.
This thesis explores how dark design patterns in casual mobile games exploit cognitive biases. While ethical concerns have been discussed, the cognitive mechanisms behind these patterns are less understood. Through interviews with mobile game players, the study finds that monetary tactics like hidden costs are more easily recognized than social or temporal patterns, with loss aversion being a key bias exploited. Many players express frustration with these manipulative designs. The thesis calls for greater awareness and discussions on ethical design and regulatory measures in mobile gaming.
A recent study explores the concept of Interactive Dark Patterns (IDPs), a subset of dark patterns that engage users through multiple steps or screens to manipulate their choices. While dark patterns have been widely studied since Harry Brignull coined the term in 2010, the interactive nature of some patterns has been overlooked. This research aims to empower ordinary mobile application users to recognize and avoid these deceptive design tactics through targeted training interventions. The study found that such training significantly reduced user entrapment, with the entrapment rate dropping from 79% to 27% post-training, highlighting the effectiveness of awareness-building in mitigating the impact of IDPs. These findings have broader implications for mobile app developers, UX/UI designers, and digital ethicists, emphasizing the importance of ethical design practices.
Dark patterns, which manipulate consumer behavior, have become widespread in digital markets, influencing how online consumers perceive, behave, and make purchases. A novel empirical study reveals that individuals across all demographic groups are vulnerable to these manipulative tactics, with little evidence that common markers of consumer vulnerability, such as income, education, or age, significantly reduce susceptibility. These findings support broad restrictions on the use of dark patterns, as proposed in the EU’s Digital Services Act, to protect all consumers. The study also finds that dark patterns lose effectiveness when added friction, such as a required payment step, is introduced; they work best when no further user action is needed, particularly when online providers store payment information for 'single-click' purchases.
A recent evaluation of alcohol-industry-funded (AIF) digital tools, such as blood alcohol calculators and consumption trackers, has revealed significant concerns regarding misinformation and manipulative design tactics. These tools, distributed by organizations like Drinkaware and Drinkwise, were found to contain health misinformation and "dark patterns" that potentially influence users towards increased alcohol consumption. Compared to non-industry-funded tools, AIF tools provided significantly less accurate health feedback (33% vs 100%), omitted crucial information about cancer and cardiovascular disease, and promoted industry-friendly narratives. Moreover, these tools employed techniques like priming nudges and social norming to encourage consumption, offering fewer behavior change techniques and restricted user input options. The study concludes that AIF tools may function as covert marketing channels, misleading users and promoting continued alcohol consumption through manipulative "Dark Apps" design strategies.
In recent years, concerns have intensified regarding the influence that digital architectures exert on individuals and societies, particularly through the use of dark patterns—manipulative interface designs embedded in digital services. While these tactics can affect anyone, traditional legal frameworks often focus on certain vulnerable groups, such as children. However, empirical research indicates that vulnerability to dark patterns is shaped by various factors, beyond the traditional group-based classifications. This article aims to deepen the understanding of this issue by offering a multidisciplinary analysis of the factors contributing to vulnerability, evaluating the feasibility of risk assessments in three EU legal frameworks—the GDPR, the Digital Services Act, and the Artificial Intelligence Act—and proposing strategies to enhance resilience to manipulative digital designs.
Recent research highlights the growing use of dark patterns—malicious interface design strategies that push users into making decisions against their best interests—across apps and websites. While previous studies have primarily focused on how these manipulative tactics affect adults, children are increasingly exposed to them. To explore how dark patterns impact younger audiences, researchers conducted a study with 66 fifth-grade students (aged 10–11) at a German school. The study revealed that many children could recognize simple manipulative tactics, such as complex wording and color-based manipulations, but most struggled to identify bad privacy defaults, indicating a particular vulnerability to privacy-related dark patterns.
This literature review explores the complex interactions between dating app ecosystems, advertising strategies, and the integration of dark patterns—subtle design techniques that influence user behavior. Through an analysis of existing research, the review highlights how dating apps strategically use advertising to boost user engagement, shape interactions, and increase revenue. It also addresses the ethical concerns surrounding the use of dark patterns, questioning the fine line between guiding user decisions and manipulating behavior. Additionally, the review examines user data privacy, focusing on how dating apps collect and utilize personal data for targeted advertising, while raising concerns about potential privacy risks and the regulatory measures in place to protect users. By synthesizing these themes, the article contributes to an ongoing discussion about responsible tech design, emphasizing user well-being, transparency, and ethical standards in the digital space.
The growing prevalence of cyberbullying on online social platforms has highlighted the need for effective detection and mitigation strategies. This study introduces a comprehensive approach to preprocessing and analyzing Twitter data to identify instances of cyberbullying. The research begins with loading a dataset of tweets labeled as positive or negative and conducting exploratory data analysis to understand sentiment distribution. The text data is then preprocessed through steps such as noise removal (eliminating URLs, mentions, punctuation), stopword removal, and lemmatization, which enhances the quality of the dataset. The study also examines word and character count distributions to gain insights into tweet lengths. This methodology lays the groundwork for further investigation into the patterns of cyberbullying, contributing to the creation of data-driven solutions to combat such behavior and fostering a safer online environment.
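The preprocessing steps described above can be sketched as follows. This is a minimal illustration: the stopword list and lemma map are tiny stand-ins, and a real pipeline would use a library such as NLTK or spaCy for both.

```python
import re

# Toy stopword list and lemma map for illustration only.
STOPWORDS = {"a", "an", "the", "is", "are", "was", "to", "and", "you"}
LEMMAS = {"bullying": "bully", "bullied": "bully", "losers": "loser"}

def preprocess(tweet: str) -> list[str]:
    """Noise removal -> stopword removal -> lemmatization."""
    text = tweet.lower()
    text = re.sub(r"https?://\S+", " ", text)  # strip URLs
    text = re.sub(r"@\w+", " ", text)          # strip mentions
    text = re.sub(r"[^a-z\s]", " ", text)      # strip punctuation and digits
    tokens = [t for t in text.split() if t not in STOPWORDS]
    return [LEMMAS.get(t, t) for t in tokens]

print(preprocess("@troll You are losers!! https://t.co/x"))  # -> ['loser']
```

The cleaned token lists would then feed the word/character count analysis and any downstream classifier.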
The increased time spent by users in virtual environments and the influence of AI-driven choice architectures have raised concerns about how AI systems can subtly persuade users' actions. This issue is examined through the concept of "dark patterns," which are interface designs that manipulate user behavior in ways that may not be beneficial to them. Although regulatory measures have been introduced in various regions, including India, the article argues that these regulations are insufficient to address AI-powered dark patterns. Such patterns operate at a deeper behavioral level, making users' choices appear uninfluenced by manipulation. The article concludes by advocating for a more comprehensive approach to tackle the persuasive tactics employed by AI systems.
The California Consumer Privacy Act (CCPA) mandates that businesses offer consumers a clear way to opt out of the sale and sharing of their personal information. However, many businesses leverage control over the opt-out process to impose obstacles on consumers, including the use of dark patterns. The enactment of the California Privacy Rights Act (CPRA) aims to strengthen the CCPA and explicitly prohibits certain dark patterns in these processes. Despite these regulatory efforts, research shows that websites continue to employ a variety of dark patterns, some of which exploit legal loopholes, highlighting the need for further action by policymakers.
This book examines the evolution of legal design, which uses design methods to make legal systems more accessible. Initially focused on problem-solving, legal design now incorporates speculative design, proactive law, and insights from fields like cognitive science and philosophy. Featuring twelve essays from the 2023 Legal Design Roundtable, the book offers diverse perspectives from academics and professionals, exploring new approaches and practical applications. It’s a valuable resource for those interested in innovative, human-centered approaches to law.
Dark patterns are deceptive design practices that impair users' ability to make autonomous and informed decisions, as defined by the European Digital Services Act (DSA). These patterns manipulate users into actions that benefit service providers, such as accepting disadvantageous terms, making unwanted financial transactions, or disclosing personal information. Despite increased regulatory attention and scholarly interest, dark patterns continue to proliferate across digital platforms. This thesis examines dark patterns across various digital contexts, including web and mobile interfaces, IoT devices, and social robots, combining human-computer interaction studies with legal analysis to identify opportunities for mitigating their negative impact.
Technological progress has led to the blending of market techniques from various economic sectors, resulting in the use of diverse rules and instruments and the emergence of novel market practices. This disruptive innovation, especially evident in the crypto-asset market, challenges the application and interpretation of existing rules against market malpractice. After analyzing digital market practices like gamification and dark patterns, this thesis examines the compatibility and adaptability of rules on unfair commercial practices within the evolving digital landscape of the crypto market.
This thesis explores how dark patterns in cookie consent dialogues influence user behavior. An experiment using eye tracking revealed that while wording on button labels significantly affected task completion times, visual design changes did not. Participants generally read from left to right, but individual habits played a larger role in their interactions, indicating that established design conventions are more influential than minor visual tweaks in guiding user decisions.
This study examines the hidden costs and benefits of zero-price digital services, highlighting that while users derive significant value from "free" apps, they also face challenges like procrastination, sleep deprivation, and reduced focus. Based on survey data from 196 participants in Linköping, Sweden, the research reveals a growing consumer preference for paid services over free ones, suggesting a shift in attitudes towards digital payment models. The findings underscore the need for greater corporate transparency and user awareness about non-monetary costs, as well as a balanced approach to user protection and innovation in the digital economy.
In the world of online shopping, "dark patterns" are deceptive design tactics that manipulate consumers into unintended actions like purchases or subscriptions. This study examines their prevalence on fashion websites, finding that 78.4% use these tactics, with Nagging and Limited-time Messages being the most common. Fashion sites use more visual dark patterns than general e-commerce sites, and there's a link between these tactics and website popularity. While user reactions to dark patterns vary, prior trust in a brand can reduce their negative impact.
This chapter explores the challenges posed by data-driven technologies to consumer consent, particularly in the context of manipulative practices known as dark patterns. It begins by examining various dark design strategies that exploit consumer vulnerabilities, especially cognitive biases. The chapter then critiques the limitations of the European information-based approach, which forms the foundation of data protection and consumer law, arguing that current regulations are ill-equipped to address digital vulnerabilities. The analysis highlights the inadequacy of a "one-size-fits-all" model like the GDPR and advocates for a more holistic approach to consumer consent protection. It suggests that integrating legal considerations into the design phase of technological architectures could enhance the protection of consumers' authentic choices and prevent manipulative practices.
This study examines the impact of "dark patterns," interface designs that nudge consumers into sharing data under regulations like the GDPR. A field experiment shows that, even without dark patterns, consumers accept cookies more than half the time. Dark patterns, especially those hiding consent options behind extra clicks, significantly influence choices. Larger, well-known firms see slightly higher consent rates, but site popularity doesn't affect the impact of dark patterns. The study also finds no evidence of choice fatigue from repeated pop-ups.
"Dark patterns," deceptive designs that lead users to benefit service providers, are common in digital marketing. While deceived users often face financial or time costs, non-deceived users—those who recognize and avoid these patterns—may also experience stress and frustration due to the extra effort required. This study focuses on these non-deceived users, exploring how the effort to avoid dark patterns can negatively impact usability and potentially erode trust in service providers.
Advancements in Mixed Reality (MR) technology have made it more accessible to consumers, but this has also led to an increase in dark patterns—manipulative design tactics that deceive users. This research examines these tactics in MR environments, analyzing 80 applications and identifying five key dark patterns: Hidden Costs, Misinformation, Button Camouflage, Forced Continuity, and Disguised Ads. The study highlights the harmful impact of these patterns on user trust and decision-making.
Although numerous Human-Computer Interaction (HCI) studies have empirically investigated the harms caused by dark patterns, and policymakers and regulators acknowledge these harms as significant, they have yet to be thoroughly examined from a legal perspective. This paper addresses this gap by identifying and analyzing the harms associated with dark patterns (DP), focusing on their role in the emerging European 'dark patterns acquis'. The paper organizes existing knowledge on dark pattern harms from HCI research and proposes a taxonomy of these harms. It also bridges the discussion of dark pattern harms in HCI with the legal frameworks for assessing harms under European data protection, consumer law, and competition law.
This research highlights that the growing number and diversity of dark patterns means that each specific form, and its legal assessment, must be evaluated based on how it is used and the intentions behind its use.
This book contains the refereed proceedings of the 12th Annual Privacy Forum on Privacy Technologies and Policy (APF 2024), held in Karlstad, Sweden, on September 4–5, 2024. The conference featured 12 full papers, carefully selected from 60 submissions, and aimed to bring together experts from policy, academia, and industry to discuss privacy and data protection. The 2024 conference particularly focused on the General Data Protection Regulation (GDPR) and emerging legislation around European Data Spaces and Artificial Intelligence. Chapters 3, 9, and 12 are licensed under the Creative Commons Attribution 4.0 International License, as detailed in the respective chapters.
As video games have become a mainstream form of entertainment, they have sparked new media concerns, including gaming addiction, screen time effects, gambling-like mechanics, dark patterns, and online toxicity. Additionally, issues like harassment, discrimination, and poor working conditions in the gaming industry are gaining attention. To address these concerns, the first Ethical Games Conference in 2024 brought together research on ethical issues in gaming, aiming to create evidence-based guidelines for the industry and regulators. This special issue features selected papers and opinion pieces, highlighting challenges and exploring how games can be leveraged for positive social impact.
"Light Up" is an educational game designed to expose the prevalence of UX dark patterns on websites. It simulates real-world scenarios where users are manipulated into actions they might avoid, helping players identify and understand these deceptive tactics. The game aims to raise awareness of the harm caused by dark patterns and empower users to resist them, protecting their privacy, finances, and autonomy online.
Digital nudging has gained prominence as a research topic in information systems, typically viewed as a positive engagement strategy. However, this paper critically examines how digital nudging can offend users' dignity. Using CARE theory, which suggests people react negatively to dignity affronts, the study analyzes 42 interviews from a three-month data collection involving a mobile app with daily digital nudges. The findings highlight that digital nudges can provoke forfeit, flight, or fight responses, and may even become dark patterns under certain conditions, despite responsible design. The paper contributes theoretically by conceptualizing digital nudging, offers empirical insights into its dual nature, and provides practical design guidelines to avoid dignity affronts.
As digital interfaces grow more prevalent, ethical concerns, particularly around dark patterns—manipulative design tactics used to influence user behavior—have become a critical area of study. This research introduces the Dark Pattern Analysis Framework (DPAF), which offers a taxonomy of 64 dark patterns. Current detection tools and datasets only cover 50% of these patterns, revealing significant gaps. The findings underscore the need for improvements in the classification and detection of dark patterns, offering key insights for future research.
The widespread use of manipulative designs, or dark patterns, in everyday applications and their impact on users is raising concerns among policymakers and scholars. These designs employ techniques that nudge users into making decisions they might not choose if fully informed, causing various types of harm. The integration of these mechanisms with other platform features makes it difficult for users to recognize the manipulation. Understanding the effects of manipulative designs is crucial for developing protective countermeasures, but researchers face significant methodological challenges. Investigating the impact of manipulative designs is complicated by the fact that users often do not perceive the manipulation. This paper reflects on these challenges through three case studies, highlighting key issues and providing methodological insights for the empirical study of manipulative designs.
Nowadays, websites commonly use two concepts to influence user behavior: deceptive patterns and nudges. In the literature, these concepts are distinguished by their goals and effects—deceptive patterns manipulate users, while nudges encourage better decision-making. However, from a technical perspective, it is unclear if they differ in their implementation. This paper presents a methodology developed to determine whether it is possible to automatically differentiate between deceptive patterns and nudges when crawling a web page. Our findings suggest that there is no need to distinguish between the two concepts, as they are implemented using the same techniques.
Navigating the web has become increasingly difficult for users due to manipulative UI design patterns known as "dark patterns," which lead users to act against their best interests. These tactics are prevalent, yet users remain largely unaware of them. Existing detection methods, including machine learning algorithms, struggle to generalize across all dark patterns due to their varied definitions and implementations. This paper proposes crowdsourcing as a solution to detect and flag dark patterns. Crowdsourcing leverages users' collective experiences to identify these manipulative designs more effectively. The authors introduce Neighborhood Watch, a Chrome extension that allows users to tag dark patterns on websites and view tags submitted by others. This system promotes more conscientious browsing and reduces susceptibility to dark patterns. Despite some limitations, the study concludes that crowdsourcing can effectively protect users from manipulative interfaces.
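At minimum, a crowdsourced tagger like the one described needs a shared store that aggregates user tags per page element and surfaces those with enough agreement. A minimal sketch, with the caveat that the class, method names, and vote threshold are illustrative assumptions rather than Neighborhood Watch's actual design:

```python
from collections import defaultdict

class TagStore:
    """Server-side aggregate of user-submitted dark pattern tags."""

    def __init__(self):
        # (page_url, css_selector) -> {pattern_type: vote_count}
        self._tags = defaultdict(lambda: defaultdict(int))

    def submit(self, url: str, selector: str, pattern: str) -> None:
        """Record one user's tag of a page element as a dark pattern."""
        self._tags[(url, selector)][pattern] += 1

    def lookup(self, url: str, min_votes: int = 2) -> dict:
        """Return tags on a page that enough users agree on."""
        return {sel: dict(votes)
                for (u, sel), votes in self._tags.items()
                if u == url and max(votes.values()) >= min_votes}

store = TagStore()
store.submit("shop.example", "#timer", "urgency")
store.submit("shop.example", "#timer", "urgency")
print(store.lookup("shop.example"))  # -> {'#timer': {'urgency': 2}}
```

The vote threshold is one way to damp the noise and abuse problems that any crowdsourced flagging system must handle.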
Dark patterns in user interfaces have attracted global attention from various disciplines. This study highlights a meta-level issue: demographic biases in dark pattern research. It examines the origins of published research and participant demographics, revealing a bias favoring English-speaking North America and Europe. Addressing these biases is crucial for ensuring inclusivity and rigor in the field.
Commercial health apps have become more accessible and popular, serving purposes such as enhancing health literacy, enabling continuous health tracking, and facilitating community engagement. However, concerns have arisen about the privacy, commodification, and exploitation of data generated by these apps. Less is known about deceptive design patterns and coercive practices from the users' perspective. This study uses pregnancy tracking apps as a case study and presents preliminary findings on user experiences. We argue that health apps require a nuanced consideration of deceptive design practices because (1) these patterns can uniquely intersect with users' vulnerabilities in the health context, and (2) the implications can extend beyond financial losses and privacy invasion, impacting users' health and well-being.
Privacy dark patterns are design tactics used by online services to reduce users' online privacy. These patterns either facilitate institutional data collection or increase others' access to personal data. This study examines how social networking sites popular with teens—Snapchat, TikTok, Instagram, Twitter, and Discord—use these tactics to steer users into reducing their social privacy. We analyzed recordings of account registrations, settings configurations, and logins/logouts for each SNS. Our content analysis identified two major dark pattern types—Obstruction and Obfuscation—and seven subtypes. We discuss why social media companies promote social sharing through design and the challenges of regulating these privacy dark patterns.
The use of persuasive designs to influence user behavior is now ubiquitous in digital contexts, giving rise to ethically questionable practices known as 'dark patterns.' While various taxonomies of dark patterns exist, there is a lack of frameworks that address how these designs are embedded not only in user interfaces but also in the functionality and strategy of digital systems. This paper proposes a framework for a Layered Analysis of Persuasive Designs, grounded in Garrett’s five-layer model of user experience (UX) design and Fogg’s Behavior Design Model. The framework identifies a toolkit of 48 design elements that can be used to operationalize problematic persuasion in digital contexts, highlighting the autonomy impact of each element. This framework aims to assist designers and policymakers in identifying and evaluating (potential) dark patterns within digital systems from an autonomy perspective.
The issue of Dark Patterns, or "Deceptive Design," is gaining recognition in literature. However, their widespread presence across various domains complicates interdisciplinary communication and collaboration. Existing taxonomies of these patterns often overlap and address them at different levels of abstraction, hindering cross-domain discourse. This is problematic given the growing evidence of the adverse effects of such designs on users. Additionally, the fine line between manipulative dark patterns and intuitive, protective, and defensive interface designs further complicates the issue. Current taxonomies primarily define patterns but struggle to distinguish between manipulative and benevolent implementations in specific contexts. This work proposes a method to differentiate between these applications by analyzing previously identified patterns for their properties, consequences, and contexts of application. This paper presents our progress toward creating a taxonomy-independent evaluation process for identifying and describing Dark Patterns.
Regulatory responses to dark patterns often depend on expert evaluations of design interfaces to determine if users are being manipulated or deceived. This article unpacks expert assessments of dark patterns used to solicit user consent and argues that regulatory actions should explicitly address whose expertise is being consulted. It concludes by discussing the value of deliberative mechanisms in broadening the range of both experts and expertise modes for identifying, evaluating, and regulating dark patterns.
Dark patterns refer to design practices that trick or manipulate users into making certain choices; an estimated one in four internet users encounters them. This paper examines key guidelines issued by government commissions or authorities worldwide, including those from the United States, South Korea, India, the European Union, California, Australia, the United Kingdom, Kenya, and Argentina. A comparative analysis of these guidelines highlights national standards, types of dark patterns, and adherence norms. The study reveals minimal enforcement efforts by the relevant authorities to counter dark patterns. It advocates for a global collaboration to establish universal guidelines against dark patterns, overseen by an international authority or commission.
As of January 2024, 5.35 billion people, or 66.2 percent of the world's population, are online. With attention spans reduced to 8 seconds, digital businesses struggle to acquire, engage, and retain users. Many companies use dark patterns—deceptive UI elements that trick users into actions like signing up for services or making purchases. This thesis investigates various dark patterns, such as roach motel, malicious nudging, urgency/scarcity, bait and switch, and confirm-shaming, categorized into “pressure” and “trickery” tactics. While these methods help companies meet business goals, they undermine user trust. To combat this, the thesis proposes developing a Chrome extension to detect and highlight scarcity and urgency dark patterns. This tool aims to raise user awareness and promote a more transparent internet experience.
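The detection approach proposed in the thesis above can be illustrated with a minimal sketch. The cue lists and function below are hypothetical illustrations, not the thesis's actual implementation: a real extension would run logic like this in a content script over visible page text, with cue lists curated from observed dark-pattern instances.

```python
import re

# Hypothetical cue lists for illustration only; a production tool would
# curate these from a corpus of observed urgency/scarcity dark patterns.
URGENCY_CUES = [
    r"\bhurry\b", r"\bonly \d+ left\b", r"\bends (today|soon)\b",
    r"\blimited time\b", r"\bact (now|fast)\b", r"\bsale ends\b",
]
SCARCITY_CUES = [
    r"\b\d+ (people|others) (are )?(viewing|looking at)\b",
    r"\bin high demand\b", r"\balmost (gone|sold out)\b",
    r"\blow stock\b",
]

def flag_dark_pattern_text(text: str) -> dict:
    """Report which urgency/scarcity cues a block of page text matches."""
    lowered = text.lower()
    hits = {
        "urgency": [p for p in URGENCY_CUES if re.search(p, lowered)],
        "scarcity": [p for p in SCARCITY_CUES if re.search(p, lowered)],
    }
    # Flag the text if any cue from either category matched.
    hits["flagged"] = bool(hits["urgency"] or hits["scarcity"])
    return hits
```

A browser extension would then highlight the matching DOM elements rather than merely returning matches; keyword matching of this kind trades recall for simplicity, which is why research systems add visual and contextual signals on top.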
Internet users are constantly bombarded with digital nudges like defaults, friction, and reinforcement. When these nudges lack transparency, are not optional, or do not benefit the user, they become 'dark patterns', categorized under the acronym FORCES (Frame, Obstruct, Ruse, Compel, Entangle, Seduce). Psychological principles like negativity bias, the curiosity gap, and fluency are exploited to make social content viral, while covert tactics such as astroturfing, meta-nudging, and inoculation are used to create false consensus. The power of these techniques is poised to grow with advances in predictive algorithms, generative AI, and virtual reality. Although digital nudges can be used altruistically to protect against manipulation, their effectiveness remains inconsistent.
As technology advances, the regulation of dark pattern practices has become crucial. These deceptive tactics manipulate users into unfavorable actions, such as making it hard to unsubscribe from services or visually emphasizing consent buttons to undermine transparency. The European Union has responded with several legislative acts, including the GDPR, the Digital Markets Act, and the Digital Services Act. However, these regulations often overlap, creating ambiguities and redundancies. This article examines these challenges and proposes solutions, such as harmonization and centralization, to streamline the regulatory framework. The goal is to protect users from manipulative practices and ensure informed decision-making in the digital realm.
This cumulative thesis investigates the intentions behind digital interfaces, focusing on exploitative "dark patterns" that manipulate user behavior. While existing HCI research has identified various dark patterns, this work synthesizes findings to develop comprehensive frameworks and tools for better understanding and mitigating their effects. Through qualitative and quantitative studies, the thesis explores dark patterns in social networking sites (SNS), examines user perceptions and challenges in recognizing these patterns, and contributes to design theory by identifying where dark patterns manifest. The Responsible Design Triangle model is introduced, highlighting the interdependencies between design, user behavior, and guidelines, to promote ethical digital interface design.
This research investigates the dark patterns users encounter when subscribing to or unsubscribing from online services. While previous studies have described dark patterns in digital contexts, this study provides a detailed analysis of such patterns in online subscriptions. By examining ten case studies of sign-up and cancellation processes on streaming platforms and software services, the research identifies deceptive designs and asymmetric efforts through user flow data and visual artifacts. The findings highlight the prevalence and complexity of dark patterns, offering insights for stakeholders, future design standards, and policy recommendations to improve consumer protection in online subscriptions.
This paper examines the role of dark patterns within TikTok, a rapidly growing social media platform. Utilizing principles from behavioral economics and the existing literature on online choice architecture (OCA), the study investigates how TikTok employs dark patterns to engage users and explores the implications for data protection, algorithmic practices, and market dynamics. The paper uses the "walkthrough method" to conduct a case study, detailing the TikTok user experience and identifying potential dark patterns used by the app. It discusses the challenges in distinguishing dark patterns from legitimate commercial practices, especially considering the user impact. Additionally, the paper examines how dark patterns and OCA are addressed in the Latin American legal landscape and proposes next steps for the ongoing debate based on the study's findings.
Over the past two decades, the focus of design patterns has shifted from encouraging best practices to discouraging harmful ones. Dark and deceptive UX patterns that monetize engagement while perpetuating structural inequities are now prevalent. This study uses a visual case study of a childcare worker platform to critically examine these patterns. Through Care Layering, a form of critical documentation, the study highlights how UX patterns, when viewed as culturally-situated resources, reveal both limitations and opportunities in gig work platform engagement. The discussion emphasizes how Care Layering can help designers achieve greater accountability in UX design.
This research paper explores the widespread use of dark patterns in UI and UX design, uncovering the ethical implications of these manipulative practices. Dark patterns are deceptive design elements that influence user behavior for the benefit of designers or third parties. By examining various examples and their impact on user decision-making, the paper emphasizes the importance of recognizing and understanding these patterns to make informed choices in the digital landscape.
Current online contract practices often involve situations where parties do not understand their rights and obligations under these contracts. This article examines and discusses how complex online contracts complicate and sometimes impede people from making strategic, autonomous decisions. It also addresses how legal design approaches can shed light on complexity and help tackle dark patterns in online contracting, thereby reducing transaction costs and increasing legal quality, business sustainability, and competitive business advantage.
This summarises a roundtable on ongoing and emerging consumer risks associated with dark commercial patterns online organised as part of the 99th Session (Part 2) of the Committee on Consumer Policy (CCP) on 6 November 2020. It featured panellists from academia, consumer protection authorities, and a consumer organisation, the Norwegian Consumer Council. It begins with an overview of the main themes that emerged from the discussion, including examples and categories of dark commercial patterns and their defining attributes; evidence of their prevalence online; consumer vulnerability; and tools and approaches available to consumer protection authorities and policy makers to identify and mitigate them. It then provides details of the presentations by each of the panellists, before concluding with suggested next steps.
The white paper examines the ethical issues in popular apps (Android and iOS) used by adolescents, in categories including education, gaming, communication, social, and dating. Additional categories, including music and audio, entertainment, and movies & series, were covered in subsequent parts of the study. Ethical issues were also examined in other apps, such as Twitter, Reddit, Quora, and Google. In this white paper, the author also examines ethical issues associated with four key areas, namely privacy, age-appropriateness, human-in-the-loop, and user interface. In conclusion, ethical considerations in developing and deploying apps for children and adolescents are found to be necessary and cannot be undermined, considering mobile apps' influence on them.
The article reviews recent work on dark patterns and demonstrates that the literature does not reflect a singular concern or consistent definition, but rather a set of thematically related considerations. Drawing from scholarship in psychology, economics, ethics, philosophy, and law, the authors articulate a set of normative perspectives for analyzing dark patterns and their effects on individuals and society, and show how future research on dark patterns can go beyond subjective criticism of user interface designs and apply empirical methods grounded in normative perspectives.
The authors present automated techniques that enable experts to identify dark patterns on a large set of websites. Using these techniques, they study shopping websites, which often use dark patterns to influence users into making more purchases or disclosing more information than they would otherwise. They examine these dark patterns for deceptive practices and find 183 websites that engage in deception. They also uncover 22 third-party entities that offer dark patterns as a turnkey solution. Finally, they develop a taxonomy of dark pattern characteristics that describes their underlying influence and their potential harm to user decision-making. Based on these findings, they make recommendations for stakeholders, including researchers and regulators, to study, mitigate, and minimize the use of these patterns.
With the nascent rise of the voice intelligence industry, consumer engagement is evolving. The expected shift from navigating digital environments by a “click” of a mouse or a “touch” of a screen to “voice commands” has set digital platforms for a race to become leaders in voice-based services. The European Commission's inquiry into the consumer IoT sector revealed that the development of the market for general-purpose voice assistants is spearheaded by a handful of big technology companies, highlighting the concerns over the contestability and growing concentration in these markets. This article posits that voice assistants are uniquely positioned to engage in dynamically personalized steering – hypernudging – of consumers toward market outcomes. It examines hypernudging by voice assistants through the lens of abuse of dominance prohibition enshrined in article 102 TFEU, showcasing that advanced user influencing, such as hypernudging, could become a vehicle for engaging in a more subtle anticompetitive self-preferencing.
In this paper, the authors examine the extent to which common UI dark patterns can be automatically recognized in modern software applications. They introduce AIDUI, a novel automated approach that uses computer vision and natural language processing techniques to recognize a set of visual and textual cues in application screenshots that signify the presence of ten unique UI dark patterns, allowing for their detection, classification, and localization. To evaluate this approach, they constructed CONTEXTDP, the current largest dataset of fully-localized UI dark patterns that spans 175 mobile and 83 web UI screenshots containing 301 dark pattern instances. Overall, this work demonstrates the plausibility of developing tools to aid developers in recognizing and appropriately rectifying deceptive UI patterns.
“How does the end user perceive, experience, and respond to dark patterns?” This is the research question which drives this inquiry. The paper contributes to an increased awareness of the phenomenon of dark patterns by exploring how users perceive and experience them. The authors therefore chose a qualitative research approach based on focus groups and interviews. Their analysis shows that participants were moderately aware of these deceptive techniques, several of which were perceived as sneaky and dishonest. They further expressed a resigned attitude toward such techniques and primarily blamed businesses for their occurrence. Users also considered their dependency on services employing these practices, making it difficult to fully avoid dark patterns.
Many tech companies exploit psychological vulnerabilities to design digital interfaces that maximize the frequency and duration of user visits. Consequently, users often report feeling dissatisfied with time spent on such services. Prior work has developed typologies of damaging design patterns (or dark patterns) that contribute to financial and privacy harms, which has helped designers to resist these patterns and policymakers to regulate them. However, there is a missing collection of similar problematic patterns that lead to attentional harms. To close this gap, the authors conducted a systematic literature review for what they call ‘attention capture damaging patterns’ (ACDPs). They analyzed 43 papers to identify their characteristics, the psychological vulnerabilities they exploit, and their impact on digital wellbeing. They propose a definition of ACDPs and identify eleven common types, from Time Fog to Infinite Scroll. The typology offers technologists and policymakers a common reference to advocate, design, and regulate against attentional harms.
This article discusses the results of the authors’ two large-scale experiments in which representative samples of American consumers were exposed to dark patterns. The research also showed the susceptibility of certain groups, particularly less-educated consumers, to dark patterns, and identified the dark patterns that seem most likely to nudge consumers into making decisions that they are likely to regret or misunderstand. Hidden information, trick question, and obstruction strategies were shown to be particularly likely to manipulate.
This article analyzes the definition of dark patterns introduced by the California Privacy Rights Act (CPRA), the first legislation explicitly regulating dark patterns in the United States. The authors discuss the factors that make defining and regulating privacy-focused dark patterns challenging, review current regulatory approaches, consider the challenges of measuring and evaluating dark patterns, and provide recommendations for policymakers. They argue that California’s model offers the opportunity for the state to take a leadership role in regulating dark patterns generally and privacy dark patterns specifically, and that the CPRA’s definition of dark patterns, which relies on outcomes and avoids targeting issues of designer intent, presents a potential model for others to follow.
This submission assesses the extent to which extant and forthcoming legislation is equipped to address the pernicious practice of dark patterns through the prism of the digital fairness review.
In light of the regulation of manipulative interfaces by the United States, questions have been raised about the advisability of national or even European regulation of the exploitation of our cognitive biases by designers of digital interfaces. The article then examines the extent to which existing legislation regulates abusive practices, i.e. dark patterns which exploit cognitive biases. Finally, the author proposes that consideration could be given to a principle of purpose for capturing attention (in particular, the collection of attention for specific, explicit, and legitimate purposes, and the absence of further processing in a manner incompatible with the purposes initially intended).
The author in this paper takes an interesting stance relating to dark patterns and the subject of online privacy: that current online consent mechanisms do not permit data subjects to think, decide, and choose according to their internal beliefs, thereby impairing essential individual freedoms or capabilities. Cognitive limitations, information overload, information sufficiency, lack of intervenability, and lack of free choice are identified as major shortcomings of consent in privacy. Based on these findings, the author proposes a methodology to evaluate old or new design measures to improve consent and restore freedoms of thought, decision, and choice.
Dark patterns are common in everyday digital experiences, and they present a new challenge to emerging global privacy laws, particularly the European Union (EU) data protection framework and the General Data Protection Regulation (GDPR). The author contends that while there is an apparent lack of legal tools to deal with dark patterns, the current framework can be amended to identify and curb them, especially through a refinement of the requisites for lawfulness of consent and the reformulation of the fairness principle in data protection.
This article outlines and explores the limits of dark patterns, which it describes as the specific ethical phenomenon that supplants user value in favour of shareholder value. It also analyses the corpus of practitioner-defined dark patterns and determines the ethical concerns raised in these examples. Additionally, it identifies the examples which simply fall under a wide range of ethical issues raised by practitioners that were frequently conflated under the umbrella term of dark patterns. At the same time, the researchers acknowledge that UX designers may be complicit in these manipulative or unreasonably persuasive techniques, and conclude with implications for educating user experience designers and a proposal for broadening research on the ethics of user experience.
In two preregistered online experiments the authors investigated the effects of three common design nudges (default, aesthetic manipulation, obstruction) on users’ consent decisions and their perception of control over their personal data in these situations. In the first experiment (N = 228) they explored the effects of design nudges towards the privacy-unfriendly option (dark patterns) and in the second, they reversed the direction of the design nudges towards the privacy-friendly option, titled “bright patterns”. Through this and overall, findings suggest that many current implementations of cookie consent requests do not enable meaningful choices by internet users, and are thus not in line with the intention of the EU policymakers. They also explore how policymakers could address the problem.
In the past years, regulators around the world found a new focal point in addressing online information asymmetries. This focal point is dark patterns. As this inspired a widespread interest that brings together human-computer interaction, web measurement, data protection, consumer protection, competition law, and behavioral economics – to name a few relevant disciplines – the authors decided to focus this consumer update on this topic. With the help of Cristiana Santos, who is an expert in the conceptualization and detection of dark patterns as privacy violations, they have written a short summary of the research in the field, regulatory concerns, as well as brief critical reflections showing where more attention should be paid.
The research conducted for this study is significant because it shows that dark patterns are prevalent and increasingly used by traders of all sizes, not only large platforms. According to the mystery shopping exercise, 97% of the most popular websites and apps used by EU consumers deployed at least one dark pattern and, more importantly, dark patterns were rarely used in isolation: multiple patterns often featured on a single site. Altogether, it sheds light on the prevalence of dark patterns in the digital world and how a large number of internet players have a hand in this phenomenon.
The world seemingly becomes more consumer-friendly with each generation as businesses take more and more measures to ensure customers are left with a positive experience. Despite our technological advancement and understanding of human nature, it’s still common to see deception in digital products. To understand digital deception (and better define it), we surveyed 536 people to measure and discuss people’s understanding of this matter.
Default decisions are prevalent and influential in areas ranging from retirement program designs and organ donation policies to consumer choice. While past research has shown that these no-action defaults matter because of effort and implied endorsement, there is a dearth of research on reference dependence, i.e. how the default choice can serve as a reference for determining whether the other choices will be positively or negatively evaluated. In the article, the researchers demonstrate how reference dependence can increase the effectiveness of default decisions.
The concept of dark patterns is still ill-conceptualized in the Digital Services Act, the first EU legislative act to tackle dark patterns head-on. The author identifies several concerning aspects of the DSA's prohibition of dark patterns, notably its reference to manipulation as a source of consumer harm. ‘Manipulation’ is not defined in the regulation and is a new EU legal term. Many philosophers have reflected on the meaning of manipulation and how it manifests in digital environments, and the jury is still out on this question. This article assesses this and related shortcomings, as well as others in relation to the IMCO Committee's proposed amendment of the Unfair Commercial Practices Directive to better target dark patterns.
The central idea of this working paper is that issues of safety and fairness can no longer be regulated using consumer choice as the primary protection. Instead, consumers need a privacy law that stops harmful business practices before they cause significant harm. Two concepts are explored in this working paper to address both current and emerging data harms: duty of care or best-interests duty and the privacy safety regime. Borrowing concepts from product intervention powers and product safety interventions, the CPRC proposes options that would allow governments and regulators to stop or limit obviously harmful uses of data as well as a process for regulators to proactively restrict and test new harmful practices as they evolve. It concludes that the law needs to require more effort on the part of businesses to assess whether and how they collect, share, and use data that results in fair outcomes for their customers.
Deceptive designs range from those that are ubiquitous and frustrating for consumers to those that are misleading and deceptive and can lead to significant consumer harm. In light of this, this report considers the common types of dark patterns, the impact of dark patterns on consumers, and the next steps businesses, governments, and consumers can take to reduce harm.
This week on Legal Design Thinking IRL, Hannele Korhonen speaks with Marie Potel-Saville. They talk about the dark patterns of the internet, how to identify them and how she’s leveraging legal design to create fair patterns for users. They discuss what dark patterns are, how they affect our everyday lives, how to spot them, why they are bad for businesses, what laws and international standards govern dark patterns, what fair patterns are, how legal design can create fair patterns, why a human-centric approach to users, UX and their interactions with your business is crucial, and how to report dark patterns.
The purpose of this open letter is to help businesses understand and comply with their existing obligations under consumer protection law when making urgency claims (for example, countdown timers, scarcity or ‘act fast’ messages) and/or price reduction claims online. The letter includes examples where common claims made by online businesses to consumers during the shopping process may breach the law, for example by misleading consumers or by putting unfair pressure on them.
This paper discusses the current CMA thinking in this important area for its ongoing programme of competition and consumer enforcement. It also explains why online choice architecture is relevant to both consumer protection policy and competition policy and outlines a novel taxonomy of practices.
HCI research has extensively studied nudging user behaviour and how corporations have often used this as an avenue to influence user information disclosure. In this paper, the researchers test the effect of norm-shaping design patterns on information divulging behaviour. Primarily their findings indicate a key mechanism by which norm-shaping designs can change beliefs and subsequent disclosure behaviours.
In this paper, the authors present a novel solution expanding the Advanced Data Protection Control (ADPC) mechanism to bridge current gaps in user data and privacy control. Their solution moves the consent control to the browser interface to give users a seamless and hassle-free experience, while at the same time offering content providers a way to be legally compliant with legislation. Through an extensive review, they evaluate previous works and identify current gaps in user data control. They then present a blueprint for future implementation and suggest features to support privacy control online for users globally.
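The core idea behind browser-mediated consent control can be sketched in a few lines. The header name and 'purpose=decision' grammar below are simplified stand-ins for illustration only, not the actual ADPC wire format: the point is that the browser transmits the user's standing decisions once, and a compliant server consults them instead of showing a consent pop-up.

```python
def parse_consent_header(header_value: str) -> dict:
    """Parse a comma-separated 'purpose=decision' consent header
    (a hypothetical, simplified format) into {purpose: granted?}."""
    decisions = {}
    for part in header_value.split(","):
        part = part.strip()
        if "=" in part:
            purpose, decision = part.split("=", 1)
            decisions[purpose.strip()] = decision.strip() == "granted"
    return decisions

def may_process(decisions: dict, purpose: str) -> bool:
    """A compliant server processes data only for granted purposes;
    an absent purpose defaults to no consent."""
    return decisions.get(purpose, False)
```

Moving the decision into a machine-readable signal like this is what removes the incentive to design manipulative banner interfaces: there is no per-site dialog left to manipulate.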
Addressing the root causes of (un)sustainability entails fundamentally changing our ways of living. This requires going beyond technology and behaviour-oriented approaches common under the umbrella of sustainable development (SD). More fundamental change is required to increase the possibility of realizing ecological and psychological well-being. Here, such change is conceptualized as ‘characterological change’. Next to SD another domain is introduced: characterological development (CD). The potential role of design-interventions in CD is explored in this article. Two studies were conducted, a literature study and experts interviews, covering the fields of Design for Sustainable Behaviour, Persuasive Technology, Practice-Oriented Design and Philosophy of Technology. The literature study shows that current research and interventions predominantly fall within the domain of SD, leaving character and related notions largely unaddressed.
Online services pervasively employ manipulative designs (i.e., dark patterns) to influence users to do different things. In this article, the researchers investigate whether users were aware of the presence of dark patterns and, if so, their ability to resist them. The researchers discover, however, that being aware does not equip users with the ability to oppose such influence. They further find that respondents, especially younger ones, often recognise the "darkness" of certain designs, but remain unsure of the actual harm they may suffer. Finally, they discuss a set of interventions (e.g., bright patterns, design frictions, training games, applications to expedite legal enforcement) in the light of their findings.
Online vendors often employ drip-pricing strategies where mandatory fees are displayed at a later stage in the purchase process than base prices. In this article, the researchers discovered after thorough analysis that disclosing fees upfront reduces both the quantity and quality of purchases. At the same time, detailed click-stream data analysed by the authors show that price shrouding makes price comparisons difficult and results in consumers spending more than they would otherwise.
This research reveals how dark patterns work, namely which vulnerabilities and biases they exploit. From a broader perspective, it would also allow readers to understand how techno-regulation (i.e. regulation through technology) can nowadays be used to influence individuals’ behaviour and autonomy through design. It examines the existing literature of dark patterns and acknowledges how dark patterns exploit biases, heuristics and vulnerabilities as well as the economic reasons behind dark patterns.
This paper focuses on persuasion and user autonomy education. With the rise of persuasive features in interactive systems which are aimed at increasing revenue, gathering user information and maximising user engagement, users’ autonomy has been argued to be an ethical concern within persuasive UX design. Thus, the researchers test a framework to educate design students on the ethics of persuasion from the perspective of user autonomy. Findings showed that following this, their critical attitudes towards persuasive design increased. Based on this among others, they propose future directions for integrating ethics into user experience design.
The Luring Test: AI and the engineering of consumer trust - FTC
Data Privacy and Consumer Protection Practices of Automated Mental Health, Wellbeing and Mindfulness Apps - Centre for AI and Digital Ethics | University of Melbourne
In this Nobel Prize Summit, a workshop on deceptive design is hosted by Transatlantic Consumer Dialogue and the Electronic Privacy Information Center, and the Minderoo Centre for Technology and Democracy, University of Cambridge. This workshop of experts aims to explore how lawmakers and regulators on both sides of the Atlantic are tackling the issue of deceptive design and explore how lessons learned on both shores can help provide global regulatory solutions. Several notable speakers will take part in the workshop, including Harry Brignull, who first coined the term "Dark Pattern" in 2010.
The DITP's Behavioral Sciences department is working with the CNIL to objectivize the impact of the design of cookie banners, the pop-ups that appear when a site is accessed. The challenge: to ensure that citizens' free will is respected, and that regulations on personal data protection are effective. To shed light on the effects of dark patterns on users, studies are based on behavioral sciences. A study involving over 4,000 people was carried out on cookie banners, and the results of this experiment "confirm the considerable impact of banner design on the choices made by Internet users".
Critical designer, researcher and award-winning artist Caroline Sinders has created a playful website to show how companies deliberately make it difficult to unsubscribe from their services. She conducted her experiment on 16 well-known online services, including Amazon, Google, Netflix and the New York Times. A total of 20 different dark patterns were identified across all the applications tested, with impressive consequences: $330 lost and almost 1 hour spent trying to unsubscribe.
Dark patterns and sludge audits: an integrated approach - Cambridge University Press
The National Assembly and the Korean government are actively addressing the issue of “dark patterns” in the online space. Momentum to regulate such deceptive online practices is anticipated to only grow in the coming months.
Two professors from Northeastern University plan to conduct a study at Northwestern on dark patterns that are present in AI-enabled consumer experiences.
noyb filed three complaints against Fitbit in Austria, the Netherlands and Italy. The popular health and fitness company, acquired by Google in 2021, forces new users of its app to consent to data transfers outside the EU.
Discover the factors driving the integration of dark patterns in Conversational Agents (CAs). The qualitative study reveals six key drivers, shedding light on the subtle tactics that impact user autonomy in digital interactions.