
What you should know about the Digital Services Act

Published on August 26, 2024


The European Union's (EU) commitment to safeguarding its citizens' digital rights took a monumental leap forward with the introduction of the Digital Services Act (DSA). The DSA began applying to the largest online platforms on August 25, 2023, and has applied to all platforms since February 17, 2024. This legislative framework, also known as the EU Digital Services Act, is redefining the way digital platforms operate across Europe.

The DSA regulates online intermediaries and platforms such as marketplaces, social networks, content-sharing platforms, app stores, and online travel and accommodation platforms.

At its core, the DSA aims to create a safer, more transparent online environment by holding platforms accountable for the content they host and the services they provide. Its main goal is to prevent illegal and harmful activities online and the spread of disinformation.

One of the most significant aspects of this regulation is its stringent stance against dark patterns, which are deceptive design practices that manipulate users into making decisions that benefit the service provider, often at the user's expense.

Since February 17, 2024, the DSA rules have applied to all platforms, so businesses, consumers, and digital platforms alike need to understand its implications. This article delves into the key provisions of the DSA, with a particular focus on its prohibition of dark patterns, and examines the legal actions already initiated by the European Commission against major digital players for violating these rules.

Understanding the EU Digital Services Act

The DSA is a comprehensive piece of legislation that aims to modernize the legal framework governing digital services within the European Union. It was introduced as part of the broader Digital Services Package, which also includes the Digital Markets Act (DMA). Together, these laws seek to create a safer and fairer digital space for users across Europe.

Objectives of the Digital Services Act

The DSA is built on several key objectives:

  1. Ensuring Safety and Accountability: The DSA imposes strict obligations on digital platforms to remove illegal content swiftly and effectively. This includes content that incites violence, hate speech, disinformation, and other forms of illegal activity.
  2. Enhancing Transparency: Platforms must provide clear information about their content moderation policies, algorithms, and advertising practices. Users should be aware of why they are being shown certain content or ads and how their data is being used.
  3. Protecting Fundamental Rights: The DSA aims to protect users' rights to privacy and freedom of expression. It also includes provisions to safeguard children and other vulnerable groups from harmful content.
  4. Prohibiting Dark Patterns: Article 25 of the DSA specifically targets the use of dark patterns: “Providers of online platforms shall not design, organise or operate their online interfaces in a way that deceives or manipulates the recipients of their service or in a way that otherwise materially distorts or impairs the ability of the recipients of their service to make free and informed decisions.” Beyond the jargon, this is a clear prohibition of any design that significantly affects users’ ability to make free and informed choices.

Scope of the Digital Services Act EU

The EU Digital Services Act applies to a wide range of digital services, including:

  • Intermediary Services: Such as internet access providers and domain name registrars.
  • Hosting Services: Including cloud services and web hosting providers.
  • Online Platforms: Social media platforms, online marketplaces, and app stores.
  • Very Large Online Platforms (VLOPs): Platforms with a significant impact on the digital ecosystem, defined as those with more than 45 million monthly users in the EU, such as AliExpress, Amazon, Apple, Aylo, Booking.com, Google, Infinite Styles (Shein), LinkedIn, and Meta.

The DSA imposes stricter obligations on VLOPs, given their size and influence.

Dark Patterns: A Key Focus of the Digital Services Act Europe

According to a 2022 European Commission study, 97% of the most popular websites and apps used by EU consumers deployed at least one dark pattern. These manipulative design practices exploit cognitive biases to steer users toward actions they might not have taken if fully informed. Common examples of dark patterns include:

  • Hidden Costs: Adding additional charges during the checkout process without clear disclosure.
  • Forced Continuity: Making it difficult for users to cancel subscriptions.
  • Bait and Switch: Promising one outcome but delivering another after the user has committed.
  • Privacy Zuckering: Tricking users into sharing more information than they intended by hiding privacy settings.

The Digital Services Act EU explicitly prohibits these practices, recognizing them as a significant threat to user autonomy and trust. The legislation mandates that digital platforms design their services in a way that is clear, transparent, and fair. This means that users should be able to make informed decisions without being subjected to manipulative tactics.
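
To make the first of these patterns concrete, here is a minimal TypeScript sketch contrasting a checkout that quietly appends mandatory fees at the final step with one that itemises every charge before the user is asked to confirm. The labels, fee amounts, and function names are illustrative assumptions invented for this example; they are not drawn from any real platform or from the text of the DSA.

```typescript
// Illustrative only: item labels, fee amounts, and types are invented for this sketch.

interface LineItem {
  label: string;
  amountCents: number;
}

// Dark pattern ("hidden costs"): the advertised price omits mandatory fees,
// which are silently appended at the last checkout step.
function hiddenCostCheckout(advertisedCents: number): LineItem[] {
  return [
    { label: "Advertised price", amountCents: advertisedCents },
    // Mandatory fees only revealed after the user has committed to the purchase.
    { label: "Service fee", amountCents: 499 },
    { label: "Processing fee", amountCents: 250 },
  ];
}

// Fairer alternative: every mandatory charge is itemised and totalled upfront,
// so the first price the user sees is the price they pay.
function transparentCheckout(baseCents: number, mandatoryFees: LineItem[]): LineItem[] {
  const items: LineItem[] = [{ label: "Price", amountCents: baseCents }, ...mandatoryFees];
  const total = items.reduce((sum, item) => sum + item.amountCents, 0);
  return [...items, { label: "Total (shown before confirmation)", amountCents: total }];
}

console.log(hiddenCostCheckout(1999));
console.log(transparentCheckout(1999, [{ label: "Booking fee", amountCents: 499 }]));
```

The difference is not the arithmetic but the moment of disclosure: the same information is surfaced before, rather than after, the user commits.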

Legal Actions Against Dark Patterns Under the Digital Services Act

Since the DSA's introduction, the European Commission has wasted no time in taking action against digital platforms that continue to employ dark patterns. Several high-profile legal proceedings have been launched, sending a clear message that the EU is serious about enforcing these new rules.

Case Study 1: The Battle Against "Blue Checks" on X

One of the first legal actions under the Digital Services Act Europe targeted X: the European Commission opened formal proceedings against the platform on December 18, 2023. The proceedings focus on four main issues:

  • Dissemination of illegal content in the EU, notably in relation to the risk assessment and mitigation measures X has adopted to counter it, as well as the functioning of the "notice and action" mechanism for illegal content mandated by the DSA, in light of X's content moderation resources.
  • The effectiveness of measures taken to combat information manipulation on the platform, notably X's so-called "Community Notes" system in the EU and the related policies meant to mitigate risks to civic discourse and electoral processes.
  • The measures taken by X to increase the transparency of its platform. The investigation concerns suspected shortcomings in giving researchers access to X's publicly accessible data, as mandated by Article 40 of the DSA, as well as shortcomings in X's ads repository.
  • Suspected deceptive design of the user interface, in relation to checkmarks linked to certain subscription products: the blue checks that signal that a given user has a subscription to X Premium.

On July 12, 2024, the Commission sent X its preliminary findings that the platform had breached the DSA, and this time dark patterns were the first concern: the Commission noted that X designs and operates its interface for "verified accounts" with the "blue checkmark" in a way that does not correspond to industry practice and deceives users. Since anyone can subscribe to obtain such "verified" status, this negatively affects users' ability to make free and informed decisions about the authenticity of the accounts and the content they interact with. There is evidence of motivated malicious actors abusing "verified accounts" to deceive users.

The Commission also found that X does not comply with the required transparency on advertising, as it does not provide a searchable and reliable advertisement repository, but has instead put in place design features and access barriers that make the repository unfit for its transparency purpose towards users. In particular, the design does not allow for the required supervision and research into emerging risks brought about by the distribution of advertising online.

The third suspected breach relates to the fact that X's terms of service prohibit eligible researchers from independently accessing its public data, for instance by scraping.

If the Commission's preliminary views were ultimately confirmed, it would adopt a non-compliance decision finding that X is in breach of Articles 25, 39 and 40(12) of the DSA. Such a decision could entail fines of up to 6% of the provider's total worldwide annual turnover and order the provider to take measures to address the breach.

Case Study 2: Dark Patterns in Online Marketplaces Shein and Temu

Another notable case involves two leading online marketplaces: on June 28, 2024, the European Commission sent requests for information to Shein and Temu, asking each to detail the measures it has taken to comply with its DSA obligations.

The requests relate to six main issues:

  • Dark patterns: interfaces that deceive or manipulate users
  • The so-called "Notice and Action" mechanism, which allows users to report illegal products
  • The protection of minors
  • The transparency of recommender systems
  • The traceability of traders
  • Compliance by design

The two marketplaces had until July 12, 2024 to provide their answers; on the basis of their replies, the Commission could decide to open formal proceedings.

The Broader Impact of the Digital Services Act on the Digital Landscape

The Digital Services Act Europe is expected to have a profound impact on the digital landscape, affecting how platforms operate and how users interact with digital services. By prohibiting dark patterns and imposing strict transparency requirements, the DSA aims to create a digital environment that is safer, fairer, and more trustworthy.

Implications for Digital Platforms

For digital platforms, the DSA presents both challenges and opportunities. On one hand, platforms will need to invest in compliance measures to ensure they meet the new legal requirements. This may involve auditing their user interfaces to identify dark patterns, redesigning user interfaces to avoid manipulation and deception, updating content moderation policies, and providing more transparency around algorithms and data usage.

On the other hand, the DSA also offers an opportunity for platforms to build trust with their users. By prioritizing transparency and user control, platforms can differentiate themselves in a competitive market and attract users who value ethical practices.

Implications for Consumers

For consumers, the European Digital Services Act promises a safer and more transparent online experience. Users will have more control over their data, greater insight into how algorithms influence their online experience, and protection from manipulative practices like dark patterns. The DSA also enhances the protection of vulnerable groups, such as children, by requiring platforms to take additional measures to safeguard them from harmful content.

Implications for Businesses

Businesses that rely on digital platforms to reach customers will also be affected by the DSA. For example, online marketplaces will need to ensure that their pricing and advertising practices comply with the new rules. Similarly, businesses that use social media for marketing will need to be aware of how the DSA's provisions on transparency and dark patterns might impact their campaigns.

Overall, the Digital Services Act Europe is likely to lead to a more level playing field, where ethical business practices are rewarded, and consumers are better protected.

Preparing for the Future: What Businesses Should Do

With the Digital Services Act EU now fully in effect, businesses and digital platforms need to take proactive steps to ensure compliance. Here are some key actions that companies should consider:

  1. Conduct a Compliance Audit: Review your current practices to identify any areas that might violate the DSA's provisions, particularly those related to transparency and dark patterns. Good news: our dark patterns audit is up and running!
  2. Redesign User Interfaces: If your platform currently employs any dark patterns, it's essential to redesign these elements to align with the DSA's requirements. This might include simplifying the opt-out process for subscriptions, clearly disclosing all costs upfront, and making privacy settings more accessible, as sketched after this list. Wondering what a fair pattern might look like? Have a look at our fair pattern library and get in touch to have your own, easily implemented fair patterns on your sites and apps.
  3. Enhance Transparency: Provide users with clear information about how their data is being used, how content is moderated, and how algorithms influence what they see. This transparency can help build trust and ensure compliance with the DSA.
  4. Monitor Legal Developments: Stay informed about ongoing legal actions and rulings related to the DSA. Understanding how the law is being enforced can help you anticipate potential risks and adjust your practices accordingly.
  5. Engage with Regulators: If you're unsure about how certain aspects of the DSA apply to your business, consider engaging with regulators for guidance. This proactive approach can help you avoid potential legal issues down the line.
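
As a rough illustration of what the redesign in point 2 can look like in practice, the TypeScript sketch below models a consent dialog in which accepting and refusing are equally visible and equally easy, together with a simple symmetry check that an internal audit might run. The option labels, fields, and the notion of "steps required" are assumptions made for this example; the DSA does not prescribe any particular data structure or implementation.

```typescript
// Illustrative only: option labels and fields are invented for this sketch.

interface ChoiceOption {
  label: string;
  prominent: boolean;     // whether the button is visually highlighted
  stepsRequired: number;  // clicks needed to complete this choice
}

// A symmetric consent dialog: refusing is as prominent and as quick as accepting,
// the opposite of designs that bury the refusal behind extra screens or nagging.
function buildConsentDialog(): ChoiceOption[] {
  return [
    { label: "Accept all", prominent: true, stepsRequired: 1 },
    { label: "Reject all", prominent: true, stepsRequired: 1 },
    { label: "Customise settings", prominent: false, stepsRequired: 2 },
  ];
}

// A self-check a compliance audit might run: refusing must not be harder
// or less visible than accepting.
function isSymmetric(options: ChoiceOption[]): boolean {
  const accept = options.find(o => o.label === "Accept all");
  const reject = options.find(o => o.label === "Reject all");
  if (!accept || !reject) return false;
  return reject.prominent === accept.prominent
      && reject.stepsRequired <= accept.stepsRequired;
}

console.log(isSymmetric(buildConsentDialog())); // true: refusing is as easy as accepting
```

Checks like this do not replace a legal review, but encoding design expectations as simple tests makes it easier to catch regressions before they reach users.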

Conclusion: A New Era of Digital Responsibility

The Digital Services Act EU marks a new era of digital responsibility in Europe. By prohibiting dark patterns and enforcing strict transparency requirements, the DSA aims to create a digital ecosystem that prioritizes user rights and ethical practices. As the European Commission continues to take legal action against platforms that violate these rules, it's clear that the DSA will have a lasting impact on the digital landscape.

For businesses and digital platforms, the DSA presents both challenges and opportunities. While compliance will require significant effort, it also offers a chance to build trust with users and stand out in a competitive market. As the digital world continues to evolve, the European Digital Services Act will play a crucial role in shaping a safer, fairer, and more transparent online environment for everyone.

By staying informed and proactive, businesses can navigate this new regulatory landscape and contribute to a digital future that benefits all stakeholders.
