Podcast
5 min read

Gabriel Voisin and Alexandre Vuchot on "Are Dark Patterns Illicit?"

Published on
June 13, 2023

✅ Is there one single legal definition of dark patterns?

✅ What are the legal risks incurred by companies from the consumer law and data protection points of view?

✅ What is really changing with the Digital Services Act (DSA), the first regulation that explicitly prohibits dark patterns per se?

To answer these questions, Marie Potel-Saville welcomes Gabriel Voisin and Alexandre Vuchot, partners at Bird & Bird specializing respectively in data protection, cyber security & privacy, and consumer law.

You will also hear an excerpt of Tristan Harris talking about technological deception in the social media age before the US Congress on January 8, 2020.

To go further:

Have a question or a need for support? Go to fairpatterns.com or contact Marie on LinkedIn!

Transcript

Marie Potel-Saville 

Hello, and welcome everyone to a new episode of our podcast, Fighting Dark Patterns: Regain Your Free Will Online. As you probably know by now, after a couple of episodes, dark patterns are interfaces that deceive or manipulate you online to make you act against your preferences or even against your interests. A sadly classic example, as I'm sure you've seen everywhere when you navigate, is the big fat green button saying "I agree", on which you click without thinking and of course without reading, which means that you did not actually agree to anything. Today, we'll be diving into a daunting question, given that dark patterns are everywhere: are dark patterns illicit? I was actually having fun a couple of days ago, clicking on an article on dark patterns written by a very famous law firm, which is not the one our guests work for. Seeing this huge multinational law firm writing an article on dark patterns, I thought: oh, that's interesting, let's click on this article. And the very first thing I saw when clicking on it was their cookie banner, containing a huge dark pattern for the "I agree" button, just below the title of the article, which was about the very serious legal risks caused by dark patterns. So that was quite funny, actually. Anyway, that's just an anecdote to introduce our topic today. With me today are two amazing lawyers with whom I've been collaborating very nicely and very successfully for a number of years now. I have the great pleasure of welcoming Gabriel Voisin, who is a partner at Bird & Bird specializing in data protection, cyber security and privacy, along with Alexandre Vuchot, who is also a partner at Bird & Bird, at the Paris office, specializing in consumer law. So welcome to you both. It's a real pleasure to have you on our podcast today.

Gabriel Voisin 

Hi, everybody.

Marie Potel-Saville 

So I guess my first question is, according to you, as lawyers, as partners at Bird & Bird, what is the legal definition of dark patterns?

Alexandre Vuchot 

Maybe I can answer that question, Marie. Actually, there is not one single legal definition of a dark pattern. It covers a range of different practices, which fall under different sets of regulations. I can name a few EU regulations. For instance, the Unfair Commercial Practices Directive, the Consumer Rights Directive and the E-Commerce Directive try to define different sets of behaviours, rights and obligations which can be imposed on websites or platforms. They set the rules that apply or do not apply, what should be done or should not be done. So it can be protection for the consumer, or things which are forbidden to the site. If you take the Unfair Commercial Practices Directive, for instance, the definition could be "every manipulation of the behaviour", which can range from a misleading practice to an aggressive practice amounting to harassment or undue influence. So, depending on the severity of the behaviour, it can fall under different sets of regulation. The Consumer Rights Directive will provide warranties to the consumer, whereas the E-Commerce Directive will focus on ensuring that the consumer has pre-contractual information or a right of withdrawal, making sure that they have the right information and no hidden information. The Unfair Contract Terms Directive would declare some terms unenforceable where they distort the consent of the consumer. I can also name the Platform-to-Business Regulation, which sets out requirements about rankings and how different services should present ranking information to consumers. So, in a nutshell, it's very difficult to give one single legal definition; it's rather a set of different regulations, depending on the objective of each one, from data to consumer, competition, or even media regulation.

Marie Potel-Saville 

Sure, though I guess we can probably identify some commonalities in the definitions. To be honest, it's work that we've done internally in our R&D lab, because, as you very rightfully said, it's a combination of legal grounds, and therefore different definitions, which does not really help companies wondering whether they have any dark patterns on their apps or sites, nor consumers or just citizens. So perhaps it would be interesting to suggest a number of common points, just a couple of hints that we could give to our listeners: manipulation, deception, altering autonomy, altering the architecture of choice, and also ending up acting in a way that's different from your own preferences, or even against your interests. Does that make sense, Alexandre?

Alexandre Vuchot 

Absolutely. It's really about something that you wouldn't have done had this dark pattern not been put in place. So yes, there's really a distortion and an abuse of the situation. Absolutely.

Marie Potel-Saville 

Exactly. Thank you. Thanks a lot for that. So I guess, you know, quite naturally, the next question is: are dark patterns illicit? And if so, on which grounds exactly? And once we dive into the legal grounds, of course, then, you know, we're also wondering about the legal risks. So that's a question to both of you, of course.

Alexandre Vuchot 

Maybe I will start with the most obvious one: consumer law. From a consumer law perspective, there are different types of rights the consumer could use against the site. Also, regulators can impose fines for breaching the law. But more interestingly, it could also be used by competitors trying to take action against a site using dark patterns to get a competitive advantage. So there are really three angles of risk from a consumer law perspective: the consumers themselves, the regulators, whether the competition authority or the consumer authority, and the competitors. And the consequences can go from fines to damages, or simply rescission or termination of the contracts. I guess Gabriel will comment on data protection, which is obviously also a hot topic.

Gabriel Voisin 

Indeed, Alexandre. Because when there is processing of personal data, such as the name of an individual, then in Europe we have the so-called GDPR. And since the entry into force of this legislation back in 2018, there have been some quite tangible requirements that make dark patterns illicit. Let me take two examples. For instance, if you try to obtain the consent of the individual for the processing of their data, then you must make sure that this consent has been freely given, is specific, is informed and unambiguous. If you don't have that, because, for instance, you've developed a technique to try to seduce the user and get a sort of so-so consent out of the individual, then you fail those requirements and you fail the GDPR. Equally, under the same legislation, there is a requirement to make sure that you tell people how you're going to process their data. The information you provide to them must be concise, transparent, intelligible, and in clear and plain language.

Marie Potel-Saville 

Exactly. It's even a plain language requirement, which is rarely met in practice, right?

Gabriel Voisin 

Rarely, indeed, and that's why the GDPR paves the way for dark pattern enforcement very nicely when things are not done by the book. When you look at the total amount of fines issued by data protection authorities across Europe since the entry into force of the GDPR in 2018, we have now reached the billion-euro mark in cumulated fines.

Marie Potel-Saville 

It's so striking to see the number of provisions in the GDPR which directly prohibit dark patterns, even though they don't name dark patterns as such. If you take, for example, the overarching principle of fairness in Article 5 of the GDPR, there's obviously nothing fair in dark patterns. So we've got this very strong, overarching principle, and, as you very rightly said: where is the freely given, informed, specific consent when you actually responded to confirmshaming? And for the listeners who are not necessarily familiar with all the names of dark patterns, confirmshaming is a trick where the interface tries to influence your choice by labelling the button they don't want you to click in a way that will make you feel bad about clicking on it. So, very practically, you know, it's the button "Oh, no, I don't want to find my friends" or "I don't want any friends", and of course no one wants to feel stupid and click on that button. And of course, it's there to make sure you share ever more personal data. So yeah, Gabriel, I'm sure you've analyzed the various provisions in the GDPR which actually work against dark patterns. And beyond Article 5, there's also this clear and concise information requirement in Article 12, up to data protection by design, right?

Gabriel Voisin

Correct. Yes, it's really this idea behind the privacy by design approach: from the moment you create your new idea, the new website, the new app, you really put yourself in the shoes of the users of this idea, this tool, this app, and you try to make their environment, their experience, as privacy-friendly as possible, and you don't try to trick them. Because it's all about fairness. It's all about making sure that this is a win for you and a win for them.

Marie Potel-Saville 

Exactly, exactly. And I guess that fairness by design might have been slightly forgotten or overshadowed in the way interfaces have been created for a long time now. I guess it's a good time now to listen to someone who's extremely interesting when you think about the attention economy, or even, you know, surveillance capitalism, as Shoshana Zuboff famously put it in her amazing book. I'm talking about Tristan Harris, of course, who co-founded the Center for Humane Technology, and we found one of his statements to the US Congress. It's not new, but we find it absolutely significant and important. So we're going to listen to that and hear your reactions just afterwards.

Tristan Harris

I'm going to go off script. I come here because I'm incredibly concerned. I actually have a lifelong experience with deception and how technology influences people's minds. I was a magician as a kid, so I started off by seeing the world this way. And then I studied at a lab called the Stanford Persuasive Technology Lab, actually with the founders of Instagram. And so I know the culture of the people who build these products and the way that it's designed, intentionally, for mass deception. I think the thing I most want to respond to here is this: we've often framed these issues as, we've got a few bad apples, we've got these bad deepfakes, we've got to get them off the platform, we've got this bad content, we've got these bad bots. What I want to argue is we have dark infrastructure. This is now the infrastructure by which 2.7 billion people, a group bigger than the size of Christianity, make sense of the world. It's the information environment. And if private companies went along and built nuclear power plants all across the United States, and they started melting down, and they said, well, it's your responsibility to have hazmat suits and, you know, a radiation kit, that's essentially what we're experiencing now. The responsibility is being put on consumers, when in fact, if it's the infrastructure, it should be put on the people building that infrastructure.

Marie Potel-Saville 

So we just heard Tristan Harris explaining to the US Congress how some interfaces are actually designed intentionally for mass deception, and he even talks about a dark infrastructure where billions of people are actually manipulated online. Our take on this at Fairpatterns is that, precisely, it's not doomed to be this way. And we're very much inspired by Shoshana Zuboff, again, in her excellent book on surveillance capitalism, where she explains that when a number of digital services were invented, they were actually not meant to be intentionally deceptive or manipulative. They were meant for the people, by the people. And that's really what triggered us at Fairpatterns to create the concept of fair patterns, and also to try to help resolve the issue of manipulation online, because we do not believe that it's an intrinsic feature of digital. We do believe that it's possible to have a human-centric digital economy, where humans can actually thrive. Gabriel, what's your take on that? Are we dreamers?

Gabriel Voisin

No. I think we live in 2023 and, as you said, this quote from the speaker is very interesting, because it does suggest that, yes, platforms can be better. Does this mean at the same time that all platforms are bad? Not really. If we were to take an example, Wikipedia is a good example of the sort of positive platform, positive design, we want to see. Where, yes, of course, regulation has a word to say on the way the platform operates, but at the same time, we see the users of such a platform being given an opportunity to co-contribute and elaborate content and interfaces in the context of the Wikipedia community. So I think that is giving us hope, and is certainly a model that we'll be looking at.

Marie Potel-Saville 

This is so interesting that you're mentioning co-design and participatory design. I was actually lucky to take part in a panel at CPDP a couple of weeks ago in Brussels. The panel was organized by the ICO, by someone you know, I think, Clara Clark Nevola, and she had invited the OECD but also Google to discuss precisely, you know: have you tried asking? Have you tried including citizens in either policymaking or even product and service making? And again, the example you gave is a very good example of that. It's really possible to co-create, to co-design interfaces and services and even policy that really works for the users for which they are intended. So definitely some active optimism on this front. So back to our topic, which is very legalistic today, and that's perfectly fine. But what is really changing with the Digital Services Act, given that we have all these existing legal grounds? What's the relationship, so to speak, between the DSA, the GDPR and the UCPD? Which applies when there's a dark pattern at stake?

Gabriel Voisin 

Yes, Marie, very good question. This one is definitely the latest legislation on the books that we have now received from Brussels, and it forms part of the EU legislative framework. This legislation is gradually coming into force; it will apply fully in the course of 2024. But it will only apply to some organizations, as opposed to, for instance, the GDPR, which applies to every organization as soon as they process personal data. In the case of the Digital Services Act, we are looking at only a specific number of players, in particular the so-called online platforms, for which, under Article 25, we see for the first time an explicit prohibition on designing or organizing their online interfaces in a way that deceives or manipulates the users of such online platforms.

Marie Potel-Saville 

Yeah, so this is the very first prohibition of dark patterns per se in European law, right?

Gabriel Voisin 

Absolutely. Although, as you noted a bit earlier, here again we are not seeing the words "dark pattern" being explicitly used. There seems to be a reluctance from our lawmakers to use that bad word. But yes, it is clearly the idea of dark patterns that we try to go after via this Article 25. But only for certain actors, the online platforms, and only so long as the practice being singled out is not already regulated or dealt with via the Unfair Commercial Practices Directive from 2005 or the GDPR from 2018. So there will be some articulation to work out before we can say this applies.

Marie Potel-Saville 

Yeah, exactly. So basically, you know, if we're saying that the DSA only applies where the UCPD or the GDPR is not applicable, then it means that any dark pattern affecting consumers is not caught by the DSA, and any dark pattern that is purely privacy-related is already caught by the GDPR. So I was just, you know, trying to imagine what kind of dark pattern would actually be caught under the DSA?

Gabriel Voisin

Yeah, really good question, Marie. I think the answer here, to give an example, will probably be around cookies, because cookies are not directly regulated by the GDPR. They are instead regulated by the so-called ePrivacy Directive, which is not specifically called out as a carve-out in this DSA article, and therefore we could imagine that a practice involving cookies, but with no underlying processing of personal data, could be one of the practices caught by this prohibition in the DSA.

Marie Potel-Saville 

So interesting, thanks a lot. It actually makes me think that a couple of days ago I saw a decision by an Italian regulator using the words "dark pattern" expressly in a prohibition decision. So I guess, you know, the reluctance that you were mentioning earlier, the reluctance even to name dark patterns in regulation, it's a bit like Voldemort, perhaps, you know, judges and regulators are actually more active now and feel more empowered to resort to that concept. And for the listeners, of course, we'll put the reference to this Italian case along with the podcast when we publish it. We like to end our podcast with a very, very practical tip. So, in practice, what is the very practical tip that you would like to give to fight against dark patterns? And to whom would you like to give that tip?

Gabriel Voisin 

Well, if you can permit me to give two, I'll go for two.

Marie Potel-Saville 

Of course, we want 10! 

Gabriel Voisin 

Well, then let's start with two, which I think should be addressed to the developer community within corporations. So, number one, from a privacy perspective: try to ensure that when a user wants to make use of a privacy-friendly option available on your site, it is not more difficult than making a non-privacy-friendly choice. Because, you know, it's all about this balancing act and being fair on both ends of the pond. So that's number one.

Marie Potel-Saville 

I love it. And sorry, but just to be really clear: if it's possible for a developer to make one option easy to use, then there's really no technical hurdle to making the other one just as easy. So you're very right, both options should be equally easy, and certainly the privacy-friendly option shouldn't be more difficult, for sure. Yeah, great one, love it.

Gabriel Voisin 

And the second one is the idea of privacy by default. I think, unfortunately, the recent tragic developments we've had in the online space, for instance this teenager in France who took their own life because of bullying and harassment on social media, emphasize the importance of making sure that we, as adults, protect our kids. And to do that, we need to make sure that when they publish content, when they have this capacity to be part of this online community, they don't do it in an uncontrolled manner, in a manner where their content is automatically made available to the internet at large. Instead, with this idea of privacy by default, we try to limit it to the immediate sphere, the immediate-plus-one network in the online space, rather than trying to make a big splash with everybody from Indonesia to the US being able to access their content. I think that's surely a priority and something we could all help with.

Marie Potel-Saville 

Absolutely. Let's remember here that minors actually make up one third of all internet users in the world, so it's definitely not a small issue. And you're right, privacy by default options should absolutely be present everywhere. I know that the 5Rights Foundation has done amazing work in their latest report, which came out in April of this year, precisely on how to avoid dark patterns for kids. So we will also put the link to the 5Rights Foundation report. Perhaps a practical tip from us as well. We recently developed what we call a sanity checklist. It's basically just a sanity check, very simple things to verify when you design an interface. So this is a practical tip for designers. Just check that, for example, if you create two buttons for the same choice, one button is not more salient than the other: the shape, color and form are equivalent, and the text in the button is not trying to influence users' choices. And by the way, one of the classic objections that we very often hear is: yes, but we need to boost opt-ins, so if we don't make the "I agree" button, you know, big, fat and green, how are we going to do that? So let me just share this: during CPDP, which I attended a couple of weeks ago, a study from the Netherlands was mentioned which concluded that contextual advertising is more efficient than targeted advertising. So I'm not sure why we should try to boost opt-ins all the time.

Well, it's been a real pleasure having you both with us today in this episode. Thank you very much to each of you, Alexandre and Gabriel, and speak to you very soon. Thank you all for listening to this episode: are dark patterns illicit? There was definitely a lot to say. Our next episode will tackle an equally interesting and important issue: what are the economic incentives to fight against dark patterns? So it's definitely going to be a fantastic episode. Thank you all for listening. Take care, bye.

Fighting Dark Patterns, a podcast by Amurabi. For further information, you can go to fairpatterns.com


"Lorem ipsum dolor sit amet, consectetur adipiscing elit. Suspendisse varius enim in eros elementum tristique. Duis cursus, mi quis viverra ornare, eros dolor interdum nulla, ut commodo diam libero vitae erat."

Name Surname

Position, Company name