Podcast
5 min read

Cindy Cohn on Dark Patterns and Digital Freedoms

Published on
February 14, 2024

In the latest episode of the "Fair Patterns: Regain Your Freedom Online" podcast, we delve into the crucial conversation around digital freedom, privacy, and the pervasive influence of dark patterns in our online experiences. We had the privilege of welcoming Cindy Cohn, the Executive Director of the Electronic Frontier Foundation (EFF) and a towering figure in the battle for privacy and civil liberties. Known for her unwavering commitment to safeguarding our digital rights, Cindy has been at the forefront of pivotal legal battles that have shaped the landscape of digital freedom today.

During our conversation, Cindy sheds light on her journey with EFF, the evolution of digital rights issues from niche concerns to central aspects of our daily lives, and the ongoing challenges posed by the surveillance business model and dark patterns. Her insights offer a compelling perspective on the importance of informed decision-making and the role of legal and policy frameworks in protecting our digital freedoms. If you're interested to learn more about Cindy's perspective, check out her podcast - How to Fix The Internet.

Transcript

Marie Potel-Saville  

Hello, everyone, and welcome to this new episode of Fair Patterns: Regain Your Freedom Online. We're particularly lucky today to welcome Cindy Cohn, Executive Director of the Electronic Frontier Foundation and one of the most influential lawyers in America, with a particularly strong commitment to defending privacy and civil rights for decades. Interestingly enough, a journalist once said that if Big Brother is watching, he'd better look out for Cindy Cohn. She was also described as rushing to the barricades wherever freedom and civil liberties are at stake. Cindy, welcome to the Fair Patterns podcast, you're quite at home here.

Cindy Cohn  

Oh, thank you so much.

Marie Potel-Saville  

Thanks a lot. I guess my first question is, you know, given the amazing career that you had, that you've had, as a lawyer in the US, you're clearly part of the movers and shakers of the legal arena. And the Electronic Frontier Foundation is a leading nonprofit organisation defending civil liberties in the digital world. Can you tell us a bit more about your background and the work you do at EFF?

Cindy Cohn  

Sure. I'm a lawyer, and I got lucky enough to do some early internet cases. One of them is called Bernstein v. Department of Justice, and it freed up encryption technology so that we can all have, you know, a little bit of privacy and security when we go online. And honestly, it was fun, and I've kept doing that. I joined EFF formally in 2000, I was the legal director for about 15 years, and I've been the executive director ever since. So I have just been really lucky to get to be part of the fight for digital freedom for a very long time. And, you know, I'm honoured to get to lead EFF at this time, when these issues that started off as tiny little issues that only a few key people knew about or cared about have now become central to all of our lives. You know, it's very difficult, and very few people do, to live a life that doesn't involve digital technologies at one point or another.

Marie Potel-Saville  

Exactly what you said: at the time, it was about trying to have a little bit of privacy and a little bit of freedom. Yeah, absolutely, this has become a totally essential issue, right?

Cindy Cohn  

Yeah, I remember the day that the Internet was first mentioned in the New York Times, and we all went, well look, we've made it. Of course, it's almost the entirety of the newspaper now, except for the sports. But it's been fun to ride this, and EFF and the work that I've done have just ended up being more and more important as time has gone on. At this point, I'm the executive director, so I'm not as involved in a lot of the day-to-day fights. I'm more kind of helping create a space for the other activists and lawyers and technologists at EFF to do their work, and that's very fun. But I do get in there a little bit with issues that are close to my heart.

Marie Potel-Saville  

Fantastic. Precisely, you know, about 30 years ago, when EFF was created, as you rightly said, the idea was that protecting access to technology was supposedly going to advance freedom for all, especially with open source, security research, etc. And then, if we fast forward to 2024, it seems that our digital lives are sort of plagued with mass surveillance, dark patterns, addictive interfaces. What went wrong, in your opinion?

Cindy Cohn  

Well, you don't found an organisation like EFF, like the EFF founders did, if you think everything's magically gonna be all right. You don't hire a bunch of lawyers to be in there fighting for people's freedom if you think that technology is going to automatically break everybody free. I think that's a misunderstanding. You know, our founders John Perry Barlow and Mitch Kapor were some of the early people who created EFF, and as they said, you would just sit back and drink your margaritas if you thought technology was going to make everything good. So they built an organisation to help give us a fighting chance of having a more free and a better future. And I would say we've won some things, but we've lost some things. The biggest thing that happened, that really interrupted a lot of what we were trying to build, was the surveillance business model. This is the idea, for businesses, which governments piggyback on, that tracking us, figuring out everything we do, and running it all through systems that try to predict what we're going to do next, such that we are all tracked all the time for everything we do, in so many ways that it's difficult to count, is the only way you can make money on the internet. That has never been true, but the idea got embraced by a lot of people, and a lot of money was made off of it. I think that really interrupted a lot of the things that we were doing in the 90s, and really shifted things in ways that have made things worse for us.
At the end of the day, EFF just wrote a white paper called Privacy First that tries to pull together the threads of the various harms and threats that people are facing right now, everything from journalism struggling to minors and kids being harmed online, and to say: look, if we start by protecting people's privacy, and by that we mean very specifically ending the surveillance business model, we can take a big bite out of a lot of the problems that we are all facing right now. It's not that you can't have advertising, but that the advertising isn't based on all of this very invasive tracking. We ought to really think about that, as opposed to some of the, what I think of as, feel-good solutions that don't really move the needle in terms of protecting people, which are being embraced around the world right now.

Marie Potel-Saville  

Exactly. And about the surveillance business model, exactly as you said, the fact is that we're basically being tracked all the time. Can you tell us how you see dark patterns and deceptive patterns, which are precisely affecting these fundamental rights and values of freedom and privacy? And the fact that, yes, we are entitled to our agency: at no point in time did we waive our rights to decide for ourselves what we wanted to do online, just, you know, to conveniently buy something online. So can you tell us a bit more about how you see these dark patterns and deceptive patterns from a fundamental rights perspective?

Cindy Cohn  

I think you're exactly right that freedom and choice and even democracy depend on people making informed decisions, and "informed" is really important there, and "decisions" is really important there. What we've seen is this rise of a kind of click-wrap or browse-wrap, you know, "by looking at this, you agree to blah, blah, blah". This really interrupts the idea of consent and the idea of knowledgeable decision-making, which is at the centre of, not just contracts, though I think those are important. The whole point of contracts, as I learned in law school, is that two people have a meeting of the minds about the deal they want to do, and they agree to that deal. This completely undermines that, because most people don't read and really can't read these agreements. I think I saw some research that said you would spend something like the first four months of the year reading all the contracts that you agreed to in order to go online, if you really wanted to read every one. That's not fair, that's not right, and honestly, it undermines the central idea of us cutting our own deals and exercising agency. So I think that dark patterns, and this kind of trickery that we see all over the internet, are extremely dangerous. I also think it undermines this idea that people get sold a lot, which is: oh well, it's your deal, when you go online you agree to stuff, and so therefore you're bound. And, you know, "people want this surveillance business model". I hear this all the time, especially in commercial contexts: people want to see ads for the things they want, people like the idea that Google's trying to predict what they want to do next. Now, I don't know if you can say bullshit on your podcast, but I think that's a lie.
And the reason you can tell it's a lie is that if it were true, you wouldn't have to sneak it by people by misleading them about what they're agreeing to. So to me, dark patterns are a tell. They're a sign that what's going on here isn't really about what people want, because you don't have to sneak by people something they actually want.

Marie Potel-Saville  

Exactly, I couldn't agree more. And also, as you explained about democracy: it's bad enough for contracts and for the supposed meeting of the minds, which obviously is non-existent given all these tricks and traps, but also, what about our democratic rights? What about the habits that we sort of develop by blind signing, blind clicking, being manipulated online? Especially in 2024, which is one of the largest election years in the whole world. Obviously, you've got a lot coming in the US, but I read, I think it was in the Financial Times, that 2 billion people around the world are going to vote this year. And obviously, any manipulation will be extremely dangerous.

Cindy Cohn  

I think that's right, and that's why I care about it. I mean, I care about contracts, I'm a lawyer, I care about fair dealing in contracts, I think that's really important. But I think that this civil liberties, freedom, democracy point sometimes gets overshadowed by the business framing of this. And it's really important that we recognise that people need fair, free and private access to information in order to exercise their right to vote and their right to participate politically. Making sure that everything you do online isn't tracked is important for people to have the freedom to decide who they want to vote for, without coercion and without control. I think that sometimes gets overlooked as we're talking about this, but it's vitally important.

Marie Potel-Saville  

Exactly, and thanks for highlighting that. By the way, the Federal Trade Commission in the US is one of the regulators that takes a really strong stance against dark patterns, along with a number of other regulators in the EU. And that enforcement action is currently resulting in very large settlements. For example, Epic Games had to settle for $520 million following a lawsuit by the FTC over all the dark patterns that used to be enshrined in the game Fortnite. TurboTax was also fined, or ended up settling, in the amount of $141 million. And of course, last year the FTC also sued Amazon over Prime, which is probably one of the most giant dark patterns in the world online. Do you think that's enough? What is perhaps missing to fully defend citizens' fundamental rights to freedom and privacy online?

Cindy Cohn  

Well, I really appreciate the FTC's work, and I think the EU has done very good work as well against all of these problems around the world. But, you know, EFF has long been a proponent of what we call private rights of action. That means you, as the consumer who's getting tricked, have the ability to sue and get relief from the people who've tricked you. You would think this was pretty straightforward, right? You shouldn't have rights without having remedies. This is a central piece: your right to make informed choices about the products you use should have a remedy attached to it. I mean, these big settlements are tremendously important, and they can really set the tone for things, but they can't capture all the ways in which we're being tricked. The agencies just don't have those kinds of resources, and they never will. So I think that what needs to happen is that the law needs to empower end users to enforce their own rights, and that needs to be a supplement to the things that the big government agencies can do. I think that would really open up space for the kinds of changes that would make the industry stop just playing whack-a-mole, where they replace one dark pattern with another and hope that the regulator doesn't turn its eyes back on them again, or where all the smaller companies figure: well, they're only going after the big guys, so nobody will notice us. And I think that's really important. In the United States, unfortunately, the law is going in another direction. The law is saying that privacy violations aren't a solid basis for you to sue, except in some very specific circumstances, and I think that's just wrong.
I think if you're misleading people, you should be able to be held accountable, regardless of whether the people being misled lost money out of their pocket. You lost control of your data, and that's important and ought to be recognised. So we have work to do to get both the courts and the legislatures to really free us to protect ourselves. It's always better to be able to protect yourself than to rely on some government entity saying it's going to protect you, and right now we've got a failure there, and we need to close that up. Not in all cases: in data breach cases and lots of other cases, there are situations in which people have remedies. But for the central problem that dark patterns are causing, it's pretty hard for individuals to bring the kinds of lawsuits, at the kind of scale, that we need to address the size of this problem.

Marie Potel-Saville  

Exactly. And this brings us to my next question, which is exactly that issue of impact litigation and class actions. Clearly the burden shouldn't be on individual citizens to just try to understand what's going on and then hopefully defend themselves. That's precisely where NGOs like EFF, which are very active in impact litigation, can really help citizens own their rights and can really empower them to defend themselves. I fully understand that in the US it's currently difficult to launch lawsuits on the privacy angle, but what about manipulation on the sheer commercial side? Basically, a good half of dark patterns make you either spend more money or buy more than you wanted, you know, tricking you into endless subscriptions sold to you as free delivery, which you can then never cancel. It's bad enough, of course, when dark patterns affect our privacy, but all of that is equally damaging. Do you think that there's a case here for large class actions?

Cindy Cohn  

I think so. If you've got people paying more than they otherwise would have, and you can prove it, then I think there is a space for litigation there. It can get complicated to try to prove these kinds of things that we kind of all know, but there are class action lawyers who are interested in doing those kinds of cases and in trying to help build the record so that these things can be brought together. EFF has done a few of them, but we kind of look for the ones that are going to push the law forward; that's what impact litigation is. The bread and butter of these kinds of cases does exist; at this point I think there are quite a few of them, and people are doing some of these. But the difficulty of connecting the dots, in a way that shows a court that I actually lost money because of this dark pattern that confused me, is still there, and there's still more work that can be done to make it clear enough that you can tell it to a court of law. I mean, I think it's a good thing about legal systems that you can't just say it's obvious that this is what's happened; you actually have to prove it, and sometimes that takes a little more work and a little more thinking. And I think one of the things that people struggle with, and this kind of gets to a transparency point, is that sometimes it's very hard to know what's going on. It's hard to see what price you might have paid if you hadn't done this; it's hard to see the ways in which the ad you saw, or the things that you got presented with, were manipulated by the dark pattern. If you're outside a company, there are a lot of claims of trade secrecy or other kinds of confidentiality that get in the way. In the United States we call this the discovery process, and the kinds of discovery you have to do here are complicated.
It's not that it's not doable, but it would be great to have a kind of cadre of people who wanted to help connect the dots, so that when the lawyers come in, they can focus on the legal case and not on, you know...

Marie Potel-Saville  

...the dates, or the evidence. Well, we're actually working on it, so that's a good thing. And we're also working with economics experts and a number of other experts, because we believe it's really, really important to establish and quantify everything we lose, put the evidence before a judge, and then let Justice do its work. Now, of course, I guess we probably have to talk about AI as well. What do you think AI is changing as regards dark patterns? You know, hyper-personalization, maybe anthropomorphism? Do you see some sort of AI-powered dark patterns coming?

Cindy Cohn  

I think so. I also think AI poses a real problem for this transparency piece and for connecting the dots, because of the way big machine learning systems work: it's not like you can easily tell why you got the answer you got. So this transparency problem and connecting-the-dots problem gets harder with AI, legitimately harder. The builders of these systems often don't know how the answer came out. Now, we can look at the weights and we can look at lots of the pieces of how the model works and was trained to get some inferences there, but it's a much harder problem in terms of proof and proof of causation. So I think it makes things harder. I also worry that we're going to start to see AIs used in dark patterns in ways that help trick people. You can fight back against some of the tricking that happens right now in click-wraps and consents and things like that, but I think AI is going to make that a tougher thing to do, because things are going to be presented in ways that are more friendly to you, that are going to feel better than they are. Because, for good or for ill, one of the things that has been central in the development of AI is trying to trick people, tricking them into thinking that a computer isn't a computer, and those kinds of things. So we have a lot of work to do to try to make sure that the instincts that lead us to dark patterns in the pre-AI world don't get stronger as we're moving into AI worlds. Because artificial intelligence systems can be really useful, they can do a lot of really good things. I'm very happy when I see Google Maps telling me how long it's going to take to get to a place based upon the inferences that are created by the other cars, right? That's one of the ways machine learning systems are used that most people are familiar with.
But we have to be central to that, we have to be in charge of that, instead of the companies that might have mixed interests and no loyalty to us. I mean, the other way to think about this is creating some kind of data loyalty requirement for the companies that process our data and provide these services, so that they have a duty to us to centre our needs in what they're doing. There is proposed legislation, and ideas around data loyalty or data fiduciaries are being developed. They may be a different way to think about how we make things more fair.

Marie Potel-Saville  

I love the idea of data loyalty. I also love the work of Eric Waldman on this; he's one of the advocates of a fiduciary duty on the companies processing our data, so that they would have to do what's good for us, given the power that our data gives them. So yeah, fully agreed. Look, we're coming to the end of this podcast, and our traditional question at the end is always one around a practical tip. Maybe just something to say before that: what we believe at Fair Patterns is that we should lower the burden for users, meaning that we don't believe it's for users to be more careful online; we believe that the digital choice architecture should empower them to make their own free and informed choices, obviously, but while limiting their cognitive load. So having said that, do you have some practical tips for citizens who face dark patterns? How can they take action?

Cindy Cohn  

I mean, I really appreciate this framing, because what I usually say is that what you need to do is stand up in the policy and legal fights. You need to vote for people and support people who support you, because until we change this policy and legal framework, individuals are really outmatched; there's not very much that they can do. And I'm pro-technology, I want people to be able to use technology. So often when I get this question, which I get a lot, "what can I do?", people really want you to say: just don't use the thing, don't use social media, go live in the woods. And while there's a pull there, I wouldn't say there isn't, I think that's not fair to people either, right? What we need to be is informed people. We need to complain when we see dark patterns, we need to turn them in; the FTC takes petitions from people. But the biggest thing we need to do is change the legal and cultural landscape in which these things keep coming up. If you're going to do a thing, I would say, do your thing over there. Those are the kinds of things that we need to do, and everything else, while it's important: pay attention, check your settings. I mean, when Apple introduced the setting that says "don't allow companies to track me", and set the default to no on tracking as opposed to yes, I think over 70% of people stuck with the no and didn't say yes. That's the single biggest indicator of where people would be if you gave them the choice, a real choice. So I think we need to continue to push companies to do that: make no the default and yes the thing you have to turn on, and make it clear what yes means.
And so, push on the companies that you care about to do the right thing, and let them know you care. But I just think that, in the context of your personal settings on your computer, sure, there are things that you can do, but none of them are going to be the things that we really want to happen.

Marie Potel-Saville  

Well, thank you so much, Cindy. It's been an amazing honour and pleasure to talk to you today. Thanks a lot and speak soon.

Cindy Cohn  

Thank you so much. By the way, I have a podcast. It's called How to Fix the Internet, where we go through some of the ways in which we can build a better future. We are intentionally, really trying to get people to think about what the world looks like if we get it right. So what does it look like if we have fair patterns? What does it look like if people are really informed? Because there's a lot of gloom and doom on the internet right now, including from me; I'm one of the people who goes out and says things are gonna go horribly wrong. We intentionally created a podcast to try to get people to begin to envision what it looks like if we get it right. So if that's something that attracts you, and you'd like to think about it with other people, there's a place where we're trying to gather those ideas.

Marie Potel-Saville  

I absolutely love it. And yeah, that's a fantastic initiative. Thank you so much, Cindy. Thanks a lot for listening today and stay tuned for our next episode.

