Newsletter
5 min read

Navigating the Age of Deepfakes: How to Thrive as Humans When the Line Between Fake and Real Is Blurred

Published on
August 31, 2024

In today’s digital age, the lines between reality and illusion are becoming increasingly blurred. 🌐 With the rise of deepfakes, we’re confronted with a world where it’s getting harder to distinguish what’s real from what’s not.

This raises critical questions: How can we grow and develop when we’re constantly exposed to things that aren’t even real? How do we form genuine interpersonal relationships when AI is - supposedly - designed to fulfill our every need? And how do we protect younger generations from becoming obsessed with illusions that don’t even exist? 🤔

In this newsletter, we’ll explore what deepfakes are and dive into some of the recent findings from the Ofcom report. We’ll examine the potential damage these digital deceptions can cause to our democracies and hear from our friend and Senior Advisor, Dr. Celia Hodent, who kindly agreed to share her insights on navigating this complex landscape.

Let’s unpack these pressing issues and find ways to thrive in a world where authenticity is under threat. 🌍✨

Understanding Deepfakes: The Thin Line Between Real and Fake 🎭

Deepfakes are becoming an increasingly prevalent part of our digital lives. But what exactly are they? At their core, deepfakes are videos, images, or audio clips created using artificial intelligence to mimic real people or scenarios. While this technology can be used for creative purposes, such as enhancing films or creating satire, it also poses significant risks.

Key Findings from Ofcom's Report 📊

According to new research from Ofcom, deepfakes are becoming alarmingly common:

  • Prevalence: Two in five people (43%) aged 16 and older have encountered at least one deepfake in the past six months. This number rises to 50% among children aged 8-15. These deepfakes often involve sexual content, politicians, or scam advertisements.
  • Awareness and Identification: Despite the increasing visibility of deepfakes, only 10% of people feel confident in their ability to spot them. This highlights a significant gap in public awareness and understanding.
  • Potential Uses: While some deepfakes serve positive roles, like enhancing media or assisting in medical treatments, many others are harmful.

According to Ofcom, deepfakes can cause significant damage, especially in three key areas:

  • Demeaning: Falsely depicting individuals in compromising scenarios, such as sexual activities, which can be used for blackmail or coercion.
  • Defrauding: Misrepresenting someone’s identity, which can lead to scams, fake advertisements, or fraudulent activities.
  • Disinforming: Spreading false information widely, which can influence public opinion on crucial political or societal matters, like elections, wars, or health crises.

➡️ Here is an example that really impressed us, and not in a good way!

https://www.linkedin.com/posts/daniellewkovitz_mild-how-long-will-it-be-before-ai-generated-activity-7232308151432179713-GhtN?utm_source=share&utm_medium=member_desktop

The impact is real, and it can affect democracies all over the world ❌

But, as always, there is something to be done 💪

To combat the harmful effects of deepfakes, Ofcom suggests several strategies that tech firms can adopt:

  1. Prevention: Developers can use AI to filter and block harmful content before it's even created, ensuring that certain types of deepfakes are never made.
  2. Embedding: By embedding watermarks or metadata into AI-generated content, platforms make it easier to identify and manage synthetic material later (a minimal sketch of this idea follows this list).
  3. Detection: Platforms can employ a mix of automated and human reviews to better distinguish between genuine and fake content.
  4. Enforcement: Clear rules and strong enforcement actions can deter users from creating and sharing harmful synthetic content.
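
For readers who like to peek under the hood, here is a minimal Python sketch of what the embedding idea (point 2 above) could look like in practice: tagging a generated image with a provenance marker and checking for it later. The field names, file paths, and the Pillow-based approach are illustrative assumptions on our part, not how any specific platform does it; real deployments rely on standards such as C2PA content credentials and watermarks designed to survive editing.

# Illustrative sketch only: mark an AI-generated PNG with a provenance
# field, then check for it. Field names and paths are made up for the
# example; plain metadata is easy to strip, which is why production
# systems prefer robust watermarks and standards such as C2PA.
from PIL import Image
from PIL.PngImagePlugin import PngInfo

def label_as_synthetic(src_path: str, dst_path: str, generator: str) -> None:
    """Copy an image, adding text chunks that mark it as AI-generated."""
    image = Image.open(src_path)
    info = PngInfo()
    info.add_text("ai-generated", "true")   # hypothetical provenance flag
    info.add_text("generator", generator)   # which model produced it
    image.save(dst_path, pnginfo=info)

def looks_synthetic(path: str) -> bool:
    """Return True if the image still carries the provenance flag."""
    with Image.open(path) as image:
        return getattr(image, "text", {}).get("ai-generated") == "true"

if __name__ == "__main__":
    # Assumes an "output.png" produced by a generative model exists locally.
    label_as_synthetic("output.png", "output_labelled.png", "example-model-v1")
    print(looks_synthetic("output_labelled.png"))  # True if the tag survived

Nothing here stops a bad actor from deleting the tag, which is exactly why Ofcom pairs embedding with detection and enforcement rather than relying on any single measure.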

🦸 Ofcom is actively working to ensure online platforms adhere to their new responsibilities under the Online Safety Act, focusing on protecting vulnerable groups, including women and girls.

As Gill Whitehead, Ofcom's Online Safety Group Director, stated:

“If regulated platforms fail to meet their duties when the time comes, we will have a broad range of enforcement powers at our disposal to ensure they are held fully accountable for the safety of their users.”

➡️ To learn more about the findings and recommendations, check out the full Ofcom report here.

Insights from Dr. Celia Hodent, UX strategist and video game expert: Navigating Deepfakes with a Cognitive Edge 🧠

We had the pleasure of speaking with Celia Hodent, an expert in user experience (UX) strategy and cognitive science, particularly in the field of video games. With a PhD in psychology and over 15 years of experience, Celia has significantly influenced UX strategies in major gaming studios like Epic Games, LucasArts, and Ubisoft. She’s also the author of The Gamer’s Brain: How Neuroscience and UX Can Impact Video Game Design. As an independent consultant, she now helps studios create engaging and ethical gaming experiences, and she actively discusses the ethics of the game industry, especially concerning dark patterns and the attention economy. And we’re lucky to have her as our very own Senior Advisor!

Celia Hodent

We asked Celia to share her thoughts on the psychological impact of deepfakes and the challenges they raise. Here's what she had to say:

❓ From a psychological perspective, what are the risks for all of us living in a world where we can’t tell what’s real or fake?

Celia: Research is currently exploring the social impact of deepfakes and fake news. What strikes me the most so far is that people stop believing the news and official statements, which will make it much more difficult to make evidence-based decisions for the greater good moving forward, such as decisions regarding public health (e.g., pandemics) or the climate emergency. Trust is an important cement in society, and it's currently being challenged, to say the least.

❓Are deepfakes a new form of dark patterns?

Celia: I would say so, yes, because they are purposely deceiving people (pretending that something is real when it's not), usually to influence their opinions and decision-making. It's not meant with people's best interests in mind but to benefit a lobby, industry players, or a political group.

❓As an expert in both psychology and video games, is there a way to make "fake worlds" realistic but not deceptive or otherwise harmful? Any positive impact of deepfakes?

Celia: Well, if they are in the shape of art (e.g., movies, books, games) or entertainment (e.g., satirical webzines), then yes, it can be positive. Transparency (clearly stating that it's not real) and explicit consent (people are informed that it's fake before they consume the content) are absolutely necessary, though, to avoid being a dark pattern.

Celia highlights a crucial point: the erosion of trust due to deepfakes is a significant societal risk. Her insights suggest that while deepfakes can be harmful, especially when used deceptively, they also have the potential to be used ethically in art and entertainment if accompanied by transparency and consent.

This perspective reminds us that the key to navigating the challenges of deepfakes lies in balancing innovation with ethical considerations. ⚖️

➡️ You can visit her website here: https://celiahodent.com/

Conclusion: Navigating the Digital Maze with Awareness and Integrity 🌐

As we navigate the complexities of a world increasingly influenced by deepfakes, it's crucial to stay informed, vigilant, and proactive. The rise of these digital illusions challenges our ability to discern reality, but it also calls for a collective commitment to uphold truth and transparency.

By fostering awareness, advocating for stronger regulations, and encouraging ethical technology use, we can protect the integrity of our interactions and the foundations of our society. Let's work together to ensure that in a world filled with artificial constructs, the values of authenticity, trust, and ethical practice remain our guiding principles. 🌟

Discover how we can support you!

Want to know more?

You can find all of our news updates on our site www.fairpatterns.com and listen to our podcast, featuring some amazing voices who help frame #darkpatterns through different lenses.

Become a client!

Amurabi helped us think out of the box in a very powerful way

Jolling de Pree

Partner at De Brauw

"Lorem ipsum dolor sit amet, consectetur adipiscing elit. Suspendisse varius enim in eros elementum tristique. Duis cursus, mi quis viverra ornare, eros dolor interdum nulla, ut commodo diam libero vitae erat."

Name Surname

Position, Company name