Recap Online Event #1: Fair Design and Consumer Protection
Consumer protection in the digital space is often inadequate: users simultaneously become data providers, feeding an additional, and often primary, business model of digital service providers. A central role here is played by what is known as deceptive design, familiar to all of us from omnipresent cookie banners. But the issue goes far beyond that (Rossi et al., 2024).
What is Deceptive Design?
Deceptive Design, formerly known as dark patterns, refers to design patterns in digital interfaces that intentionally manipulate users into making certain choices, often against their own interests. This can manifest in many ways: hidden opt-outs, misleading language, confusing menus, or visual tricks like prominently highlighted "yes" buttons. The aim is often to get users to consent to data collection and sharing, subscribe to services, or agree to ad tracking, without fully understanding what they are agreeing to (Norwegian Consumer Council, 2018).
“Deceptive design is about designing decision-making situations, making certain decisions easier than others. It’s about strategic resistance, manipulation – often subtle, but with great impact.” – Judith Faßbender, Alexander von Humboldt Institute for Internet and Society (HIIG)
The consequences of deceptive design run deep. People with disabilities, older adults, and individuals with limited technical literacy are especially at risk of being systematically excluded by manipulative interfaces.
“Design is always about power. If I influence decisions – like with bigger buttons or hidden options – it can ultimately lead to exclusion.” – Eileen Wagner, Designer
Between Structural Exclusion and Design Ethics
Deceptive design highlights the tension between profit-driven logic and ethical design. Often, the pursuit of profit stands in the way of creating inclusive, accessible digital environments.
“Some cookie banners aren’t even usable with screen readers. That means I can’t access the page at all. That’s not just bad UX, that’s exclusion.” – Casey Kreer, Digital Accessibility Expert
Deceptive design is more than just an annoyance – it’s a structural barrier. Beyond cookie banners, other practices can also exclude or disadvantage users. Constant pop-ups, for example, may impair navigation or intentionally create stress to drive faster purchases. Who takes time to compare prices when the site says five other people are eyeing the last room?
Pathways to Better Solutions
There are several examples of how things can be done better. When accessibility is considered from the very beginning, it leads to a better user experience for everyone. Based on international standards and EU directives, public authorities have been required since 2016 to implement accessibility standards in their digital services – standards that offer important approaches for more inclusive digital access.
Other standards, such as Do Not Track and the newer Global Privacy Control (GPC), allow users to signal that they do not want to be tracked. While the GDPR already requires consent for tracking, GPC provides an extra layer of protection, particularly when using VPNs.
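On a technical level, GPC works by having the browser attach a `Sec-GPC: 1` request header (and expose the signal to scripts via `navigator.globalPrivacyControl`), which a site can then honor server-side. The sketch below is a minimal, illustrative example of such a server-side check; the function name and the plain header dictionary are assumptions for illustration, not part of any particular framework:

```python
def gpc_opt_out(headers: dict[str, str]) -> bool:
    """Return True if the request carries a Global Privacy Control signal.

    Per the GPC proposal, a browser with the signal enabled sends the
    HTTP request header `Sec-GPC: 1`. HTTP header names are
    case-insensitive, so we normalize them before looking the field up.
    """
    normalized = {name.lower(): value.strip() for name, value in headers.items()}
    return normalized.get("sec-gpc") == "1"


# A GPC-enabled browser's request (hypothetical headers):
print(gpc_opt_out({"Sec-GPC": "1", "User-Agent": "ExampleBrowser/1.0"}))  # True
# A request without the signal:
print(gpc_opt_out({"User-Agent": "ExampleBrowser/1.0"}))                  # False
```

A site that respects the signal would use such a check to disable tracking and data sharing for that request by default, rather than asking the user again through a banner.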
There are also pattern libraries that offer technical guidance for user-centered design. One such resource is Decent Patterns, a library of assets and design patterns that developers and designers can use to build better, decentralized digital services. But for these approaches to be adopted broadly, we need clear legal frameworks. The focus must be on empowering users – through education, transparent infrastructure, and inclusive processes.
A major issue is the continued framing of deceptive design as isolated bad practice. In reality, it is often a deliberate business model. This framing individualizes harm instead of addressing power asymmetries, structural vulnerabilities, or the collective impact of manipulative, addictive design.
An Online Event Is Just the Beginning
In our first online session as part of the Forum for Digital Fairness and Consumer Protection, we discussed deceptive design with Casey Kreer, Judith Faßbender, and Eileen Wagner. This session marks the start of a timely conversation: How can we shape consumer protection online in a more just and feminist way? What does it take to recognize and prevent manipulative designs, and to build fairer digital spaces? Our thanks to the speakers for sharing their insights with more than 30 participants!
Why this project? And why start with deceptive design?
This event is part of our ongoing project Forum for Digital Fairness and Consumer Protection, funded by the German Federal Ministry for the Environment, Nature Conservation, Nuclear Safety and Consumer Protection (BMUV). The project aims to develop new perspectives on digital consumer protection – intersectional, power-critical, and future-oriented. Rather than sticking to traditional consumer protection topics, we explore structural inequalities and systemic barriers.
We began the series by focusing on where digital self-determination is most visibly undermined: interface design and online decision architectures. Deceptive design exemplifies a practice that doesn't protect users but steers them – often subtly, yet with far-reaching consequences, especially for marginalized communities online. Recognizing, challenging, and changing these patterns is key to building more just digital spaces.