
What safety online means to us

Is the Internet unsafe? And for whom do we need to make it safer?

The answer is as complex as humanity itself, and it depends a lot on context. Who we are, which identities we hold, where we are in the physical world, what systems of oppression we are subject to – to name just a few factors.

For people who face oppression because of who they are or what they need, the internet has often provided vital connections and information. It has long offered queer communities spaces to build connections outside of public scrutiny. For people living in places with restrictive reproductive rights, it can provide access to crucial information about health and reproduction (how and where to get an abortion, for example).

Bullying on social media, algorithmic discrimination and corporate or state surveillance of our digital communication all have their counterparts in the analogue world. Creating safer spaces is an ongoing issue online as much as offline – which means that looking for solutions only in the tech won’t solve the problem. 

How do we make the internet safer?

As people need very different things from the internet, there is no single approach that works for everyone on every occasion. That is why taking an intersectional approach is key. We ask ourselves: who might be affected by a suggested change?

Mechanisms that ban children from accessing adult content tend to also make it harder for them to access content on their bodily rights and self-determination – or they simply lead to intended or unintended overblocking of entirely legal content. Which harm is worse, and which trade-off is worth it?

Feminism and Safety

Intersectional thought and practice offer various concepts for creating spaces for communities: from safe(r) spaces and brave spaces to accountable spaces. There is no one right way to shape our interactions with each other, but it is crucial that we do so with care, transparency and love. And maybe this is what we need: different solutions to serve different needs instead of the “global village” one-size-fits-all solution. Context is queen!

In the current moment, we often fall into the trap of thinking of the internet as just social media platforms, when it is so much more than that. Our access to the internet doesn’t have to be mediated by social media platforms and Big Tech. And moving away from those platforms gives us more options to create safe pockets of the internet for ourselves.

For this Safer Internet Day, ask yourself: where do you feel safe online? What contributes to you feeling safe? How can you help create spaces where other people feel safe? This can take many different forms: stepping in online if you see someone being bullied. Writing Wikipedia articles to make information more accessible to others. Being mindful of what you communicate about the people around you, and on which platforms – a picture shared online is communication, too. Moving your social media accounts to the Fediverse, or contributing to community safety guidelines.

A Safer Internet needs one thing for sure: us.

How our work ties into creating a safer internet

We work on digital fairness! Digital markets claim to offer choice and convenience, but too often they rely on manipulation, opaque systems, and unequal power relations. From deceptive design and personalized pricing to constant tracking, many online services are built to extract consent and data rather than protect people. A fair internet means recognising that these harms are structural, not individual failures. It means strengthening consumer rights, transparency, and collective protections – so people don’t have to navigate digital markets alone or blame themselves for systems designed against them.

Our work on digital violence shows how interconnected digital and physical spaces – and problems – are. AI scandals like Grok show that digital technologies are increasingly used to harm, exploit, and control women, children, and marginalized communities. These harms are not just technical problems, but social ones rooted in patriarchal systems of oppression. A safer internet means recognising digital violence, taking survivors seriously – also by translating their needs and experiences into policy – and strengthening preventative measures, education, and long-term support systems. We need holistic answers to growing problems like digital violence, without relying on technical fixes or punitive measures alone.

A safer internet must protect not only users, but also the data workers and content moderators who keep social media platforms and AI systems safe for everyone. This essential labor is often performed under harsh conditions and systematically rendered invisible. Building a truly safe internet requires strong labor standards, meaningful protections and recognition for the people whose work sustains digital ecosystems. Without safeguarding their rights and dignity, claims of online safety and ethical technology remain incomplete.