Digital rights and children's rights: If we’re to believe the debate taking place today, these two issues are irreconcilable. But we (and many others!) have been saying for a long time now that we do not have to choose one over the other – and that, actually, we can only have both if they work together.
In 2022, we observed how children's rights and digital rights were being played off against each other on the political stage. The reason for this was so-called “chat control,” part of a larger EU legislative project “to prevent and combat the sexual abuse of children.” Plans for the law included searching digital communications for “depictions of abuse and grooming.” Shortly afterwards, the idea of introducing online age verification was added. At the time, we addressed both chat control and age verification from a feminist perspective.
Our conclusion at the time was that this debate urgently needs a feminist, empowering, nuanced, and informed perspective. All too often, adults talk about children instead of with them, and without genuine dialogue, adults cannot deliver solutions that meet children's needs.
Technology is not the solution
In recent years, two major observations have continued to shape our work with children online. First, we see the role of technology being greatly overestimated in so many areas of our lives, including in this case. This applies both to the extent to which digital technologies harm children (we are not denying the actual harms; rather, we want those harms to be assessed for what they are) and to how helpful technology can be in mitigating the real ones. At SUPERRR, we focus on people rather than technology, and that also means focusing on social prevention work rather than half-baked technical barriers.
Second, it seems almost impossible to approach the issue in a balanced and nuanced way. Children have rights (as stated in the United Nations Convention on the Rights of the Child, even if Germany has still not enshrined them in its constitution). These rights include the right to protection, but also the rights to participation and support. But here’s the thing: the majority of proposed technical measures aim exclusively at protection, in ways that undermine participation and support. We believe all of children’s rights matter, and that it’s possible to have both.
Bans are booming
In 2025, we worked with other organisations to constructively consider different methods of age verification, such as those repeatedly called for in the context of chat control. However, we stand by what we already stated in our 2023 analysis: “Safe spaces for children and young people cannot be created through the massive use of surveillance and control technologies, because care cannot be engineered.”
We’ll continue to work on this issue from an intersectional, feminist perspective – not least because the year began with calls for a social media ban for children (under 14 or even under 16) in Germany, the EU, and many countries around the world. Age verification procedures are a key prerequisite for such a ban – in other words, mechanisms that actively exclude children. And public demands for a social media ban seem to ignore the fact that no country that has passed such a law has yet succeeded in implementing it in a suitable, let alone effective, manner. That is precisely why we will continue to take a close look and ask: what does such a ban really achieve, and what price do we (and our children) have to pay for it?
We are convinced that digital rights and children's rights go hand in hand. That is why we continue to advocate for better, fairer, and above all, accessible digital spaces. Blanket bans on access do not contribute to this.