Foxglove Co-Founder Martha Dark Talks About Challenging Big Tech Workplace Abuse and Making Tech Fair for All - An Interview

Illustration of Martha Dark

Martha Dark is the co-founder and director of Foxglove, a non-profit organisation and advocacy group dedicated, among other things, to improving the working conditions of content moderators. By investigating, litigating, and campaigning on technology and social justice matters, Foxglove strives to challenge the harms caused by Big Tech and make technology fair and accessible to everyone. Today, we have the pleasure of talking to Martha about these crucial issues.

SUPERRR: Dear Martha, how would you describe Foxglove to someone who has never heard of it?

Martha: Foxglove is an organisation that exists to make the use of technology fair. Technology is increasingly used by both governments and private companies to exercise power without proper safeguards and accountability, and Foxglove aims to counterbalance that power and shift some of it back into the hands of citizens and workers. We also bring legal cases against the UK government when it uses unfair or discriminatory algorithms, and against big tech companies when they abuse their power or their workers.

SUPERRR: The work Foxglove does is really important because many people are not aware of their rights in digital spaces, especially marginalised communities, who are at higher risk. So what exactly inspired you to found Foxglove? Was there a particular experience that prompted it, or was it more of an amalgamation of things that happened?

Martha: Yeah, so I founded Foxglove with two friends of mine. Two of us worked for an organisation called “Reprieve”, which dealt with issues relating to the “War on Terror”. During this time, we witnessed numerous instances where technology was used, such as in drone strikes, resulting in the deaths of civilians and other harmful consequences. It was alarming to think that such life-or-death decisions were made with the click of a button. We spent a lot of time researching the space to avoid replicating work others were doing, and we saw an opportunity for litigation. That's how we started Foxglove, which was initially a working title, but the name stuck. Foxglove is also the name of an English wildflower, which is similar to technology in that it can be both a cure and a poison.

SUPERRR: Speaking of wildflowers, I wanted to ask you about the foxglove flower, because I think it's a pretty apt analogy for the work that you do. You said it's like technology in that it can be both a harm and a cure. When we take this analogy and apply it to practice, how has Foxglove stepped in to make sure that the outcome is ultimately for the greater good, and not just for a select few? How do you ensure that technology is used for good?

Martha: Foxglove has brought and won multiple cases against the UK Government for using problematic, discriminatory or otherwise unfair algorithms. And every time we have brought a case challenging one of these systems, we haven't actually had to go to court, because the system has been rolled back first. It's clear that these systems are so inherently unfair or discriminatory that, as soon as they're challenged, the government stops using them. A lot of the work is using the law to ensure that technology is used fairly in the public sector. One of the interesting things about the public sector work is that algorithmic decision-making is a relatively new way to make decisions, and it is not something the public has ever voted on. It isn't transparent or open, and it isn't consulted upon. The public has never been asked whether this is the way decisions should be made.

SUPERRR: On a related note - can you describe a specific project or case that Foxglove has been involved in or is currently working on? How does it align with the organisation's mission to make tech fair and accessible for everyone?

Martha: An example is our very first case, where we challenged the Home Office in the UK, with our partners at the Joint Council for the Welfare of Immigrants, for using an algorithm that discriminated on the basis of nationality. If you held certain nationalities (the list was secret), you were automatically put in the red queue, which meant your application was treated materially differently than if you had been put in the green queue. Applicants of some nationalities simply did not get a visa, which we believed was unlawful and in breach of the Equality Act.

Currently, we're working on a case with the Greater Manchester Coalition of Disabled People, which is challenging the Department for Work and Pensions for using an algorithm that we believe unfairly flags disabled people for benefit fraud investigations. The impact is that many disabled people across the UK are being flagged for benefit fraud investigations far more often than people who are not disabled. So, we use the law, and alongside the legal cases we run campaigns that engage the public, whether that's asking members of the public to donate to make a case possible, to sign a petition, or to write to their MP about an issue. It might also mean engaging MPs directly in Parliament.

SUPERRR: Thanks for the insight! Shifting gears, can you provide some background on former content moderator Daniel Motaung's legal case against Meta/Facebook and Sama in Kenya? In the German context, many people are unaware of the awful working conditions of outsourced content moderators in the Global South.

Martha: So, Daniel is a former content moderator who lives in South Africa and worked as a content moderator in Nairobi, Kenya. As you say, the working conditions at Sama in Kenya are pretty awful. The pay was less than $2 an hour, and there was no meaningful psychological care for moderators, who were looking at all sorts of awful content, day in and day out, including beheadings and child pornography. As a result, Daniel and his colleagues tried to form a union, and consequently they were fired in 2019. Now he's bringing a case against Facebook and Sama, seeking changes to the way Facebook treats the content moderators whose work it outsources. We don't know exactly how many moderators there are because Facebook refuses to say, but we think there are about 15,000, including many in Germany who are attending the summit that Foxglove, SUPERRR Lab, Aspiration Tech and ver.di are running.

Daniel's case is fighting for fairer conditions for moderators and for the right to form a union. Recently, Facebook tried to argue that the Kenyan courts had no jurisdiction over them, despite Nairobi being the content moderation hub for all of South and East Africa, a region of about 500 million people from which Facebook draws millions of users and significant advertising revenue. They argued that because they did not have a registered office there, they could not be sued. The judge ruled in Daniel's favour: he does have the right to sue Facebook in Kenya. Daniel's case is essential not just from a workers' rights perspective but also because there's much debate about how we keep the internet safe, how we moderate content, and what content should stay up and what should come down. The reality is that there's no way the internet can be a safe place until the working conditions of content moderators have improved. We can't expect a safe internet when those protecting it are working in such horrible conditions.

We will have between 40 and 50 moderators from across Germany gathering in Berlin on March 9th and 10th. Unlike in Kenya, Germany has strong labour unions, and many moderators are already members of ver.di; some are even forming works councils. The movement to organise and fight back against Big Tech is growing across the world, as we have seen with Amazon workers in the US and with the striking Amazon workers in Coventry in the UK, whose picket line I will soon be joining. There's a real moment of worker power building against Big Tech, who have built their wealth and power by crushing workers' rights for decades. It's an interesting time.

SUPERRR: Yes, indeed. When you were telling me about Daniel Motaung's case, I was struck by its historical significance. I'm sure people will talk about this case in the future, if not now!

Martha: Yeah! To my knowledge, it's the first case anywhere in the world of an individual content moderator bringing a case against Facebook. So I think it's a really interesting and important landmark case!

SUPERRR: It's amazing to see how Foxglove is making an impact on the fight against Big Tech. Who are the key players and networks that empower and uplift your work, and what are the partnerships you rely on to drive meaningful change?

Martha: Yeah, so we're really pleased to be partnering with the summit team. ver.di is an amazing union, and it's lovely to be working with them. The idea for the summit came from the Digital Futures Gathering that SUPERRR Lab hosted back in October, and Aspiration Tech are going to be amazing facilitators who will help make a really good conversation happen. So we're excited about that. But all of Foxglove's cases are taken in partnership.

SUPERRR: Last but not least, please fill in the blank: "I know we are winning when ..."

Martha: "......Big Tech is broken up, it's too big to exist is a threat to our societies, our democracies and our workplaces."

SUPERRR: I agree! Thank you for sharing your insights and experiences with us, Martha. It was truly inspiring to hear about the work that Foxglove is doing!

Illustration: Anna Niedhart, Rainbow Unicorn.