The First Steps to Decolonise Digital Rights
By Sarah Chander, 24th September 2020
In early 2020, DFF and its project partner EDRi started their joint work of initiating a decolonising process for the digital rights field in Europe. How does this fit into the current landscape of digital rights and recent developments in the movement for racial and social justice? And what have we been up to these past months?
The Social Dilemma and other pop-culture portrayals have brought the tech industry into even sharper focus. One major concern is that this industry is a tiny, unrepresentative, yet powerful minority with the power to shape the everyday experiences of almost the entire world. We should ask ourselves whether the same is true of many of the communities working to contest Big Tech’s dominance.
We are in a moment of global shifts. Whilst racial oppression and global inequality are by no means novel phenomena, the wave of global #BlackLivesMatter protests this summer has many of us reflecting on social and racial justice with respect to our organisations, our work, and even ourselves.
Zooming in on the digital rights field, a reckoning of the need for change has also been bubbling for a long time. Following some initial discussions, DFF and European Digital Rights (EDRi) teamed up to explore how we might go about developing a process for decolonising the digital rights field. This blog is an update on our work at these early stages.
Why “decolonise” the field?
This process was conceived through various observations of the digital rights field in Europe. Over the years, we have seen a lack of representation of all the people we seek to protect from harm. This undoubtedly impacts our work – specifically, the extent to which the field is equipped to meaningfully defend the rights of all.
This overwhelming lack of representation in our movement matters. The (near) absence of those who are racialised, gendered, queer and trans, working class, differently-abled, and hailing from the global south affects our work and our perspectives more than we know. It is a flaw, a weak spot. It compromises our ability to make good on a commitment to uphold the digital rights of all in society. This absence shows up in a number of ways.
One way this unfolds is an assumption of universality with respect to the holder of digital rights – the ‘user’ of technology. Who do we envisage when we discuss defending the rights of the ‘user’? Generally, we envisage no one in particular – a figure neutral to life circumstances and personal characteristics.
However, often when we assume universality, objectivity, and neutrality, what we do is embed dominance; we focus on the experiences that are most represented and similar to what we know. We centre our work on the assumption – in the words of Dr Ruha Benjamin – that there is a “universal user who is not racialized, is not marked by gender, sex, or class.”
This universality is a myth. Instead, applying a variety of specific lenses illuminates how technology can, in effect, deepen and exploit a range of social, economic, and political vulnerabilities.
Missed opportunities
The issue of representation also undoubtedly limits our perspectives and our potential. In particular, the limited engagement of the European digital rights field with counterparts in the global south means we miss out on the learning we need to understand the vast array of ongoing digital harms – their global impact, their context, and their place in our collective histories.
As we contest the extractive nature of surveillance capitalism, we would gain much from placing our fight in a much longer trajectory of colonialism, with data relations as a new commodity of our time.
More practically, this problem shows itself in the fruits of our labour. Even our most pivotal tools do not have answers to the most serious issues facing marginalised communities.
So far, data protection and privacy rights, including the GDPR, have been of limited use in protecting against group-based threats and discriminatory algorithmic profiling – for example, in protecting those who may be over-policed as a result of predictive policing tools.
So the mechanism works for the individual who is informed and in a position to make their individual rights actionable, but less so for others, for whom ‘data protection’ was not modelled. Just as we speak about harmful technologies as the result of skewed design, the same argument applies to our legal tools.
These examples show us that the need for change goes beyond the need to simply adjust the composition of people in the room. It’s about ensuring that all of our stories, realities and the ‘multiple kinds of expertise’ we bring are reflected and valued in the field – its tools, its work, its approach. There is a growing, intuitive knowledge that a change is overdue for the digital rights field.
First steps
How to go about something that sounds so huge? So far, we have cautiously approached the questions around a decolonising process for the digital rights field. What does it mean? How may we achieve it? Who else do we need to be talking to?
Taking baby steps, we started by speaking to organisations, collectives, activists, and others currently outside the digital rights field to understand how they engage with digital rights issues. How do organisations working on workers’ rights, LGBT rights, anti-racism, or disability rights see digital rights and the field itself? Do they see the links with their work? How do they understand the concept of decolonisation? What processes of change have they seen work?
So far, the project has been met with encouraging levels of enthusiasm. Over 30 individuals and organisations have taken the time to discuss these questions with us. What we already see is that there is huge appetite and potential from activists and civil society working outside of the digital rights field to engage, to learn more, and to establish connections between their work and “digital rights”.
We’ve discussed and questioned many things – from the colonialism of the human rights framework and its relation to justice, to the limits of the “diversity and inclusion” approach. This thinking will feed into our further work.
Now we are starting conversations with the digital rights field. We hope to get a picture of the different visions, sensitivities, and understandings within the field. What is the impact of representation on our work? How may we address this? At what stages of the process are different actors in the field?
The next step is to bring interested stakeholders together in an (online) gathering to get insight into how we may go about designing a decolonising process for the digital rights field. We are excited – this will be the first time those who have interacted with the project come together, and we hope to benefit from the range of different perspectives, build on what we have already learned, and develop concrete next steps for the design process.
We know that this project itself cannot dismantle the world’s uneven power relations – but we hope to do what we can from our corner.
Sarah Chander is Senior Policy Advisor at European Digital Rights (EDRi). She leads EDRi’s policy work on artificial intelligence and connecting digital rights with wider movements for equality and justice.