UN Special Rapporteur Warns of Racial Discrimination Exacerbated by Technology

By Nani Jansen Reventlow, 15th July 2020

The Digital Freedom Fund welcomes the publication of the report “Racial discrimination and emerging digital technologies: a human rights analysis” by the UN Special Rapporteur on contemporary forms of racism, racial discrimination, xenophobia and related intolerance, E. Tendayi Achiume.  

The report goes beyond analysing specific instances of racial discrimination on digital platforms to assess how emerging digital technologies perpetuate racial discrimination on a structural level. Examining the racially discriminatory consequences of algorithmic decision-making throughout public and private life, the report sheds light on how existing inequalities, biases, and assumptions in the design and use of digital technologies threaten the human rights of marginalised and racialised groups. The report also outlines a structural and intersectional human rights law approach to regulating digital technologies, emphasising that an effective response to racial discrimination must include efforts to break down power structures within the sectors that make decisions about the design and use of these technologies, including private technology companies, research institutions, and public regulatory bodies. 

A cohort of digital rights NGOs, including DFF, has released a statement highlighting a number of salient points in the Special Rapporteur’s report. The statement – which is open to additional signatories – expands on the human rights framework set out in the report, drawing attention to several specific commitments which state and corporate actors must uphold to prevent and remedy the discriminatory impacts of emerging digital technologies. 

The NGOs assert that certain technologies should be banned outright. Incremental regulation, they argue, is not appropriate for technologies such as facial recognition and predictive analytics that are demonstrably likely to cause racially discriminatory harm. The statement also emphasises the importance of keeping access to technology at the forefront of dialogues about racial discrimination in the design of digital technologies. The digital divide between the global South and the global North will perpetuate and deepen existing global inequalities as societies increasingly rely on digital platforms to distribute public goods, medical care, and education. The digital divide is not limited to less-resourced countries, either. In the US, for example, lack of basic internet access falls disproportionately on Black, Latino, and American Indian communities. 

The NGOs also elaborate on the Special Rapporteur’s account of how the values and practices of the technology field must change in order to ensure that digital tools do not replicate racially discriminatory structures. The statement echoes the report’s challenge against Silicon Valley “technochauvinism,” or the idea that technology is the best solution to social problems, and expands on the importance of including people who have experienced the impact of racially discriminatory technology in the design process and compensating them for their contributions. In discussing racial equality data, the NGOs take a step beyond exploring ways to dismantle structural racism in the technology industry to examine how even efforts to combat digital discrimination have the potential to perpetuate racial hierarchies. 

While the Special Rapporteur advocates for data collection to help address racial inequities, the NGOs highlight the risk this could pose for already marginalised populations. The NGOs would nonetheless welcome, and gladly participate in, an effort to develop standards for non-extractive data collection and governance, including measures to ensure that data collection, analysis, interpretation, and dissemination do not reinforce existing racial and other hierarchies, and to address the power dynamics between data collectors and those whose data is collected.

The UN Special Rapporteur’s report outlines critical warnings about the racially discriminatory potential of technology, which is particularly urgent in the wake of the coronavirus as governments around the world launch digital interventions in the name of the public good. DFF hopes that the report will receive broad support from civil society and that it will prompt much-needed further conversations on the issues addressed.

These conversations should include reflections on the makeup of the digital rights field itself, which, in DFF’s view, is currently too embedded in the power structures that enable the practices addressed by the Special Rapporteur to adequately fight them. We at DFF, together with EDRi, have initiated a decolonising process which aims to examine and tackle this issue. To learn more and find out how to get involved, visit the DFF website.

Update (September 2020): 80 NGOs and 55 individuals have joined the Digital Freedom Fund, Access Now, AI Now Institute, Amnesty International, Association for Progressive Communications and Internet Sans Frontières in signing on to the statement in support of the UN Special Rapporteur’s report. For the full statement and list of signatories, see here. DFF is grateful to Jessica Fjeld and Vivek Krishnamurty for their leadership in drafting the statement.

Photo by Bryan Colosky on Unsplash

Charting DFF’s First Chapters

By Nani Jansen Reventlow, 2nd July 2020

It’s here! We are extremely proud to launch DFF’s first ever annual report.

The long-awaited report charts the first leg of DFF’s journey as an organisation, from its founding in 2017 through 2019. Within its pages, you can read how an idea conceived by the digital rights field in 2016 has evolved into a fully-fledged fund that has supported 37 strategic cases across Europe – and helped spread the message of digital rights protection even further.

The report illustrates the development of DFF’s concept from its early stages until now. Threats to our digital rights stretch across a broad spectrum, so it was clear from the get-go that DFF needed to focus its ambitions.

That’s why, following close consultation with the European digital rights field, we formulated our three thematic focus areas: privacy and data protection; the free flow of information online; and accountability, transparency, and adherence to human rights standards in the use and design of technology online.

What DFF actually does also flows from this strategy process: DFF supports litigators in bringing strategic digital rights cases. But we also offer pre-litigation research grants. In parallel, we seek to leverage the vast knowledge of the European digital rights network by bringing people together, facilitating collaboration, and building the field.

Our funding, research, and networking are all guided by a few core convictions. DFF believes firmly in the ‘strategic’ part of ‘strategic litigation’: in other words, we want to pave the way for cases with the potential to reach beyond the courtroom and to effect far-reaching social change.

We also never underestimate the importance of an intersectional approach. We know that many digital rights issues disproportionately affect marginalised groups. We also know that many of the obstacles society faces in safeguarding human rights in the digital sphere hinge on power imbalances and structural inequality. That’s why doing our part to decolonise the field is an absolute priority for us.

In the time period covered by this new report, we’ve made 23 grants supporting 37 cases across 15 jurisdictions. Included among these is the Dutch SyRI case, which challenged state “predictive policing” and resulted in a landmark ruling earlier this year. We were delighted to be able to support the coalition of litigators who brought this case to a successful outcome, and its reverberations across the continent were clear-cut evidence of the impact such cases can have.

Apart from that, the cases we’ve supported tackled a host of today’s most urgent digital rights violations, from secret algorithms and mass surveillance to the “digital welfare state”.

We’ve also been privileged to host several workshops through the years, where some of the brightest minds in digital rights have come together to collaborate on strategy and work on case development. DFF has facilitated conversations dedicated to competition law, GDPR, and artificial intelligence, as well as strategic litigation retreats and annual strategy meetings for field-level planning.

These first years have inspired confidence, and highlight how tangible progress can be achieved by working together as a field. With that in mind, we’re extremely excited to see what lies ahead.

Illustration by Cynthia Alonso

Taking the Competition Law Conversation Online

By Nani Jansen Reventlow, 4th June 2020

Illustration of the competition law discussions: grassroots activists taking on big tech robots towering above them

Today, DFF successfully wrapped up its first wholly virtual event, which focused on competition law. Participants around the world joined us to workshop case ideas around self-preferencing and actions involving data.

In December last year, DFF organised a 1.5-day training in Brussels to explore how digital rights litigators could harness the competition law framework to further their work on issues such as data protection and freedom of expression. The meeting, with participants from Europe, the US, and Latin America, was organised in response to direct requests from our network to provide more opportunities for knowledge and skill building in this area.

The December training was not only an opportunity to learn and exchange experiences in using the competition law framework to advance digital rights; it also helped identify topics that participants wanted to explore further and potentially build cases around.

We were very much looking forward to doing this deeper dive in person and were planning a 2-day meeting in Berlin when the Covid-19 pandemic hit Europe. With no clear view as to when or if in-person meetings would be possible in the short term, we decided to move online to workshop case ideas around self-preferencing and actions involving data, as our network had requested.

Day #1 Visualisation

This was a great learning experience in and of itself: how do you create an online event that gives participants learnings and interactions similar to those of a two-day in-person meeting, while keeping “Zoom fatigue” at bay? We ended up running two sessions of 2.5 hours each over the course of an afternoon and a morning.

Ahead of the workshop, we made a number of online resources available, including a set of one-pagers and videos focusing on competition law issues relevant to the workshop. 

Besides the obvious lessons (re)learned – for example, that introduction rounds always run over, especially online! – we think this was a successful first online event.

Participants grappled with questions about the extent to which competition law interacts with digital rights issues such as data protection and freedom from discrimination. Case ideas were formulated around potentially anti-competitive practices playing out in the digital context, from intermediaries privileging their own products and services to the exclusion of others, to online platforms leveraging their access to user data to abuse a dominant position in various markets.

Day #2 Visualisation

Equally important, some conversations led to the conclusion that competition law actually wasn’t the right framework to tackle certain issues, which is a key learning in and of itself. 

One of the main conclusions was that we could have spent much more time discussing these topics, and developing case ideas. We are mapping next steps, including the possibility of offering some dedicated support for case development and litigation in this area.

Stay tuned for updates and, as always, if you have any thoughts or suggestions you’d like to share with us, please don’t hesitate to get in touch!

Participants at the workshop