The Gender Divide in Digital Rights
By Nani Jansen Reventlow, 3rd March 2020
March 8th marks International Women’s Day, a time when countries around the world seize the opportunity to celebrate womanhood and to draw attention to the ongoing struggles women face globally.
It’s an opportune moment to celebrate the great achievements made by women in the digital rights field. We are fortunate as a field to have a relatively good gender balance, as well as many formidable women leaders. International Women’s Day (but actually, any day) is an occasion to mark and acknowledge their fantastic work, as well as the impressive and inspiring work of women academics, social scientists, technologists, lawyers and activists who support and build on the work of the field.
Over the last few decades, societies across the globe have made great strides towards gender equality and this has coincided with a huge leap in technological advancement. However, we would be wrong to assume that the digital age has brought an end to bias and discrimination against women.
A quick survey of the power structures at play across key digital rights issues – from the gender-biased nature of certain technologies, to the deepening “digital divide” – throws the relevance of women’s struggle for gender equality into sharp relief.
Is Technology Sexist?
The fact that evolving technologies, such as AI and other automated systems, are sexist has almost become a truism in the digital rights field. Technology inevitably reflects the cultural biases of its creators and the societal context in which it is built, meaning that gender discrimination, conscious or otherwise, is routinely transposed to these digital systems. Sexism is subsequently reproduced, embedded, and even exacerbated by new technologies.
Real-life examples of this abound, from Amazon being forced to scrap an AI recruiting tool that favoured men, to an Apple credit card system being accused of offering women lower credit limits.
Then there’s facial recognition, a burgeoning challenge to human rights and privacy the world over. This software identifies individuals by matching a biometric map of their faces against those held in a database.
This technology has been proven to work with less precision on women and people of colour, resulting in a higher proportion of “false positives.” In other words, if you are a woman or a person of colour you are more likely to be incorrectly matched to someone in a database by facial recognition technology than if you were a white man. This is of particular concern when such technology is used to inform decisions made by law enforcement and other state authorities.
Beyond the higher risk of misidentifying non-male, non-white individuals, facial recognition is increasingly used to monitor protests. This means that those exercising their right to free assembly – including those fighting for social issues such as women’s rights – have a greater likelihood of being targeted by this intrusive technology. The deployment of such technology may deter individuals from attending and expressing themselves freely at such protests.
The Digital Divide
The “digital divide” was once used predominantly to describe the discrepancy between those who have access to the internet and those who do not. But it can also be understood as the widening chasm between the small minority who wield the potential of ever-more powerful technology, and the everyday person subject to its influence. Big Tech monopolises global markets, while states continue to grow their access to potentially dangerous technological tools.
The gender aspect of this “digital divide” should not be a surprise to anyone: Silicon Valley has a notorious “brogrammer” problem. To put this into perspective, consider that in 2018 Google’s workforce was only 30 per cent women (the company fared even worse when it came to other diversity figures), or that in 2016 six of the ten biggest tech companies did not have a single female executive.
The so-called digital welfare state, which refers to the growing use of automated decision-making in social services, also spells trouble for gender equality. In her book “Automating Inequality”, Virginia Eubanks argues that the digital welfare state leaves society’s most vulnerable as particular targets of AI and algorithms that are inherently prejudiced or in breach of human rights. Given that, in many countries, women in general, as well as doubly marginalised groups such as single mothers, are significantly more likely to experience poverty, the digitisation of welfare affects them disproportionately.
Intersectionality
Of course, most of these issues cannot and should not be isolated as gender-specific. The overlapping and compounding of prejudice across multiple social categories is an irrefutable feature of social inequality in today’s world.
Recently, I spoke at an event in Berlin that asked: is AI sexist? My answer was yes, absolutely. But what about it being racist, ableist, and hetero-centric? We need to broaden the question when we discuss bias in technology. And that goes for the multitude of other equality problems in the field, too.
At DFF, we recently decided to begin a process of decolonisation in the digital rights field. Through this, we hope to go some way towards challenging troubling assumptions and prompting bigger, structural change.
Of course, by homing in on the challenges, we may sometimes overlook causes for optimism. Indeed, it should be clear to anybody working in the digital rights field that we are surrounded by trailblazing female role models. It’s also heartening that many of us are so ready and willing to work on bringing about change.
But this International Women’s Day remains an apt moment for reflection on the fact that systemic problems still do exist – and it’s the responsibility of all of us to continue tackling them.