Q&A: The Road to More Women in Digital Rights

By Sahar Yadegari, 11th February 2021

A group of women sit around a long meeting table with open laptops

11 February marks International Day of Women and Girls in Science. In honour of this day, we caught up with Sahar Yadegari to discuss why we need women not only in tech, but in digital rights, too.

Firstly, Sahar, can you briefly introduce yourself and your work?

I joined VHTO – Expert Centre for Gender Diversity in STEM – in October 2020 as the new director, working with a team of 9 colleagues who are passionate about increasing the participation of girls and women in the tech sector.

We test evidence-based interventions in schools and, based on our findings, inform policymakers about policies that need to change to increase gender diversity in STEM fields (Science, Technology, Engineering and Mathematics). We develop and implement gender-inclusive coding programmes for teachers and students, and work with tech companies to support them in their diversity objectives.

What brought you, personally, to the issue of women and girls in technology? Why does it matter so much to you?

My interest in and commitment to this strand of work started a few years ago, while working at Adessium Foundation on the digitisation portfolio. I learned about how digitisation is completely transforming our societies and our way of life, making tech companies incredibly powerful. That kind of power comes with great responsibility, which means that we need to ensure that all groups in society are represented in the tech sector.

“That kind of power comes with great responsibility, which means that we need to ensure that all groups in society are represented in the tech sector”

In your opinion, why is it so important to have women and girls working not just in technology, but in digital rights?

Two reasons. Firstly, we’re not going to win the digital rights battle with a homogenous group of predominantly white men. The issues are far too complicated and all-encompassing, resources are extremely limited and the vested interests are far too great. To make any impact, we need socially diverse groups (in terms of gender, ethnicity and age) to outperform our opponents.

Secondly, examples like the widespread use of deepfakes to produce pornography targeting ex-girlfriends, or the excessive online harassment of female journalists and activists, demonstrate how women are disproportionately affected in this digital era. We need women’s voices included to shine a light on all of the issues that need to be addressed.

“…we’re not going to win the digital rights battle with a homogenous group of predominantly white men. The issues are far too complicated and all-encompassing”

It’s clear that women and girls, as individuals, deserve equal access to opportunities in these fields. But do you think that including more women in this realm has the potential to change, more fundamentally, how things are done? Is it a matter of changing perspectives, as well as changing the numbers?

The truth is that we don’t know yet: only 18% of IT specialists in the EU are women. Today, the tech sector consists predominantly of men.

But given the enormous role of tech in society, we have to ask ourselves: what kinds of inventions or applications are we missing out on now because of the underrepresentation of women in the tech sector? Would our tech ecosystem still look the same if the designers, developers and builders represented all groups in society?

In other fields, such as medicine, we have witnessed a better representation of women’s interests as the sector grew more diverse. That is a promising perspective.

Based on your work in this area, what are some of the most effective approaches to closing the gender gap in technology and digital rights?

Closing the gender gap requires a number of interventions at a very early age. There are several steps: embedding inclusive programming and coding education in primary schools, and demystifying tech throughout the education system. As long as coding is not an integral part of the curriculum, boys will benefit more from informal coding workshops and will enter university with more pre-existing knowledge, a gap that’s hard for girls to close.

“Closing the gender gap requires a number of interventions at a very early age”

We need an inclusive learning environment that focuses on increasing the self-confidence of girls (as they tend to underestimate their performance in STEM fields) and on fostering a ‘growth mindset’ (every kid can become stronger in maths if they put in the effort; it’s not a fixed given).

Tackling gender stereotyping by parents and teachers is incredibly important, as is the presence of role models: you can’t be what you can’t see.

Who, in particular, do you believe has a responsibility to make these important steps happen?

We all have a responsibility: parents can set the right example for their kids starting today. It’s quite shocking how soon boys tend to develop better spatial skills than girls, simply because they’re nudged more towards playing video games or playing with LEGO.

But obviously companies and government have a crucial role to play as well. Making inclusive coding education available in schools is something the Dutch government still hasn’t achieved. And tech companies need to seriously commit to increasing diversity, not just because it’s good for sales and revenue, but also because it will increase their added value to society.

Since you began working in this area, have you seen any notable shifts take place? What has changed in recent years, for better or worse?

I only have four months to reflect on, but I do think that the MeToo and Black Lives Matter movements, as well as the issues of disinformation and election manipulation, have contributed to a more fertile environment for addressing the lack of diversity in the tech sector. The key challenge right now is to use this momentum and promote meaningful policy changes.

“The key challenge right now is to use this momentum and promote meaningful policy changes”

How do you envision a future for women and girls in STEM, 10 years from now?

Obviously much higher numbers. Why not aim for 50% female IT specialists in the EU by 2031? But most importantly, I would hope to see women’s perspectives and interests reflected in the types of tools and technologies that we find useful and acceptable in our society.

Photo by Christina @ wocintechchat.com on Unsplash

New Format, New World: Our Strategy Meeting 2021

By Nani Jansen Reventlow, 9th February 2021

A DFF tote bag reading 'Digital rights are human rights' and a face mask reading the same

In light of a very different year, DFF is preparing for a very different annual strategy meeting.

As we get ready to kick off the fourth instalment of our yearly gathering, we’re reckoning not only with a new format, but with a rapidly changing digital rights landscape.

Since the first meeting DFF hosted in Berlin in 2018, we’ve been lucky to welcome an ever-growing number of participants, from increasingly varied organisations and geographical locations, to jointly discuss current issues in digital rights and collaboratively plan and strategise for the months ahead.

And while we are luckily seeing this trend continue for our 2021 meeting next week, where we’ll be welcoming our largest number of new participants yet, this meeting, like everyone’s meetings and conferences at the moment, will be online.

Instead of huddling together in a hip Berlin bar for welcome drinks, we’ll now endeavour to mix some cocktails and mocktails at home and dance to the DJ’s tunes in our respective living rooms.  

Digital rights organisations suddenly had to divert their attention towards a stream of new threats, such as invasive “Corona apps” or governments abusing lockdown emergency laws

These changes notwithstanding, we are very excited to virtually come together with so many old and new friends and collaborators next week – there certainly is a lot for us to discuss, strategise and share information on.

In 2020, the context for digital rights was dominated by the COVID-19 pandemic. As I wrote in April last year, the pandemic also posed a crisis for digital rights.

Digital rights organisations suddenly had to divert their attention towards a stream of new threats, such as invasive “Corona apps” or governments abusing lockdown emergency laws to expand digital surveillance. Thermal scanners were brought into airports, workplaces and schools; artificial intelligence was used to allocate health resources; and there were reports of health apps or digital certificates becoming mandatory to access food and medicine.

Organisations across the digital rights field and beyond have been actively advocating for governments to tackle the pandemic in a way that ensures the use of digital technologies is in line with human rights.

There were some positive examples, such as when Germany developed an open-source contact tracing app that does not track users’ locations or store data in a centralised location.

There were some positive examples, such as when Germany developed an open-source contact tracing app that does not track users’ locations

However, indications are that examples like this are the exception, rather than the rule. As governments reduced lockdown restrictions over the summer, new risks started to emerge with the introduction of measures that increased inequalities related to freedom of movement, access to public spaces, and the ability to work and access essential services.

The first COVID-19 vaccines, approved in late 2020, will hopefully bring relief for some countries in the course of 2021, but this, too, could pose further threats to digital rights. One example is the uptake of digital vaccination certificates that may be required to travel or access certain places and services, which could risk further marginalising millions of people who do not have access to smartphones or who live in places with delayed access to vaccinations.

In circumstances that were unprecedented for most, in which home working, schooling and care duties had to be combined, the digital rights field has been impressively quick to rise to the challenge. Even in the early stages of the pandemic there were already examples of strategic litigation being used to halt digital rights violations, including a successful challenge of cell phone tracking in Israel and a ban on the use of surveillance drones in France.

Even in the early stages of the pandemic there were already examples of strategic litigation being used to halt digital rights violations

In the UK, the threat of judicial review was enough for the National Health Service to admit it had begun its coronavirus test-and-trace programme without carrying out a privacy assessment, and for the UK government to stop using an unfair grading algorithm.

We expect digital rights issues as they intersect with health and education to be high on the agenda next week, as well as the need to make progress with a decolonising process for the digital rights field, a topic of discussion at previous DFF strategy meetings on which initial steps were taken last year. More generally, we hope that participants will be able to bring a decolonising lens to many of the digital rights topics we’ll discuss, be it data protection, platform power or e-government services.

Other topics that will be central to the field’s work this coming year are the follow-up to the European Commission’s “white paper on artificial intelligence” and its proposals for the new Digital Services Act and Digital Markets Act.

Facial recognition technology promises to be an ongoing area of focus in 2021. In July, the UN Special Rapporteur on contemporary forms of racism, E. Tendayi Achiume, published a report assessing how emerging digital technologies perpetuate racial discrimination on a structural level. The call to ban technologies with a clear discriminatory impact, such as facial recognition, continues.

In that context, it will be interesting to see how we can build on the victory Liberty obtained in the UK in August 2020, with the courts ruling that South Wales Police’s use of facial recognition technology violated privacy rights, data protection and equality laws.

…we have dedicated space on the agenda to reflect on well-being and share best practices for resilience in these strange times

Mindful of the fact that we are fighting all these battles (and more!) in circumstances that are challenging to us as organisations, teams, and individuals, we have dedicated space on the agenda to reflect on well-being and share best practices for resilience in these strange times.

We look forward to connecting with many of you next week and will work very hard to make it an inspiring and energising experience, even online!

Inside Your Head: Defending Freedom of Thought

By Susie Alegre, 6th February 2021

A black and white image of lines that look like a brain lit up in different areas

When I first read about behavioural micro-targeting by Cambridge Analytica in January 2017, it struck me that the human rights at stake went beyond privacy and data protection.

To me, the idea that someone could use my data to reach inside my head, understand how I might be thinking or feeling and use that information to press my psychological buttons for political purposes was chilling. It was not my privacy that was at stake; it was my right to freedom of thought.

But when I started looking for cases and academic analysis on the right to freedom of thought in the digital age, I found almost nothing. So I set out to change that by publishing an article in the European Human Rights Law Review exploring the parameters and application of the right in the era of Big Data.

The more I looked into it, the more it became clear that the problem went way beyond political behavioural micro-targeting: much of what Shoshana Zuboff has called “surveillance capitalism” raises serious questions about freedom of thought. 

…when I started looking for cases and academic analysis on the right to freedom of thought in the digital age, I found almost nothing

A DFF pre-litigation research grant allowed me to build on this work in 2019-20, exploring the potential for using the right to freedom of thought to strengthen concrete strategic litigation on digital rights.

What is the right to freedom of thought and why is it useful?

The right to freedom of thought is protected by Article 18 of the International Covenant on Civil and Political Rights (ICCPR), Article 9 of the European Convention on Human Rights (ECHR) and Article 10 of the European Union (EU) Charter of Fundamental Rights, and is closely related to the right to freedom of opinion, which is protected alongside freedom of expression in those instruments.

The right has two aspects:

  • the internal aspect – what I think in the privacy of my own head; and
  • the external aspect – how I express or manifest my thoughts and beliefs.

There are three main planks to the protection provided by the right to freedom of thought:

  • The right not to reveal our thoughts.
  • The right not to have our thoughts manipulated.
  • The right not to be punished for our thoughts.

All three are potentially relevant in the digital age, where algorithmic processing of Big Data is relied on to profile and understand individuals’ thought processes in real time, for the purpose of targeting them with tailored advertising and other content or informing automated decisions that can affect their lives.

…algorithmic processing of Big Data is relied on to profile and understand individuals’ thought processes in real time

Most rights, like privacy and freedom of expression, can be limited in human rights law, for example in the interests of health or security. But the internal aspect of the right to freedom of thought is absolute. This makes it a particularly powerful legal tool, because if something amounts to an interference with the right, no justification can make it lawful.

Scoping out what an interference with freedom of thought would look like in the digital space is, therefore, helpful for identifying what can never be justified in human rights law.

How can it be used in strategic litigation?

The right to freedom of thought does not necessarily need to be used as a standalone right in strategic litigation. As these arguments are novel, it may be better to use the right to bolster arguments on the protection of personal data or the right to private life, whether in courts or before supervisory authorities charged with enforcing the General Data Protection Regulation (GDPR) or other relevant regulatory bodies.

GDPR and EU Law

The right to freedom of thought in the EU Charter can be used as an interpretative tool for the GDPR or other laws implementing EU legislation. The Court of Justice of the European Union (CJEU) has made clear that “[s]ince the fundamental rights guaranteed by the Charter must […] be complied with where national legislation falls within the scope of European Union law, situations cannot exist which are covered in that way by European Union law without those fundamental rights being applicable. The applicability of European Union law entails applicability of the fundamental rights guaranteed by the Charter.”

Article 10 of the EU Charter can be used in submissions on the interpretation and application of the GDPR before supervisory authorities or courts where EU law is relevant.  It is particularly relevant to the question of lawfulness. A failure to interpret the GDPR in light of Charter rights could give rise to a request for referral to the CJEU to clarify the scope and application of the right in a way that would have impact across the EU.

As an example of the way these arguments could be developed, I published a legal opinion on the application of the rights to freedom of thought and mental integrity designed to supplement a complaint to the CNIL (the French Data Protection Authority) by Privacy International on data sharing by mental health websites.

ECHR and Constitutional Rights

The right to freedom of thought is also reflected in many domestic constitutional or human rights frameworks and may be used in national human rights or constitutional challenges.

Domestic human rights challenges may ultimately lead to freedom of thought litigation before the European Court of Human Rights

An example of this was the use of the right to “ideological freedom” in the Spanish Constitution to challenge the legality of Spanish data protection law on the collection and use of personal data by political parties. Domestic human rights challenges may ultimately lead to freedom of thought litigation before the European Court of Human Rights to clarify the scope of rights in the ECHR in a way that could have impact across Europe.

What happens next?

The Digital Freedom Fund pre-litigation grant is hopefully just the beginning. Those interested can see a summary report of my research, which provides more information on ways to use freedom of thought in potential strategic litigation and possible fact patterns. And the legal opinion on mental health websites is published and available for anyone who wants to challenge the type of practices revealed in the Privacy International report “Your Mental Health for Sale.”

It has been a really exciting year, reaching out to so many digital rights activists and organisations to advance these arguments and find ways of using them to make digital rights real. I hope this project will serve to inspire new cases and new arguments to strengthen strategic litigation. If you want to find out how these arguments could fit into your work, please do get in touch.

Susie Alegre is an international human rights lawyer, author and barrister at Doughty Street Chambers, specialising in the right to freedom of thought and technology.

Related links:

(ENG) Freedom of Thought: Findings, Legal Analysis and Next Steps

(ESP) La libertad de pensamiento: Hallazgos, análisis legal y próximos pasos

(ENG) Legal Opinion: The Right to Freedom of Thought and Data Sharing from Mental Health Websites

(ESP) Opinión: Del derecho a la libertad de pensamiento