As We Fight Climate Change, Let’s Not Forget Digital Rights

By Nani Jansen Reventlow, 4th June 2020

View of green mountains and sky

It is a sad reality that World Environment Day, which takes place this Friday, June 5th, can hardly be considered a joyous celebration of the world around us. Rather, the day serves as yet another much-needed call to action to tackle pressing environmental problems.

As the climate catastrophe looms, environmental issues will inevitably overlap with digital rights on multiple planes. Climate change has grave implications for human rights, and many of those rights relate to privacy, surveillance, AI, data, and the free flow of information – all of which fall under the umbrella of digital rights.

Nobody can deny that we are in dire need of innovative solutions, including technological ones, to tackle this unprecedented crisis. But at the same time, we must ensure that new technologies respect our human rights.

The COVID-19 pandemic has thrown into sharp relief the tendency to let respect for human rights slide in emergency circumstances. But such a situation is not sustainable and can have grave consequences. That’s why it is absolutely crucial that we don’t sacrifice people’s fundamental rights and liberties as the fight against climate change intensifies.

Governments today typically respond to crises by rolling out measures that increase the surveillance and monitoring of populations. Data can certainly help in assessing situations more accurately, but it also increases government control and threatens people’s privacy and freedoms.

As time ticks on, and climate change renders more areas unliveable or induces conflict, more and more people are being forced to flee their home countries. Asylum seekers and refugees are often at particular risk when it comes to their digital rights: states often deploy measures such as biometric data collection and geo-tracking to identify them, track their movements, and conduct “security checks”. As a vulnerable group often without a state willing to defend their interests, asylum seekers are particularly at risk as climate change worsens.

Another group at high risk of digital rights breaches is environmental activists. Defending the environment has become one of the most dangerous jobs out there: many activists are subjected to harassment and violence, and some have even been killed. Surveilling environmental activists or hacking their devices is a common way to keep tabs on them or suppress their work, which often runs counter to powerful interests. Activists must not only be protected, but encouraged and supported to continue their invaluable work.


One of the core UN sustainable development goals is achieving open and secure internet connectivity for everyone. These days, access to the internet is a crucial equality issue: it is how we communicate and disseminate information. Leaving certain pockets of the world locked out is therefore a growing human rights problem. Internet shutdowns, for example, tend to signal human rights violations.

The thing is, it’s not just about having access to the internet: it’s also about having access to an internet that supports the free flow of information. When certain information is censored, filtered or blocked, people’s rights are violated, and inequality is exacerbated. That is why human rights must be built into initiatives to roll out internet connectivity, and why schemes carrying out this important job must be subject to the requisite human rights assessments.

Digital Design

While many digital technologies are built with the explicit goal of improving sustainability, technology is not universally sustainable: far from it, in fact, with many digital tools themselves being significant drivers of climate change. Mining bitcoin, for example, generates enormous amounts of carbon dioxide, while data centres gobble electricity.

Our devices, from smartphones to laptops, consume vast amounts of energy. With tech giants offering us shiny new products each year, we’re also consuming greater numbers of electronics and disposing of them, leading to a lot of unnecessary “e-waste”. This isn’t always the consumer’s fault, of course – nowadays, many devices are built to break.

As well as this, new tools that seek to increase sustainability may threaten digital rights in other respects. Precision agriculture, for example, is being hailed as a sustainable saviour, but it relies on artificial intelligence, which can pose challenges for digital rights due to its opacity and potential for bias.

This World Environment Day, we will hopefully be able to make advances and consider new and cutting-edge solutions to the climate crisis. But as we move forward and put the environment at the top of our priorities, let’s not overlook digital rights. By considering both, we can work towards a more sustainable and equitable world.

Why COVID-19 is a Crisis for Digital Rights

By Nani Jansen Reventlow, 16th April 2020

Street art of a black face mask with "COVID-19" writing

The COVID-19 pandemic has triggered an equally urgent digital rights crisis.

New measures being hurried in to curb the spread of the virus, from “biosurveillance” and online tracking to censorship, are potentially as world-changing as the disease itself. These changes aren’t necessarily temporary, either: once in place, many of them can’t be undone.

That’s why activists, civil society and the courts must carefully scrutinise questionable new measures, and make sure that – even amid a global panic – states are complying with international human rights law.

Human rights watchdog Amnesty International recently commented that human rights restrictions are spreading almost as quickly as coronavirus itself. Indeed, the fast-paced nature of the pandemic response has empowered governments to rush through new policies with little to no legal oversight.

There has already been a widespread absence of transparency and regulation when it comes to the rollout of these emergency measures, with many falling far short of international human rights standards.

Tensions between protecting public health and upholding people’s basic rights and liberties are rising. While it is of course necessary to put in place safeguards to slow the spread of the virus, it’s absolutely vital that these measures are balanced and proportionate.

Unfortunately, this isn’t always proving to be the case.

The Rise of Biosurveillance

A panopticon world on a scale never seen before is quickly materialising.

“Biosurveillance”, which involves the tracking of people’s movements, communications and health data, has already become a buzzword used to describe certain worrying measures being deployed to contain the virus.

The means by which states, often aided by private companies, are monitoring their citizens are increasingly extensive: phone data, CCTV footage, temperature checkpoints, airline and railway bookings, credit card information, online shopping records, social media data, facial recognition, and sometimes even drones.

Private companies are exploiting the situation and offering rights-abusing products to states, purportedly to help them manage the impact of the pandemic. One Israeli spyware firm has developed a product it claims can track the spread of coronavirus by analysing two weeks’ worth of data from people’s personal phones, and subsequently matching it up with data about citizens’ movements obtained from national phone companies.

In some instances, citizens can also track each other’s movements, leading not only to vertical but also horizontal sharing of sensitive medical data.

Not only are many of these measures unnecessary and disproportionately intrusive, they also give rise to secondary questions, such as: how secure is our data? How long will it be kept? Is there transparency around how it is obtained and processed? Is it being shared or repurposed, and if so, with whom?

Censorship and Misinformation

Censorship is becoming rife, with many arguing that a “censorship pandemic” is surging in step with COVID-19.

Oppressive regimes are rapidly adopting “fake news” laws. This is ostensibly to curb the spread of misinformation about the virus, but in practice, this legislation is often used to crack down on dissenting voices or otherwise suppress free speech. In Cambodia, for example, there have already been at least 17 arrests of people for sharing information about coronavirus.

At the same time, many states have themselves been accused of feeding disinformation to their citizens to create confusion, or of arresting those who express criticism of the government’s response.

As well as this, some states have restricted free access to information on the virus, either by blocking access to health apps, or cutting off access to the internet altogether.

An all-seeing, prisonlike panopticon

AI, Inequality and Control

The deployment of AI can have consequences for human rights at the best of times, but now, it’s regularly being adopted with minimal oversight and regulation.

AI and other automated learning technologies are the foundation for many surveillance and social control tools. Because of the pandemic, they are increasingly relied upon to fight misinformation online and to process the huge increase in applications for emergency social protection, which are, naturally, more urgent than ever.

Prior to the COVID-19 outbreak, the digital rights field had consistently warned about the human rights implications of these inscrutable “black boxes”, including their biased and discriminatory effects. The adoption of such technologies without proper oversight or consultation should be resisted and challenged through the courts, not least because of their potential to exacerbate the inequalities already experienced by those hardest hit by the pandemic.

Eroding Human Rights

Many of the human rights-violating measures that have been adopted to date are taken outside the framework of proper derogations from applicable human rights instruments, which would ensure that emergency measures are temporary, limited and supervised.

Legislation is being adopted by decree, without clear time limitations, and technology is being deployed in a context where clear rules and regulations are absent.

This is of great concern for two main reasons.

First, this type of “legislating through the back door” – passing measures that are not necessarily temporary while avoiding a proper democratic process of oversight and checks and balances – results in de facto authoritarian rule.

Second, if left unchecked and unchallenged, this could set a highly dangerous precedent for the future. This is the first pandemic we are experiencing at this scale – we are currently writing the playbook for global crises to come.

If it becomes clear that governments can use a global health emergency to instate rights-infringing measures without being challenged, or without having to reverse them – making them permanent instead of temporary – we will essentially be handing authoritarian regimes a blank cheque to wait until the next pandemic and impose whatever measures they want.

Therefore, any and all measures that are not strictly necessary, sufficiently narrow in scope, and of a clearly defined temporary nature need to be challenged as a matter of urgency. If they are not, we will be unable to push back against an otherwise certain path towards a dystopian surveillance state.

Litigation: New Ways to Engage

In tandem with advocacy and policy efforts, we will need strategic litigation to challenge the most egregious measures through the court system. Going through the legislature alone will be too slow and, with public gatherings banned, public demonstrations will not be possible at scale.

The courts will need to adapt to the current situation – and are in the process of doing so – by offering new ways for litigants to engage. Courts are still hearing urgent matters, and questions concerning fundamental rights and our democratic system will fall within that remit. This has already been demonstrated by the first cases requesting oversight of government surveillance deployed in response to the pandemic.

These issues have never been more pressing, and it’s abundantly clear that action must be taken.

At DFF, we’re here to help. If you have an idea for a case or for litigation, please apply for a grant now.

Images by Adam Niescioruk on Unsplash and I. Friman on Wikipedia

Turning Words Into Action: What Happens After the Strategy Meeting?

By Nani Jansen Reventlow, 12th March 2020

Coloured sticky notes from the strategy meeting

This February, we hosted our third annual strategy meeting, bringing together a group of 60 actors from across Europe and beyond to discuss recent developments in digital rights, strategise on new initiatives, and also reflect on how we can better work together as a field.

Besides offering an opportunity for participants to interact in person with their peers, the strategy meeting sparks new action and informs DFF’s activities. How? Let’s take a look.

Focus Areas for Grants and Support

DFF’s thematic focus areas were determined based on the priorities expressed by the field. Following a strategy process that commenced before DFF’s formal founding, the first annual strategy meeting was an opportunity to collectively take stock of the current state of play for digital rights in Europe and of the priorities of key actors in the field.

The mappings and conversations at the meeting were then distilled by the DFF team into three thematic focus areas for its work: privacy and data protection; the free flow of information online; and transparency, accountability and adherence to human rights standards in the design and use of technology.

Whether these thematic focus areas continue to reflect the field’s needs and priorities is tested on a continuous basis, through individual exchanges, litigation meetings, and developments in DFF’s grantmaking work. But the annual strategy meeting remains a key moment to test and verify that we are still providing support to the field with the right focus areas in mind.

Skill Building and Litigation Sessions

In 2018, we organised two strategic litigation retreats together with SHARE Foundation to help litigators sharpen their strategic litigation skills and develop case ideas into a concrete strategic litigation plan.

During the strategy meeting that year, participants identified a need for more skill-building and -sharing around strategic litigation: people wanted to exchange ideas with peers about specific casework and learn from each other.

Following the strategy meeting, a number of follow-up calls and exchanges with members of our network helped determine in more detail what would be most useful, and the programme for the retreats was created on that basis. The two retreats in 2018 were positively received, with requests for new instalments, which DFF plans to organise in the fall of 2020.

A number of cases also sprang from the retreats, some of which DFF was given the opportunity to support.

Strategy meetings have also led to litigation meetings focused on specific thematic work. The GDPR meeting organised by Access Now and noyb in May of 2019, sponsored by DFF, is an example, as is the training on using the competition law framework to advance digital rights, held in December 2019. DFF’s current work on developing a litigation strategy on the “digital welfare state” follows from the consultation UN Special Rapporteur Philip Alston held for his thematic report on the issue. We hosted this consultation as a side event to the 2019 strategy meeting.

The need to focus litigation on the negative human rights impact of artificial intelligence and automated decision-making also followed from that strategy meeting and, in addition to a litigation meeting co-organised with the AI Now Institute in November 2019, saw concrete follow-up during the most recent meeting.

Resources for Strategic Litigators

Several resources have been developed based on conversations at the strategy meetings. The Model Ethical Funding Policy followed from discussions on the problematic aspects of funding digital rights work. A Short Guide to Competition Law for Digital Rights Litigators was created after the report-out from a small group discussion explicitly requested more training and resources on using the competition law framework to advance digital rights.

We continue to listen to discussions during our strategy meetings to get a sense of what resources or tools we can help build to better support the field.

Strategic Cases

Last, but definitely not least, the strategy meetings have been an opportunity for litigators to connect and explore ways to collaborate on cases, either by sparking new ideas for future work or finding allies for existing projects.

One piece of cross-border litigation was sparked by the 2018 strategy meeting, when representatives from Gesellschaft für Freiheitsrechte in Germany and in Austria had a conversation about the EU Passenger Name Record Directive. That conversation led to joint action to challenge the Directive on data protection grounds, which you can read more about here.

The NGOs leading the challenge to the SyRI risk assessment system in the Netherlands, PILP and Platform Bescherming Burgerrechten, found an ally for their case in 2019, when they invited UNSR Philip Alston to submit an amicus brief. With success: the case was recently won.

Another example is the group of organisations challenging the AdTech industry, who updated the field on their ongoing work and held space for multiple strategising conversations on the topic at the 2019 strategy meeting, which then fed back into the litigation work they were developing.

What’s Next?

The February strategy meeting ventured into previously unexplored territory on a number of fronts, among other things by looking at what we could learn from work in other regions, such as Latin America, and by exploring parallels in tactics with climate change litigation.

However, the conversations at the meeting also showed that, overall, the thematic priorities of the field remain focused in the same areas as before. This makes sense: these are not minor, short-term issues, but ones that require a sustained and concerted effort to tackle.

The thematic conversations on, amongst others, AI and human rights and the digital welfare state will feed into the dedicated convenings that will take place on these topics in the spring and summer. Further activities will surely follow from that.

And what about cases? We are certain new collaborations have emerged and fresh ideas were brought to existing projects. We look forward to highlighting them here in the months to come!