The First Steps to Decolonise Digital Rights

By Sarah Chander, 24th September 2020

In early 2020, DFF and its project partner EDRi began their joint work to initiate a decolonising process for the digital rights field in Europe. How does this fit into the current landscape of digital rights and recent developments in the movement for racial and social justice? And what have we been up to these past months?

The Social Dilemma and other pop-culture portrayals have brought the tech industry into even sharper focus. One major concern is that this industry is a tiny, unrepresentative, yet powerful minority with the power to shape the everyday experiences of almost the entire world. We should ask ourselves whether the same is true of many of the communities working to contest Big Tech’s dominance.

We are in a moment of global shifts. Whilst racial oppression and global inequality are by no means novel phenomena, the wave of global #BlackLivesMatter protests this summer has many of us reflecting on social and racial justice with respect to our organisations, our work, and even ourselves.

Zooming in on the digital rights field, a reckoning of the need for change has also been bubbling for a long time. Following some initial discussions, DFF and European Digital Rights (EDRi) teamed up to explore how we might go about developing a process for decolonising the digital rights field. This blog is an update on our work at these early stages.

Why “decolonise” the field?

This process grew out of a number of observations about the digital rights field in Europe. Over the years, we have seen a lack of representation of the people we seek to protect from harm. This undoubtedly impacts our work – specifically, the extent to which the field is equipped to meaningfully defend the rights of all.

This overwhelming lack of representation in our movement matters. The (near) absence of those who are racialised, gendered, queer and trans, working class, differently-abled, and hailing from the global south, affects our work and our perspectives more than we know. It is a flaw, a weak spot. It compromises our ability to make good on a commitment to uphold the digital rights of all in society. This absence shows up in a number of ways.

One way this unfolds is in an assumption of universality with respect to the holder of digital rights – the ‘user’ of technology. Who do we envisage when we discuss defending the rights of the ‘user’? Generally, we don’t envisage anyone in particular: a figure neutral as to life circumstances and personal characteristics.

However, often when we assume universality, objectivity, and neutrality, what we do is embed dominance; we focus on the experiences that are most represented and similar to what we know. We centre our work on the assumption – in the words of Dr Ruha Benjamin – that there is a “universal user who is not racialized, is not marked by gender, sex, or class.”

This universality is a myth. Instead, taking a variety of specific lenses will illuminate how technology can in effect deepen and exploit a range of social, economic, and political vulnerabilities.

Missed opportunities

The issue of representation also limits our perspectives and our potential. In particular, the European digital rights field’s limited engagement with counterparts in the global south means we miss out on the learning we need to understand the vast array of ongoing digital harms – their global impact, their context, and their place in our collective histories.

As we contest the extractive nature of surveillance capitalism, we would gain much from placing our fight in a much longer trajectory of colonialism, with data relations a new commodity of our time.

More practically, this problem shows itself in the fruits of our labour. Even our most pivotal tools do not have answers to the most serious issues facing marginalised communities.

So far, data protection and privacy rights, including the GDPR, have been of limited use in protecting against group-based threats and discriminatory algorithmic profiling – for example, for those who may be overpoliced as the result of predictive policing tools.

So the mechanism works for the individual who is informed and in a position to make their individual rights actionable, but less so for others, for whom ‘data protection’ was not modelled. Just as we speak about harmful technologies as the result of skewed design, this argument applies to our legal tools too.

These examples show us that the need for change goes beyond the need to simply adjust the composition of people in the room. It’s about ensuring that all of our stories, realities and the ‘multiple kinds of expertise’ we bring are reflected and valued in the field – its tools, its work, its approach. There is a growing, intuitive knowledge that a change is overdue for the digital rights field.

First steps

How to go about something that sounds so huge? So far, we have cautiously approached questions around a decolonising process for the digital rights field. What does it mean? How may we achieve it? Who else do we need to be talking to?

Taking baby steps, we started by speaking to organisations, collectives, activists, and others currently outside the digital rights field to understand how they engage with digital rights issues. How do organisations working on workers’ rights, LGBT rights, anti-racism, or disability rights see digital rights and the field itself? Do they see the links with their own work? How do they understand the concept of decolonisation? What processes of change have they seen work?

So far, the project has been met with encouraging levels of enthusiasm. Over 30 individuals and organisations have taken the time to discuss these questions with us. What we already see is that there is huge appetite and potential among activists and civil society working outside of the digital rights field to engage, to learn more, and to establish connections between their work and “digital rights”.

We’ve discussed and questioned many things – from the colonialism of the human rights framework and its relation to justice, to the limits of the “diversity and inclusion” approach. This thinking will feed into our further work.

Now we are starting conversations with the digital rights field. We hope to get a picture of the different visions, sensitivities, and understandings within the field. What is the impact of representation on our work? How may we address this? At what stages of the process are different actors in the field?

The next step is to bring interested stakeholders together in an (online) gathering to get insight into how we may go about designing a decolonising process for the digital rights field. We are excited: this will be the first time those who have interacted with the project come together, and we hope to benefit from the range of different perspectives, build on what we have already learned, and develop concrete next steps for the design process.

We know that this project itself cannot dismantle the world’s uneven power relations – but we hope to do what we can from our corner.

Sarah Chander is Senior Policy Advisor at European Digital Rights (EDRi). She leads EDRi’s policy work on artificial intelligence and connecting digital rights with wider movements for equality and justice.

Challenging the Google-Fitbit Merger through Competition Law

By Ioannis Kouvakas, 17th September 2020

[Image: a Fitbit-like watch with the Google logo on the screen]

At Privacy International, we’re concerned by the market trend of companies accumulating vast amounts of data for exploitation by buying other companies that already have vast amounts of data primed for exploitation.

This allows these larger firms to further entrench their market dominance, and to dictate future innovation built upon the exploitation of our data. Fortunately, competition regulators are starting to agree with us.

In November 2019, Google announced their plan to acquire Fitbit, a company that produces and sells health tracking technologies and wearables.

The proposed acquisition raised a series of concerns with regard to its implications for consumers’ data privacy rights, considering the vast amounts of sensitive personal data held by Fitbit, which can include details of individuals’ heartbeats, calorie intake, walking distances, sleeping patterns, and health conditions.

At the same time, the transaction poses the more general question of whether data should form part of competition regulators’ assessment of acquisitions.

Two regulators, the Australian Competition and Consumers Commission (ACCC) and the European Commission’s Directorate-General for Competition, have indicated in their preliminary findings that they agree there is indeed a strong interplay between data and competition.

How Fitbit data boosts Google’s services – and their power

The exploitable value of personal data increases as more and more data is combined.

This incentivises companies to pursue business strategies aimed at collecting as much data as possible.

The acquisition of vast quantities of data is what allows companies like Google to make billions of dollars each year via targeted advertising. In 2019, for example, Google’s parent company, Alphabet, generated 83% of its $161.86 billion in revenue (roughly $134 billion) from targeted advertisements to users of its consumer-facing services, including the Android operating system, Google Search, YouTube, and many others.

A big part of Fitbit’s value is said to lie in the quality of the health data it possesses. The company’s technologies can track individuals’ daily steps, distance walked or travelled, calories burned, sleep patterns and heart rate. In 2018, Fitbit also introduced ‘female health tracking’ to track menstruation cycles and fertility windows.

Recently, Fitbit further increased its health-related database and tracking capabilities by acquiring a number of other actors in the health tracking and wearables market, including FitStar, Pebble, Vector and Twine Health. Some of these acquisitions brought with them partnerships with health insurers.

The importance of such a vast data holding is very well-recognised by tech giants like Google, who consistently seem to regard consumers’ data as a business asset.

This asset is all the more valuable when a digital service provider is able to combine data from multiple sources, including across multiple services or platforms. For example, a 2018 paper by academics at the University of Oxford outlined the prevalence of third-party trackers on almost 1 million apps from the US and UK Google Play stores. The researchers found that most apps contain third party tracking, with Google present on 87.57% of apps tested.
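
To make the measurement concrete, here is a minimal sketch of how tracker prevalence can be estimated: extract the hostnames an app references and match them against a curated list of known tracker domains. All domain mappings and app data below are invented for illustration – the Oxford study used its own, far larger dataset and tooling.

```python
# Minimal sketch: estimate how many apps reference known tracker domains.
# The tracker list and app corpus here are hypothetical examples; real
# studies use curated lists mapping hundreds of domains to parent companies.
TRACKER_DOMAINS = {
    "doubleclick.net": "Alphabet",
    "google-analytics.com": "Alphabet",
    "graph.facebook.com": "Facebook",
}

def trackers_in_app(referenced_hosts):
    """Return the parent companies of any tracker domains the app references."""
    return {
        company
        for host in referenced_hosts
        for domain, company in TRACKER_DOMAINS.items()
        if host == domain or host.endswith("." + domain)
    }

# Hypothetical corpus: app name -> hostnames extracted from its code
apps = {
    "news_app": {"cdn.example.com", "www.google-analytics.com"},
    "fitness_app": {"api.fitness.example", "stats.doubleclick.net"},
    "offline_app": {"localhost"},
}

with_tracking = [name for name, hosts in apps.items() if trackers_in_app(hosts)]
print(f"{len(with_tracking)}/{len(apps)} apps reference a known tracker")
# -> 2/3 apps reference a known tracker
```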

Taking into consideration both the amount and the sensitivity of Fitbit’s data, the proposed acquisition would further entrench Google’s existing significant market power in, among others, the search and digital advertising markets. It could do so by merging Fitbit’s customer data and/or datasets with those held by Google, allowing the latter to enrich its extensive datasets and detailed consumer profiles with sophisticated real-time data about individuals’ health conditions and needs, as well as general information about their daily behaviour and bodily rhythms.

In other words, the Fitbit data would provide Google with an opportunity to, among other things, better map general search queries originating, for instance, from an extremely specific geographic location, or to offer advertisers ever more valuable insights into specific audiences by allowing them to be targeted based on health conditions, activity levels, and emotional attributes.

Assessing a company’s data holdings is therefore not solely a matter for data protection regulators; it also needs to be considered by competition regulators in their assessment of mergers in the digital economy.

The interplay between data and competition

While concentrations of data might traditionally have been seen as falling outside competition regulators’ remit, we are part of an emerging consensus that they belong within the scope of an assessment of an undertaking’s market dominance. As the German competition authority (Bundeskartellamt) noted in the first decision of this nature, in February 2019, against Facebook:

Monitoring the data processing activities of dominant companies is […] an essential task of a competition authority, which cannot be fulfilled by data protection officers.

The German Federal Court of Justice (Bundesgerichtshof), which upheld the Bundeskartellamt‘s findings in its judgment of 23 June 2020, found that Facebook abuses its dominant position by withholding options for users to limit the use of their data for personalisation of both Facebook content and advertisements on third-party websites that use Facebook’s digital advertising tools. As such, privacy is explicitly recognised as a parameter of competition; effective competition in the social media network market would result in privacy safeguards for users, and it is within the remit of a competition authority to act in response to anticompetitive and privacy-infringing conduct.

In the context of its review of the Google/Fitbit merger, the ACCC published its Statement of Issues (SOI) in June 2020, outlining preliminary competition concerns. The SOI suggests that the vast amounts of personal data Google would gain access to were among the key factors considered by the regulator. For example, underlining the unique and sensitive nature of the data held by Fitbit, the ACCC notes:

The accumulation of additional, individual user data via this transaction in an entity which already benefits from substantial market power in multiple markets may contribute to reduced competitive outcomes in the future.

A similar approach to the importance of personal data in assessing the merger’s implications was taken by the European Commission. Following its initial investigation of the proposed acquisition, on 4 August the EU regulator announced the opening of a more extensive, in-depth review of the transaction. Among its primary concerns was that, by relying on Fitbit data, Google would gain “an important advantage in the online advertising markets”.

The way forward: advancing digital rights through competition law

The announcements by the two competition regulators discussed above should be welcomed as a progressive step to adapt competition law frameworks to digital economies of scale.

More importantly, these decisions could, first, pave the way towards a competition regime that better encompasses people’s rights – for example, by ensuring that effective competition extends to the privacy standards companies offer.

In data-intensive digital markets, companies that occupy dominant positions have very little incentive to adopt business models or practices that enhance consumers’ privacy.

Google’s acquisition of Fitbit would further reduce any competitive pressure on Google to compete on these non-price (i.e. quality, privacy) aspects, since the acquisition would further entrench Google’s dominance and preclude the possibility of another entity acquiring or partnering with Fitbit to compete with Google in this space. A competition intervention in this case could potentially prevent data exploitation practices (e.g. the use of so-called “dark patterns”, features meant to trick users into privacy-intrusive settings) that would impose more intrusive terms for data collection and have negative consequences for consumers.
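
As a concrete illustration of one such pattern, the sketch below shows consent defaults that are pre-set to the most data-intrusive options, so that protecting one’s privacy requires active effort from the user. It is a minimal, hypothetical example; the option names are invented and do not describe any real product.

```python
# Hypothetical consent defaults illustrating a "dark pattern":
# every non-essential option is pre-ticked, so the burden of
# opting out falls entirely on the user.
dark_pattern_defaults = {
    "essential_cookies": True,     # genuinely required for the service
    "ad_personalisation": True,    # pre-ticked: user must find and untick it
    "share_with_partners": True,   # pre-ticked: likewise
}

# A privacy-protective design inverts the defaults for non-essential
# options and asks the user to opt in explicitly.
privacy_respecting_defaults = {
    option: (option == "essential_cookies")
    for option in dark_pattern_defaults
}

print(privacy_respecting_defaults)
# {'essential_cookies': True, 'ad_personalisation': False, 'share_with_partners': False}
```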

Second, the approaches adopted by the ACCC and the European Commission might signal further opportunities for civil society to intervene and support regulators with their digital rights expertise. Such opportunities, arising also in the context of the Google/Fitbit merger, were actively discussed during DFF’s June 2020 virtual event, which focused on competition law and actions involving data.

Civil society involvement is contributing to the outcome of this debate. Privacy International has made submissions before both the ACCC and the European Commission, and has been granted interested third person status by the latter. Similarly, in July 2020, IDEC, the Brazilian Institute of Consumer Protection, officially requested that the Brazilian Antitrust Authority (CADE) formally scrutinise this merger in Brazil.

We welcome the opportunity to assist regulators in their review of this merger. We are hopeful that they will seize the unique opportunity they have to also advance people’s rights by sending a strong message against data exploitation practices that harm consumers’ well-being in the digital age.

Google-Fitbit is just one instance of what will otherwise become a growing trend. Both the ACCC and the European Commission are expected to announce their decisions by 9 December 2020. We have a lot of work to do between now and then.

Ioannis Kouvakas is a legal officer at Privacy International (PI). He also leads PI’s work on challenging data dominance.

Photo by Morning Brew on Unsplash

Here We Go Again: Our COVID-19 Litigation Fund Takes Two

By Thomas Vink, 16th September 2020

[Image: a smartphone displaying the DFF COVID-19 Litigation Fund, surrounded by floating viruses]

As the pandemic blazes on, DFF has reopened applications for its COVID-19 Litigation Fund.

The second round comes on the heels of the first, which saw applicants seeking to challenge digital rights violations across the spectrum, from the growing use of thermal scanners to the security of tracing apps.

In June 2020, DFF launched the COVID-19 Litigation Fund. The fund supports strategic litigation that challenges digital rights violations committed in the context of the COVID-19 pandemic.

We established this fund as it became increasingly evident that the pandemic was not only a public health crisis, but also a crisis for digital rights. The scope and nature of the digital rights violations that have been caused by responses to the COVID-19 pandemic are unprecedented and need to be challenged as a matter of urgency.

The fund aims to ensure that activists and litigators have the resources to start bringing legal challenges now to halt or limit the impact of digital rights-infringing measures during this time.

What cases are we supporting?

The cases we are currently supporting under the fund include challenges to the growing use of thermal scanning technology in the UK, data protection and privacy claims against the rolling out of COVID-19 apps across Europe, and litigation on the right of women to have access to online sexual and reproductive health information and services in Spain.

Check out the case study page on our website to find out more about the different cases we are supporting through the COVID-19 Litigation Fund.

Although many governments have ended the most restrictive lockdown measures, ongoing responses to the pandemic continue to threaten our digital rights. The list of these threats is long, but includes the growing use of facial recognition technology, the roll-out of immunity passports and travel apps, the use of tools to recognise whether people are wearing masks, the use of algorithms to prioritise hospital appointments, and tools to monitor quarantine breaches.

Strategic litigation is an important tool for upholding people’s right to privacy and data protection, ensuring equality of access to information online and pushing back when technology does not meet human rights standards.

Early litigation successes include the banning of surveillance drones in France and preventing the tracking of cell phones in Israel. More recently, in the UK, the threat of legal action helped force the government to scrap an unfair grading algorithm that was being used to provide students with academic qualifications in the absence of exams. In Brazil and Slovakia there have been important court rulings that limit the sharing of telecommunications data to track contacts of people infected with the virus.

A second call for applications

To help support more organisations in taking legal action against digital rights violations exacerbated by the pandemic, we have launched a second call for applications under the COVID-19 Litigation Fund. The second call will be open until 30 September, and grants will be contracted with successful applicants by December 2020.

Recognising that responses to the pandemic can have a particularly severe impact on groups that are already experiencing discrimination and marginalisation, DFF will prioritise applications that focus on addressing the negative impact felt by the most vulnerable groups in society.

Funding is not limited to digital rights organisations. We also encourage applications from other organisations that have encountered digital rights violations in the context of their work, such as health, social justice or welfare.

If you have a case challenging digital rights violations related to the COVID-19 pandemic, we encourage you to apply.