What Decolonising Digital Rights Looks Like

By Aurum Linh, 6th April 2020

Decolonisation is core to all of our work as NGOs and non-profits. We are striving to create a future that is equitable and just. To do that, we need to dismantle the systems of racism, anti-blackness, and colonisation embedded in every aspect of our society.

This is a particularly urgent conversation to have in the digital rights field, given the belief that technology will liberate us from these biases. In reality, we can see that it is deepening these divides and automating these systems of oppression.

We can’t decolonise something if we don’t know what colonisation is. In her TEDx talk, “Pedagogy of the Decolonizing”, Quetzala Carson explains what colonisation is and how deeply it is embedded in nearly every aspect of our lives: “Colonisation is when a small group of people impose their own practices, norms and values. They take away resources and capacity from indigenous people, often through extreme violence and trauma.”

Quetzala goes on to explain that colonisers also bring their axiology, which is how things are quantified and how value and worth are ascribed to things. They impose their assessment of the value of the people, resources and land, and that assessment becomes embedded in the institutions that then create the nation-state in settler colonialism. All of the established laws, policies, institutions, and governance structures are based on those beliefs that were brought upon contact.

They impose their assessment of the value of the people, resources and land that become embedded in the institution

How we conceive and transfer knowledge, as well as what knowledge we see as credible and valid (known as epistemology), is also based on these colonial beliefs. How we exist within these structures and how we interpret reality (known as ontology) is deeply influenced by colonisation as well. Axiology, epistemology, and ontology all come together for the state to create a narrative – to define what counts as “normal”.

This is why it’s so uncomfortable and painful to have these conversations – because these structures and beliefs have been embedded in our own hearts and minds. Having these beliefs challenged feels like an attack on our own being, but we have to remember that they were taught to us and embedded within us by design.

Nikki Sanchez, an indigenous scholar, describes decolonisation as: giving up social and economic power that disempowers, appropriates, and invisibilises others; dismantling racist and anti-black structures; dismantling the patriarchy; finding out how you benefit from the history of colonisation and activating strategies that allow you to use your privilege to dismantle that benefit; and building and joining communities that work together to build more equitable and sustainable futures.

Decolonising must first happen within ourselves – decolonising our own hearts and minds

Decolonising must first happen within ourselves – decolonising our own hearts and minds. We need to actively combat and resist systems of oppression both in the world around us and within ourselves.

DFF held a strategy meeting in February of this year, which included two sessions on decolonisation (one of which I facilitated). Those sessions resulted in the following strategies being shared:

  • Unlearn and re-educate yourself.
  • Acknowledge your privilege and use it to dismantle the system from that position of power within the system.
  • Actively start conversations with people about privilege, decolonisation and anti-racist work.

At the organisational level, how can you give people tools to reflect and engage with this concept in a meaningful and critical way? How might we make it part of the culture of the organisation itself, embedded within every aspect of the organisation, as opposed to something that is considered an add-on or nice-to-have? How can decolonisation be the flour (vital to the recipe), and not the icing (an add-on)?

Culture is cultivated. Participants at the strategy meeting brainstormed a number of practical steps that could be taken at the organisational level to cultivate a decolonised culture. Some of the suggested organisational measures include:

  • Create a common language around decolonisation, and make publicly questioning the influence of biases and privilege part of your organisational culture.
  • Remember that this work involves more than just hiring people of colour (PoC); significant work is required of a mainly white organisation before bringing in someone of colour. Otherwise, that person could be put in the position of having to educate others and endure traumatic conversations.
  • Learn what white fragility is, and be aware and conscious of when white fragility is arising in conversations.
  • Ask yourselves “are we the right people to be doing this work?” and “are we taking resources from other people or organisations that have been doing this work?”
  • Only list the necessary qualifications in job descriptions – women and PoC are less likely to apply for jobs if they don’t meet every qualification listed. Be conscious of this.
  • If no one on your team is part of the marginalised community you are working to protect, acknowledge that your organisation is coming from a place of allyship. Do not act like stakeholders when you are not part of the community that you are trying to protect and ask yourselves (again) “are we the right people to be doing this work?” and “how can we provide resources to the community members who are doing this work?”
  • Recognise your blind spots on issues of power asymmetries both within and between private and state actors.
  • Pay a liveable salary – people are often financially responsible for others (their parents, siblings, etc.) and can’t afford to live on a low income.
  • Avoid tokenism. Does everyone truly have a seat at the table or are some people there as (or made to feel like) figureheads for “diversity” purposes?
  • Consider what problems get solved first at your organisation. Who decides what? There is space here to rethink and/or dissolve structural hierarchies.
  • Set clear standards to cultivate inclusive meetings by design. For example, rules that prohibit interrupting others and create space for pointing out problematic behaviour.
  • Restructure how you measure impact and work, and recognise “invisible work” like mentorship.

The effects of colonisation are deeply internalised in nearly every aspect of our waking lives

Colonisation is a collective history that connects us all. The effects of colonisation are deeply internalised in nearly every aspect of our waking lives. What is your personal role in this healing? What role can your organisation play in actively decolonising the digital rights space and beyond? Ultimately, all of these actions create a collective movement towards healing, justice, and dismantling systems of oppression. 

Aurum Linh is a technologist and product developer embedded as a Mozilla Fellow within the Digital Freedom Fund.

Image by Omar Flores on Unsplash

A case for knowledge-sharing between technologists and digital rights litigators

By Aurum Linh, 6th December 2019

“Almost no technology has gone so entirely unregulated, for so long, as digital technology.”

Microsoft President, Brad Smith

Big technology companies have become powers of historic proportions. They are in an unprecedented position: able to surveil, prioritize, and interfere with the transmission of information to over two billion users in multiple nations. This architecture of surveillance has no basis for comparison in human history.

Domestic regulation has struggled to keep pace with the unprecedented, rapid growth of digital platforms, which operate across borders and have established power on a global scale. Regulatory efforts by data protection, competition and tax authorities worldwide have largely failed to curb the underlying drivers of the surveillance-based business model.

It is, therefore, vital that litigators and technologists work together to strategize on how the law can be most effectively harnessed to dismantle these drivers and hold those applying harmful tech to account. As a Mozilla Fellow embedded with the Digital Freedom Fund, I am working on a project that I hope can help break down knowledge barriers between litigators and technologists. Read on to find out how you can get involved too.

Regulating Big Tech

The surveillance-based business models of digital platforms have embedded knowledge asymmetries into the structure of how their products operate. These gaps exist between technology companies and their users, as well as the governments that are supposed to be regulating them. Shoshana Zuboff illustrates this in The Age of Surveillance Capitalism, where she observes that “private surveillance capital has institutionalized asymmetries of knowledge unlike anything ever seen in human history. They know everything about us; we know almost nothing about them.”

Zuboff makes the case that the presence of state surveillance and its capitalist counterpart means that digital technology is separating the citizens in all societies into two groups: the watchers (invisible, unknown and unaccountable) and the watched. Their technologies are opaque by design and foster user ignorance. This has debilitating consequences for democracy, as asymmetries of knowledge indicate asymmetries in power. Whereas most democratic societies have at least some degree of oversight of state surveillance, we currently have almost no regulatory oversight of its privatised counterpart.

In essentially law-free territory, Google has developed its surveillance-based, rights-violating business model. They digitised and stored every book ever printed, regardless of copyright issues. They photographed every street and house on the planet, without asking anyone’s permission. Amnesty International’s recent report, Surveillance Giants, highlights Google and Facebook’s track record of misleading consumers about their privacy, data collection, and advertising targeting practices. During the development of Google Street View in 2010, for example, Google’s photography cars secretly captured private email messages and passwords from unsecured wireless networks. Facebook has acknowledged performing behavioural experiments on groups of people – lifting (or depressing) users’ moods by showing them different posts on their feed. Furthermore, Facebook has acknowledged that it knew about the data abuses of political micro-targeting firm Cambridge Analytica months before the scandal broke. More recently, in early 2019, journalists discovered that Google’s Nest ‘smart home’ devices contained a microphone that the company had failed to disclose to the public.

We can see these asymmetries mirrored in the public sector too. ProPublica’s examination of the COMPAS algorithm is a clear example of biased algorithms being used by the state to make life-changing decisions about people. The algorithm is increasingly being used across the United States in pre-trial and sentencing, the so-called “front-end” of the criminal justice system, and has been found to be significantly biased against Black people.

A high-profile case was that of Eric Loomis, who was sentenced to the maximum penalty on two counts after the court reviewed predictions derived from the COMPAS risk-assessment algorithm, despite his claim that the use of a proprietary predictive risk assessment in sentencing violated his due process rights. The Wisconsin Supreme Court dismissed the due process claims, effectively affirming the use of predictive assessments in sentencing decisions. Justice Shirley S. Abrahamson noted, “this court’s lack of understanding of COMPAS was a significant problem in the instant case. At oral argument, the court repeatedly questioned both the State’s and the defendant’s counsel about how COMPAS works. Few answers were available.”

In How to Argue with an Algorithm: Lessons from the COMPAS ProPublica Debate, Anne L. Washington notes, “[b]y ignoring the computational procedures that processed the input data, the court dismissed an essential aspect of how algorithms function and overlooked the possibility that accurate data could produce an inaccurate prediction. While concerns about data quality are necessary, they are not sufficient to challenge, defend, nor improve the results of predictive algorithms. How algorithms calculate data is equally worthy of scrutiny as the quality of the data themselves. The arguments in Loomis revealed a need for the legal scholars to be better connected to the cutting-edge reasoning used by data science practitioners.”
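Washington’s point that “accurate data could produce an inaccurate prediction” is easier to see with a concrete metric. The sketch below, written for this post with entirely hypothetical numbers (not the real COMPAS dataset), shows how to compute the measure at the heart of the ProPublica debate: the false positive rate per group, i.e. how often people who did not go on to reoffend were nevertheless labelled high risk. A tool can look reasonable in aggregate while its errors fall far more heavily on one group than another.

```python
# Illustrative sketch with made-up numbers (not the real COMPAS data):
# checking whether a risk tool's mistakes fall evenly across groups.

def false_positive_rate(records, group):
    """Share of people in `group` who did NOT reoffend but were labelled high risk."""
    did_not_reoffend = [r for r in records if r["group"] == group and not r["reoffended"]]
    wrongly_flagged = [r for r in did_not_reoffend if r["high_risk"]]
    return len(wrongly_flagged) / len(did_not_reoffend)

# Hypothetical records: each person has a group, a model label, and an outcome.
records = (
    # Group A: 10 people who did not reoffend, 2 wrongly labelled high risk
    [{"group": "A", "high_risk": i < 2, "reoffended": False} for i in range(10)]
    # Group B: 10 people who did not reoffend, 5 wrongly labelled high risk
    + [{"group": "B", "high_risk": i < 5, "reoffended": False} for i in range(10)]
)

for g in ("A", "B"):
    print(f"Group {g} false positive rate: {false_positive_rate(records, g):.0%}")
# Group A false positive rate: 20%
# Group B false positive rate: 50%
```

In these invented numbers the tool’s errors land two and a half times as often on Group B – the kind of disparity ProPublica reported between Black and white defendants, and the kind of question a court cannot probe without understanding how the algorithm calculates its scores, not just whether its input data are accurate.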

In order to meaningfully change the conditions that allow surveillance-based, human rights violating business models to thrive in the tech sector, lawmakers need to deeply understand which legal requirements will change their fundamental structure for the better. This is rendered near impossible because the tech ecosystem is designed with multiple layers of information opacity. One key asymmetry lies between lawmakers and litigators on the one hand, and the people building the technologies they are attempting to regulate on the other. The technology industry has become so specialized in its practice, yet so broad in its application, that the resulting knowledge gaps leave regulation surface-level and ineffective when measured in terms of its impact on the underlying system that allowed the ethical violations in the first place. When governments, courts, or regulators do get involved in disciplining these companies, the consequences do not actually hurt them: they do not affect the circumstances that caused the violation, nor do they fundamentally change the companies’ structure of operations or influence. An example of this can be found in Amnesty’s recent report:

“In June 2019, the US Federal Trade Commission (FTC) levied a record $5bn penalty against Facebook and imposed a range of new privacy requirements on the company, following an investigation in the wake of the Cambridge Analytica scandal. Although the fine is the largest recorded privacy enforcement action in history, it is still relatively insignificant in comparison to the company’s annual turnover and profits – illustrated by the fact that after the fine was announced, Facebook’s share price went up. More importantly, the settlement did not challenge the underlying model of ubiquitous surveillance and behavioural profiling and targeting. As FTC Commissioner Rohit Chopra stated in a dissenting opinion ‘The settlement imposes no meaningful changes to the company’s structure or financial incentives, which led to these violations. Nor does it include any restrictions on the company’s mass surveillance or advertising tactics.’”

This architecture of surveillance has no basis for comparison in human history. Lawmakers struggle to grasp how its technology works and which problems need to be addressed. Inaction does not reflect a lack of will so much as a lack of knowledge-sharing between bodies of expertise. This system spans entire continents and touches at least a third of the world’s population, yet has gone relatively unregulated for 20 years. There is an urgent need for technologists who can break down the barriers of knowledge that keep meaningful legal action from being taken.

A Project to Facilitate Knowledge-Sharing & Knowledge-Building

In partnership with Mozilla and the Digital Freedom Fund, we are building a network of technologists, data scientists, lawyers, litigators, and digital rights activists. Jonathan is a lawyer based in London who is collaborating with me on this project. You can read his perspective on this project here. With the help of this network, we would like to create two guides that can build the knowledge and expertise of litigators and technologists when it comes to each other’s disciplines, so they can collaborate and coordinate effectively on cases that seek to protect and promote our human rights while holding the “watchers” to account.

If you are a digital rights activist, technologist, or lawyer, you can contribute to this project by taking this survey and getting in touch with us. Otherwise, you can help by sharing these blog posts with your networks. We look forward to hearing from you!