The Ongoing Digitisation of Europe’s Borders
Europe’s militarised borders and the increasingly restrictive migration policies adopted in the wake of the COVID-19 pandemic are forcing migrants seeking asylum in the EU to take treacherous routes to reach the continent, only to be pushed back at key crossing points when they arrive.
Technology plays a central role in creating this lethal border, with digital tools of surveillance, information extraction, and registration being used to keep migrants out, and control those who do enter the EU.
This use of technology to police and control migrants not only fortifies Europe’s territorial borders, but also produces digital borders across and beyond the EU bloc, hindering refugees’ access to resources, services, and asylum.
Europe’s digital borders
As the UN special rapporteur on contemporary forms of racism explained in her 2020 report, emerging digital technologies used by governments at their borders have a uniquely discriminatory impact on migrants, stateless persons, refugees, and other non-citizens.
With the rise of ethnopopulist nationalism in many European countries, and the widespread perception of migrants and refugees as threats to national security, the use of digital border technology continues to reinforce dominant social and political trends under the veil of neutrality.
The discriminatory impact of digital technology on migrants and refugees begins before they reach the EU’s territorial borders. Indeed, at EU borders, where migrants are policed rather than supported, technological approaches are often preferred over those involving human interaction.
For example, the increased EU funding received by the EU’s border and coast guard agency, Frontex, is being channelled into surveillance drones that watch migrants in distress at sea, rather than patrol ships capable of rescue. There are strong suspicions that Frontex uses this surveillance technology to oversee illegal pushbacks of migrants before they can apply for asylum, and to support its existing air operations, which have been linked to thousands of migrants in the Mediterranean being taken into the custody of the Libyan coast guard and returned to detention camps in Libya.
Beyond its surveillance mechanisms, the EU is increasingly seeking to rely on artificial intelligence programs to screen and monitor people before they arrive at its territorial borders.
In 2019, it trialled iBorderCtrl, an artificial-intelligence-driven lie detector test, in Greece, Latvia, and Hungary as part of a EUR 4.5 million EU-funded research project. The program confronts travelers with an animated border guard who asks them questions via webcam, while AI software analyses their “micro-gestures” for “biomarkers of deceit.”
The lie detector test is designed to help border guards in “spotting illegal immigrants” and preventing crime and terrorism by pre-screening travelers for high- or low-risk categorisation before they reach the border. But the technology has been found to produce false positives, and experts have described lie detection based on non-verbal signals and facial micro-gestures made during questioning as pseudoscience.
In this particular case, the technology appears to read signs of stress, such as fidgeting or subtle facial movements, as a proxy for dishonesty. This risks a disproportionate impact on marginalised communities like migrants and refugees, whose previous, potentially traumatic, experiences likely predispose them to display exactly those signals.
Prevailing attitudes often also mean that migrants are already coded as criminal in the eyes of immigration officials. The technology will likely serve to reinforce that impression.
Importantly, the lie detector’s accuracy was tested using predominantly white European men, despite evidence that facial recognition software is embedded with racist tropes and stereotypes, misidentifying Black women twenty times more often than white men. This raises further questions about its efficacy, including whether it is actually capable of accounting for ethnic and cultural differences in body language. Taken together, this means that, if fully rolled out, iBorderCtrl would constitute a significant barrier to refugees seeking asylum in the EU, marking them as suspect in the eyes of the state long before they arrive at any physical checkpoint.
It would also render their emotional world something to be controlled by technology: an intrusion into their right to privacy that goes beyond anything previously seen.
A transparency lawsuit seeking the release of documents on the project’s ethical evaluation, legal admissibility, marketing, and results is currently underway. Additionally, Homo Digitalis, a civil society organisation working to protect digital rights in Greece, has sought to challenge the use of the iBorderCtrl system at Greek borders.
While concerns over data privacy have grown amongst EU citizens in recent years, governmental and humanitarian biometric data collection from refugees and migrants bears the additional risk of putting already-marginalised groups in greater danger of having their personal data used against them.
While surveillance programs like the European Border Surveillance System acquire refugees’ data without their knowledge, even personal information they knowingly provide when seeking asylum is often taken without obtaining informed consent. Without being told how their personal data will be stored and shared, many of Europe’s refugees and asylum-seekers face the prospect of their personal data subsequently being used by law enforcement to police, detain, and deport them.
Similarly, expansive laws in many parts of Europe allow immigration officials to extract data from migrants’ phones, enabling them to use this information for deportations.
These data privacy concerns are not limited to the EU. Just this month, Human Rights Watch revealed that the UN High Commissioner for Refugees (UNHCR) had shared personal information collected from ethnic Rohingya refugees in Bangladesh with the Bangladesh government without obtaining refugees’ informed consent. Bangladesh, in turn, shared the information with Myanmar, from which over 800,000 Rohingya have been expelled or have fled persecution since 2016, to verify which refugees should be selected for possible repatriation.
This case highlights how, as a result of invasive data collection practices, refugees who enter the territory of a host country still face the precarity of its digital borders.
Dismantling digital borders
Numerous human rights groups, NGOs and refugee rights communities are organising to resist the violations of human rights at the EU’s borders, and fight back against the use of invasive technologies on migrants and refugees within the bloc.
The Border Violence Monitoring Network tracks illegal pushbacks, police violence and other human rights abuses by EU member states at the EU’s external borders, including the use of digital technology in border violence. Their database and monthly reports act as a critical advocacy tool for holding EU states and the European Parliament accountable for the violence at their borders.
Migreurop, a network of international NGOs, activists and researchers, reports on the conditions and experience in migrant camps throughout Europe, as well as the treatment of people along migratory routes.
Groups like Abolish Frontex, along with Equinox EU, are calling for the dismantling of the border industrial complex, the abolition of Frontex, the reallocation of public resources toward public services, and a border policy that centres migrant safety and free movement.
On a domestic level, civil liberties organisations like Liberty in the UK are campaigning to stop the UK’s sharing of personal data with immigration officials and the use of mobile fingerprint scanners by law enforcement. While the UK is no longer an EU member state, it has adopted similar practices to those of the EU as part of the UK’s hostile environment policy that seeks to drive migrants and refugees out of the UK or facilitate their deportation.
In Berlin, the feminist, anti-racist political group International Women* Space—a coalition of migrant, refugee and non-migrant women*—documents and fights everyday violence, racism and sexism in Germany. Their Lager Reports, initiated after the onset of the COVID-19 pandemic, provide on-the-ground documentation of the experiences and conditions of women living in refugee accommodation centres in Berlin. Their most recent report documents the lack of internet access in certain German Lager (“camps”). All these organisations are working to dismantle the digital borders that encroach on the everyday lives of migrants and refugees within European states.
Challenging digital borders
In addition to the work being carried out by activist and advocacy groups, there are promising legal challenges being made and won in European courts, vindicating the digital rights of refugees.
The German NGO Society for Civil Rights (GFF) recently won a lawsuit on behalf of an asylum-seeker against the German government. The case challenged the practice of the Federal Office for Migration and Refugees (BAMF) of searching asylum-seekers’ personal phones as part of the asylum-application process. Under a law passed in 2017, German officials have the authority to examine the mobile phones of asylum-seekers who do not have valid documents to prove their identity. The administrative court’s ruling earlier this month found BAMF’s phone searches, and its storage of information obtained during such searches, to be unlawful. While this marks only the first decision in three separate lawsuits brought by asylum-seekers in Germany last year, the ruling could have far-reaching implications for protecting the privacy and safety of migrants and refugees in Germany.
In May, Open Rights Group and campaign organisation the3million won a lawsuit challenging the “immigration exemption” to the UK’s 2018 Data Protection Act. Under the exemption, the UK’s Home Office and other organisations involved in immigration control are allowed to refuse data access requests for personal data if a data controller believes the disclosure of that data would undermine or prejudice “the maintenance of effective immigration control.”
As a result, asylum applicants, along with those seeking immigration visas, whose applications were denied have overwhelmingly been refused access to the records used in their cases, creating another barrier to securing asylum. The appeal court ruling—that the immigration exemption is incompatible with the GDPR and the EU Charter of Fundamental Rights—means applicants will now have access to the records used to decide their cases. This will force more transparency and accountability from the Home Office in its decision-making process.
A recent legal victory in France is also chipping away at the digital borders affecting migrants. Under a French administrative process, foreigners seeking to obtain or renew residence permits can only do so by making an appointment through an online booking system. This puts applicants without internet access at risk of becoming undocumented and losing access to public services and benefits. In practice, the service was so saturated that getting an appointment was almost impossible. An administrative court in Rouen ruled in February that the prefecture could not impose an online-only booking system without a non-digital alternative for applicants. The fight to have other French prefectures follow suit continues.
The fight continues
While new and potentially harmful technologies create digital barriers to migrants’ rights within the EU, the victories described above illustrate how the digitisation of Europe’s borders can be checked through strategic litigation and advocacy.
With the increasing use of digital technology to reinforce and support Europe’s border policy, it is becoming clear that borders are not only territorial boundaries that circumscribe a state. The digital barriers to benefit-access and secure immigration status, the invasive use of data extraction and processing, and increasingly sophisticated technologies of surveillance and control constitute digital borders that go beyond the physical territory of a border checkpoint.
However, challenging these invasive digital tools is only half the battle. The hostile migration policies that lie at the heart of these serious human rights violations, and that are responsible for the loss of thousands of lives, must be dismantled as well.
Adil Habib is a 2L at Harvard Law School and a 2021 summer intern at the Digital Freedom Fund.