Against Mass Surveillance of Air Passengers

By Bijan Moini, 26th March 2020

Passengers walking through an airport

Since 2018, air passengers in the European Union have been subject to mass surveillance and algorithmic profiling.

 

In violation of the Charter of Fundamental Rights of the European Union (CFR), EU Member States collect, store and analyse comprehensive data on air travel.

 

The German Society for Civil Rights (Gesellschaft für Freiheitsrechte, GFF) and the Austrian NGO epicenter.works, with the support of DFF, filed a series of strategic lawsuits aimed at reaching the Court of Justice of the European Union (CJEU). A German court has now referred several cases to the CJEU, bringing us closer to toppling the directive that provides the legal basis for mass surveillance of flight passengers.

 

We are used to being treated differently when traveling by air compared to journeys by, say, train. Flying involves identity checks, manual body searches or even full body scans, and sometimes interviews. Many of the precautionary measures taken at airports today were first introduced by the US in the aftermath of the 9/11 terrorist attacks. However, the measures visible to us as air passengers are only the tip of the iceberg.

 

…the measures visible to us as air passengers are only the tip of the iceberg

The collection of data on European air travel presents a similarly severe interference with our fundamental rights.

 

For all passengers of flights to or from the EU, a Passenger Name Record (PNR) is transferred from air carriers to the national police. The basis for all of this is the European PNR Directive (Directive 2016/681). A PNR contains up to 20 data items, including the date of birth, details of accompanying persons as well as payment information or the IP address used for online check-in.

 

Together with the information on the flight itself, the PNR offers a detailed picture of the passenger. Sound like mass surveillance to you? It sure is.

How Did It Come to This?
 
The US first introduced mass retention of passenger data in November 2001. Other countries, including Canada and Australia, planned to introduce similar measures. Despite some fierce opposition in the EU Parliament based on data protection concerns, the EU concluded PNR agreements with these countries between 2011 and 2014.
 

This put mass surveillance of flight passengers on a legal footing. In the case of the PNR agreement between the EU and Canada, however, the CJEU confirmed in July 2017 that its provisions violated European fundamental rights.

In the course of negotiating these agreements, the EU repeatedly discussed introducing its own retention system for air travel data. The PNR Directive was adopted in 2016 and has since been gradually implemented in EU Member States.

 

Even though the Directive envisages data retention only for flights entering and exiting the EU, all Member States have made use of an extension clause, according to which data for intra-European flights is stored as well.

 

Dangerous Data Collection

 

Under the PNR Directive, European government agencies can store PNR data for six months in an identifiable manner (i.e. linked to the real name of the passenger) and then another four and a half years in pseudonymised form.

 

They automatically check the passenger data against databases of, for example, wanted persons and stolen passports. More importantly, however, they can apply algorithms – so-called "pre-determined criteria" – to the data sets.

 

These algorithms are supposed to identify "unknown" suspects by, for instance, extrapolating patterns from an individual's flight behaviour and other data to assess whether they are a suspected or potential offender.

 

A person for whom such a "hit" occurs can become the target of further clandestine police measures, be interrogated at the airport or even, depending on the national law, be detained.

 

It is widely disputed whether the analysis of such data is an effective means to investigate or prevent terrorism and other serious crime

It is widely disputed whether the analysis of such data is an effective means to investigate or prevent terrorism and other serious crime. Rather, the sheer volume of data can make analysis more difficult.

 

In Germany, the Federal Government itself assumes a success rate of only 0.1%, meaning that 99.9% of all air passengers – about 169,830,000 people, according to the Government's forecasts – will be unnecessarily subjected to the processing of sensitive data. And this does not even account for the "false positives" within the 0.1% (i.e. those whom the system has flagged in error as persons of interest).
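The scale of this base-rate problem can be sanity-checked with a quick back-of-the-envelope calculation using only the figures above. Note that the 20% false-positive share at the end is a purely hypothetical value for illustration, not a figure from the article:

```python
# Back-of-the-envelope check of the base-rate figures quoted above:
# 99.9% of forecast passengers corresponds to about 169,830,000 people,
# so the total forecast is roughly 170 million passengers.

unflagged = 169_830_000            # the 99.9% quoted above
passengers = unflagged / 0.999     # total forecast: ~170 million
hit_rate = 0.001                   # the Government's assumed 0.1% "success rate"

hits = passengers * hit_rate       # ~170,000 people flagged as "hits"

print(f"Total forecast passengers: {passengers:,.0f}")
print(f"Flagged as hits (0.1%):    {hits:,.0f}")
print(f"Processed unnecessarily:   {passengers - hits:,.0f}")

# Even among the "hits", some share will be false positives. A hypothetical
# 20% error rate among hits (illustrative assumption only) would mean:
false_positive_share = 0.20
print(f"False positives at 20% of hits: {hits * false_positive_share:,.0f}")
```

Even under these conservative numbers, tens of thousands of innocent travellers could be wrongly flagged, which is the core of the proportionality argument.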

 

Violation of the Charter of Fundamental Rights

 

In view of the large volume of air traffic today, the processing of passenger data is tantamount to mass surveillance.

 

It violates the Charter of Fundamental Rights as it disregards both the right to respect for private and family life (Article 7) and the right to the protection of personal data (Article 8). The CJEU's opinion on the EU-Canada PNR agreement clearly confirms this assessment, as does the opinion of the European Data Protection Supervisor.

 

In view of the large volume of air traffic today, the processing of passenger data is tantamount to mass surveillance

While we are confident that we have the stronger legal arguments on our side, we depended on a national court to refer a case to the CJEU and ask it for a preliminary ruling on the validity of the PNR Directive under EU law. We therefore brought lawsuits in both Germany and Austria challenging the respective national laws implementing the Directive.

 

In Germany, GFF took a two-track legal approach. On the one hand, we took administrative action against the Federal Criminal Police Office, which processes passenger data in Germany, and demanded that they delete our plaintiffs’ data. On the other hand, we filed civil lawsuits against the airlines transmitting the data records.

 

Towards the EU’s Highest Court

 

In January 2020, the Local Court of Cologne, Germany, submitted to the CJEU the question of whether the PNR Directive violates fundamental rights. With this, we have reached a major milestone in our strategic litigation.

 

We now hope that the CJEU will, consistent with its own jurisprudence, declare the Directive void and thereby invalidate the legal basis of all national PNR laws.

 

A judgment is expected no sooner than the end of 2021. It would be a huge success for fundamental human rights, as it would set boundaries for the mass surveillance of all our movements (extending the PNR Directive to trains, buses and ferries is already being discussed) and, perhaps even more importantly, for the use of algorithms to assess whether an otherwise unsuspicious person is to be considered dangerous and treated as such (more often than not, wrongfully so).

 

It would also strengthen European civil society by showing that we can use cross-border strategic litigation as a means of improving the legal protections of hundreds of millions of people at the same time.

Bijan Moini is a litigator with Gesellschaft für Freiheitsrechte and in charge of its digital rights cases.
Image by Toby Yang on Unsplash

Protecting Freedom of Thought in the Digital Age

By Susie Alegre, 23rd March 2020

Emoticons and "like" buttons

My work as an independent lawyer focuses on the potential for technology to interfere with our right to freedom of thought, as protected in international law instruments including Article 9 of the ECHR and Article 18 of the ICCPR.

According to these instruments, many rights, like the right to private life and the right to freedom of expression, can be limited in certain circumstances to protect the rights of others or in the interests of public order.

The right to freedom of thought, on the other hand, is absolute under international human rights law. This means that where there has been an interference with our right to freedom of thought, as it relates to the thoughts and feelings we experience in our inner world, such an interference can never be justified. 

Despite this strong level of treaty-based protection, however, the right to freedom of thought has rarely been invoked in the courts. Many legal scholars and commentators have assumed that this is because, in fact, no person or government could ever get inside our minds.

Many legal scholars and commentators have assumed that this is because, in fact, no person or government could ever get inside our minds

But when I first read about the Cambridge Analytica scandal – and the use of behavioural micro-targeting to build psychological profiles of individual voters so that they could be targeted with tailored adverts pressing their unique psychological buttons – it seemed clear to me that the idea that no one could ever interfere with our minds was outdated.

There are three main planks to the right to freedom of thought:

  • The right not to reveal our thoughts.
  • The right not to have our thoughts manipulated.
  • The right not to be punished for our thoughts.

All three are potentially relevant in the digital age, where algorithmic processing of Big Data is relied on to profile and understand individuals' thought processes in real time for the purpose of targeting them with tailored advertising and other content.

Profiling seeks to infer how we think and feel in real time based on large swathes of data, including our online activity. Research on Facebook claimed that the social media platform could know you better than your family does simply by interpreting your "likes". In this way, interpretation of your social media activity gives a unique insight into your mind.

In another experiment, researchers showed that altering the order of Facebook feeds could manipulate users’ mood. The tailored way that information is delivered to us could change the way we think and feel in a very real way.

In another experiment, researchers showed that altering the order of Facebook feeds could manipulate users’ mood

But it is not only the tracking and manipulation of our thoughts and feelings that is of concern. The way that information could be used against us is equally worrying. Inferences about our personalities and moods drawn from big data can form the basis for decisions that fundamentally change our life chances – whether by limiting access to financial services, in automated hiring processes, or in risk assessments in the criminal justice system.

Thanks to a DFF pre-litigation research grant, I am currently exploring the ways in which technology and artificial intelligence can be used to try to cross the boundaries into our inner world and to identify where arguments based on the right to freedom of thought could help to challenge these practices before the courts in the UK, Ireland and Spain.

The research is currently in the first phase, gathering relevant reports and examples of practices that could be considered to have implications for freedom of thought. I would very much welcome suggestions for reading and contacts from colleagues working in digital rights in any jurisdiction who would be interested in discussing ways that freedom of thought could be relevant to the work they are doing.

Inferences about our personalities and moods drawn from big data can form the basis for decisions that will fundamentally change our life chances

DFF’s Strategy Meeting 2020 offered a unique opportunity to share my initial research on the right to freedom of thought and to get feedback and ideas from an incomparable range of digital rights activists and experts from across Europe and beyond. The chance to test and discuss ideas, and to gain insights into the work of others, opened new and exciting avenues of inquiry that will feed into my research. It also helped me to reach out to new partners. I left Berlin energised and with a long list of people, organisations and topics to follow up with – DFF’s Strategy Meeting really does make things happen.

But the Strategy Meeting was just the beginning – I would love to hear from people and organisations who are already working on these issues or would like to cooperate on future work to keep the momentum from Berlin going.

Susie Alegre is an international human rights lawyer, author and barrister at Doughty Street Chambers specialising in the right to freedom of thought and technology.

Image by George Pagan III on Unsplash

How NGOs are Joining Forces Against Adtech

By Gro Mette Moen, 19th March 2020

Pole with sticker saying 'Big Data is Watching You'

As the General Data Protection Regulation approaches its second anniversary, it has been disappointing to see a lack of strong enforcement against adtech privacy violations.

While it seems that data protection authorities have found many violations too big to handle, NGOs have joined forces – among other places, at DFF's strategy meeting in February.

In January 2020, the Norwegian Consumer Council published a report called "Out of Control – How consumers are exploited by the online advertising industry", covering how apps on our phones share personal data with potentially hundreds of obscure companies. These companies harvest data to create profiles of individuals, which are then used to serve targeted advertising. The profiles are also used for purposes such as discrimination, manipulation, and exploitation.

These companies harvest data to create profiles on individuals, which are then used for serving targeted advertising

Alongside the report, we filed complaints against six companies for breaching the GDPR, including five large adtech companies.

However, we were far from alone. We published "Out of Control" all across the world, together with a large number of other consumer, digital rights, and civil rights groups in Europe and the United States. BEUC, the European consumer rights umbrella organisation, coordinated the European action, and many of the participating organisations are members of the Transatlantic Consumer Dialogue (TACD), Consumers International, and the human rights umbrella Liberties. In total, 43 organisations in 20 countries took part, sending letters to several data protection authorities. Together we achieved media coverage in more than 70 countries.

Our research was inspired by a vast digital rights network, including previous work done by Privacy International, the Panoptykon Foundation and Open Rights Group. Many organisations also provided important input to the research, and we worked particularly closely with the cybersecurity company Mnemonic, the digital rights organisation None Of Your Business (NOYB) and the researcher Wolfie Christl of Cracked Labs.

Based on discussions at events such as the DFF strategy meeting in February, it is clear that many of the issues reach beyond privacy violations. In the future, we need to push for enforcement, looking across silos to address the combination of competition, privacy and consumer rights violations.

…it is clear that many of the issues reach beyond privacy violations. In the future, we need to push for enforcement

Another important topic is how we can address the responsibilities of publishers and advertisers. Publishers, advertisers and consumers/citizens are all cheated by the adtech industry in different ways. This means that we might have common interests.

Karolina Iwańska from the Panoptykon Foundation describes the dilemmas facing legitimate media in the article "10 reasons why advertising is broken". Since a large portion of advertising funds goes to adtech intermediaries, and because the adtech industry is dominated by only a few monopolistic actors, legitimate media are locked into a less-than-fruitful business relationship with companies such as Google.

There are also widespread issues with ad fraud, where advertisers pay large sums of money to adtech companies but end up having their ads shown to bots instead of human beings.

We have seen some advertisers address the problematic aspects of adtech, both in dialogue with NGOs and in industry forums. For example, the head of the Norwegian branch of the World Federation of Advertisers recently published an opinion piece in the industry magazine Kampanje, predicting that advertisers will demand higher privacy standards from the adtech industry in the future. He calls for advertisers to become “the change you wish to see” – using their purchasing power to create room for alternative business models to grow. In other words – these issues can be addressed both through enforcement and from industry pressure.

The adtech industry should look out, because when the NGO sector stands together, we are a force to be reckoned with

In the meantime, we hope that further collaborative work will help push relevant authorities to protect consumers from commercial surveillance. By extension, this may help bring about a better and safer digital world, a world that is no longer driven by all-consuming tracking and profiling.

The adtech industry should look out, because when the NGO sector stands together, we are a force to be reckoned with.

Gro Mette Moen is the acting director of digital policy in the Norwegian Consumer Council.

Photo by ev on Unsplash