The Need for Digital Rights Protections in Upcoming EU Competition Law Consultations

By Paul Keller, 25th June 2020

Simple computer chip

On 2 June 2020, the European Commission launched two public consultations to seek views on the Digital Services Act (DSA) package and on a New Competition Tool (deadline for responses is 8 September 2020).

Together with these consultations, the Commission also presented three “Inception Impact Assessments” related to these two legislative projects and gave interested parties until the end of June to provide feedback on them.

These Inception Impact Assessments are proposals for analytical frameworks that the European Commission will use to assess proposed policy instruments. The first one deals with the e-Commerce aspects of the proposed DSA, the second with the plans for an “ex ante regulatory instrument for large online platforms acting as gatekeepers,” and the third with the proposal for a “new competition tool.”

The possible changes that the DSA may bring to core principles of the EU e-commerce framework, particularly with regard to intermediary liability, have been getting a lot of attention recently and seem to dominate the discourse within the digital rights community.

So far, however, human rights organisations have paid comparatively little attention to the policy initiatives aimed at addressing the disproportionate influence that large digital platforms have on the rights of participants in the digital economy.

This is why, following this month’s competition law workshop organised by DFF, a small group of workshop participants (including representatives from Access Now, Article 19, Privacy International, COMMUNIA, Liberties and Worker Info Exchange) has teamed up to draft an initial response to the Commission’s Inception Impact Assessments on the New Competition Tool and the ex ante regulatory instrument for large online platforms.

Both of these tools/instruments are expressions of the Commission’s objective to increase its regulatory capacity in the online environment.

The new competition tool that is being considered by DG COMP would give the Commission additional powers to intervene in situations that fall outside the scope of its current regulatory powers derived from Article 101 TFEU (anti-competitive agreements and practices between companies) and Article 102 TFEU (abuse of a dominant position) or its ability to review proposed mergers. As discussed at the DFF workshop, these powers have room for improvement when it comes to holding dominant platforms to account for harmful business practices.

As outlined in the Inception Impact Assessment, the “new tool” would allow the Commission to take action in response to structural market concerns. While the Inception Impact Assessment keeps all options open, it is clear that the Commission is primarily thinking about structural problems in “digital or digitally enabled markets.”

Given this, there is significant overlap with the other policy initiative of ex ante regulation. The proposed “ex ante regulatory instrument for large online platforms with significant network effects acting as gate-keepers in the European Union’s internal market,” developed by DG CNECT, is squarely aimed at large online intermediaries. With this instrument, the Commission intends to “ensure a fair trading environment and increase the innovation potential and capacity across the online platform ecosystems in the EU’s single market.”

In our analysis, it is very welcome that the Commission is seeking to strengthen its regulatory capacity in the online environment. We are, however, concerned that both initiatives largely discuss the application of these instruments in terms of market-focused objectives alone. This is why we urge the Commission to take a broader perspective when analysing the impact of the proposed tools. In addition to the objective of improving the economic efficiency of markets, the Commission should also measure the impact of these tools on the protection of fundamental rights, such as the right to self-determination in the digital environment. In our response, we state that:

We believe that this small number of large online platforms not only act as economic gatekeepers, but also as ‘fundamental rights’ gatekeepers. Through their business models, their terms of services and community guidelines, these platforms set standards in the market with regards to, among others, people’s rights to privacy, data protection and freedom of expression. These large platforms are able to do so because, on the one hand, barriers to entry are so high that it is extremely difficult, if not impossible, for new players to enter the market and put competitive pressure on gatekeepers. And, on the other hand, because consumers do not have viable alternatives to switch to. […]

The impact of these platforms’ behaviours and business models on the guarantee of fundamental rights in the digital single market is a major challenge for the EU, and the European Commission should include it in its understanding of the problem it aims to fix with these welcome initiatives.

You can find the full draft response here. We are currently looking for additional signatories for our response. If you represent an organisation interested in supporting our response, please get in touch with us by 28 June 2020.

We believe that it is essential that the digital rights community sends a strong signal to the European Commission that its responsibility in the digital space goes further than ensuring the functioning of markets. As we state in our response:

We are convinced that the initiatives of the European Commission constitute a once in a generation opportunity and the planned reform could become a blueprint for the regulation of digital markets and services worldwide. Therefore, we hope the Commission will give due attention to our calls and will not miss the opportunity to set the rules for a democratic, fair, innovative and fundamental rights’-oriented EU digital society.

Paul Keller is the President of COMMUNIA association for the public domain and senior research fellow at the Institute for Information Law, University of Amsterdam.

Image by xresch from Pixabay

The Grave and Growing Dangers of Border Surveillance

By Dragana Kaurin, 20th June 2020

CCTV camera perched ominously along barbed wire fence

This year, World Refugee Day comes during a devastating global pandemic that has closed borders around the world.

The current reality makes it even more difficult for refugees to flee to safety and seek asylum. We are also witnessing protests and community action against structural racial violence worldwide, which have brought to the surface a number of disturbing trends in border surveillance technology and policy.

One of the successes in this regard that has been widely reported in the media lately, due in no small part to years of civil society campaigns, is a moratorium on the sale of facial recognition technology to law enforcement in the United States. Research has shown a widespread racial and gender bias in these intrusive technologies, and the moratorium is a welcome move towards stopping discriminatory practices and abuse of data by law enforcement.

However, these technologies are still being used to monitor the movements of refugees and migrants, who do not have the same protections and access to legal mechanisms as citizens.

Almost immediately, the COVID-19 pandemic became a justification for governments to use more intrusive surveillance technologies on refugees and migrants, in some cases forcing them to wear wristbands that track their location and movement. Biometrics are now commonly used in the aid sector, by UN agencies, NGOs, border control agencies, and private sector contractors, both for registering refugees and for controlling access to aid and services.

The widespread application of biometrics is justified by UN agencies like the World Food Programme (WFP) as a solution to the largely non-existent problem of identity theft among aid recipients and to the embezzlement of aid, which is mostly committed by aid providers, not recipients.

Such large-scale reliance on these technologies has reduced human bodies to evidence. As a result, and in an effort to regain some control over their bodies and their data, there have been a number of disturbing reports of refugees burning off their own fingerprints out of fear of being tracked and returned to their countries of origin, or to entry-point countries in the EU under the Dublin Regulation.

Like facial recognition technology, fingerprint readers were built on racialised assumptions about their users, and are only highly accurate for middle-aged men with lighter skin tones. There has been a pattern of centralised collection and storage of this biometric and personal data across sectors and agencies, seen in public-private partnerships like Palantir-WFP, Microsoft-International Committee of the Red Cross, and Facebook Libra-Mercy Corps.

Even though these organisations claim that these data-sharing partnerships make their operations easier, this comes at the cost of losing the trust and confidence of refugee communities.

Lastly, the targets of migration surveillance have expanded to people crossing into bordering countries and to the human rights organisations that work with refugees. As facial recognition technology and other biometric tracking have become normalised through public-private partnerships, we are seeing a growing trend of cross-border surveillance of migrants and refugees.

Outsourcing surveillance of migrants has become the norm in Australia, the United States, and the EU, carried out through data extraction companies like Cellebrite, whose tools bypass device passwords to track location and movement. Furthermore, agencies operating border monitoring systems like EUROSUR have shifted their focus to the surveillance of civil society organisations that conduct search and rescue at sea, as well as journalists and human rights researchers who document abuse and harassment of asylum seekers by border control agents in the US.

These trends in humanitarian “technosolutionism”, cross-border surveillance of migrants and refugees, and the use of their data without consent are deeply troubling, and they are evident across sectors: at UN agencies, border control agencies, and non-governmental organisations.

Refugees have no one to turn to for protection and no way to access their own data; these stakeholders seem to be sending them the message that they are too untrustworthy to be asked for information directly and with informed consent.

It’s therefore critical that in this political moment of reimagining and restructuring our systems, we include the needs and the rights of refugees, because the overreach of intrusive technologies always starts with the most marginalised, invisible communities before it is normalised and eventually used widely on others.

Dragana Kaurin is a human rights researcher and Executive Director of Localization Lab.

Fighting COVID-19 Digital Rights Violations: Our New Litigation Fund

By Thomas Vink, 8th June 2020

A graphic showing a smartphone location and a virus.

On 8 June 2020, DFF launched the COVID-19 Litigation Fund. The fund supports rapid response strategic litigation that challenges digital rights violations committed in the context of the COVID-19 pandemic.

We started this fund because measures brought in by politicians, authorities and businesses in response to the pandemic will have ramifications for digital rights for years to come.

Some of these measures have a detrimental impact on our human rights in the digital sphere: our right to privacy can be violated by invasive “Corona apps”, our right to access information is hampered when the free press is limited or even silenced, and the use of AI to help allocate health resources could lead to discrimination and unequal access to essential services.

Strategic litigation, alongside advocacy and other efforts, has a major role to play in challenging the most egregious measures. The fund enables activists and litigators to start bringing legal challenges now to halt or limit the impact of digital rights-infringing measures.

COVID-19 – A Crisis for Digital Rights

Back in April, we called the COVID-19 pandemic a crisis for digital rights. This crisis shows no signs of abating.

The rapid nature of the pandemic response has empowered governments to rush through policies and emergency measures with little to no legal oversight.

Many states have used the pandemic as an excuse to censor critics and filter information online in their favour. Countries are rolling out contact-tracing apps and biometric technology to track citizens’ movements, communications and health data in relation to the pandemic. And as governments reduce lockdown restrictions, new risks are emerging with the introduction of measures that increase inequalities related to freedom of movement, access to public spaces, and the ability to work and access essential services.

There are already examples of strategic litigation being used to halt digital rights violations taking place during the pandemic. In Israel, human rights organisation Adalah successfully challenged the tracking of cell phones by the Israel secret service. In France, judges banned the use of surveillance drones to monitor public compliance with coronavirus-related restrictions after La Quadrature Du Net and La Ligue des Droits de l’Homme filed a lawsuit against the Parisian police. In the UK, Open Rights Group are preparing a legal challenge to the National Health Service’s coronavirus test-and-trace programme.

But these cases are just the tip of the iceberg. Activists and litigators need more resources to bring litigation that mitigates the negative consequences of the human rights violations occurring during the COVID-19 pandemic. We set up the COVID-19 Litigation Fund for this purpose.

What is the COVID-19 Litigation Fund?

DFF has been working with digital rights litigators and activists across Europe since 2017, and providing grants to support strategic litigation to advance digital rights since 2018. We fund cases that demonstrate potential to bring about legislative, policy or social change, and to have an impact extending beyond the parties directly involved in the case.

The digital rights violations occurring under the COVID-19 pandemic are unprecedented and need to be challenged as a matter of urgency. Thanks to support from the Open Society Foundations, we were able to create a dedicated fund to support activists and litigators to take rapid response strategic cases related to the COVID-19 pandemic.

The first call for applications under the COVID-19 Litigation Fund will close on 28 June, and grants will be contracted with successful applicants in late August 2020. Grants will support litigation through the whole process, from first instance to the final appeal stage.

Recognising that there is a “digital divide” that limits access to justice and increases hardship for some people, DFF will prioritise applications that focus on addressing the negative impact felt by the most vulnerable groups in society. Funding is not limited to digital rights organisations. We also encourage applications from other organisations, where digital rights violations have occurred in the context of other work, such as health, social justice or poverty.

Applying for a Grant

The call for applications under the COVID-19 Litigation Fund is open from 8-28 June 2020. If you have a case challenging digital rights violations related to the COVID-19 pandemic, we encourage you to apply.

Of course, you may have identified an issue but are not yet ready to litigate. We are working on mobilising resources to allow for a second call for applications later in 2020, but we are unable to confirm that at this time, so continue to watch this space. And if you have an idea for a case or for litigation not related to COVID-19, please head over to our main grants page and apply for a grant now!