Connecting digital rights litigators: A transatlantic call series

By Jason Williams-Quarry, 18th December 2019

How can we minimise the negative impact of the use of algorithms and machine learning on our human rights? Where do we see goals align for tackling the threat of facial recognition technology? How can I get in touch with lawyers in my jurisdiction working on freedom of expression online? How can we better understand when and how states create laws to bypass traditional encryption measures, and how they are enforcing them?

These are some of the questions that were explored during our 2019 transatlantic call series. This series of five calls between Europe and the US brought together lawyers, academics, researchers, and digital rights activists to share information, best practices, and lessons learned, and to explore potential avenues for transatlantic collaboration.

The idea for the call series arose from conversations earlier this year between DFF and US lawyers from the Electronic Frontier Foundation, the Knight First Amendment Institute at Columbia University, and the American Civil Liberties Union. In the course of these conversations, five thematic issues were identified that helped shape the focus of each meeting in the series.

In September, we kicked off our first call in the series with a round of introductions, to give all participants an opportunity to get to know each other’s work better and discuss the respective digital rights priorities on either side of the Atlantic. The second call focused on heat mapping thematic areas for transatlantic collaboration, and participants explored joint work on, amongst other issues, the Privacy Shield, facial recognition and the datafication of migrants’ and refugees’ data.

In break-out room discussions during the third call, participants looked at challenging anti-encryption measures in different jurisdictions, using First and Fifth Amendment arguments in the US and fundamental rights to privacy in Europe. We also learnt how government bodies have tried to gain access to individuals’ phone data in Europe, and how organisations on either side of the Atlantic collaborated in bringing some of these cases to court.

During the fourth call, we learnt about the use of black-box algorithms in the criminal justice and social security systems, and the negative impacts these algorithms can have on individual human rights, especially those of vulnerable groups. The final call in the series focused on challenges to content moderation and social media blocking, including how public officials attempt to block dissenting voices on their social media accounts. Participants shared knowledge on the strategies used to bring cases to trial and to garner public attention to infringements on human rights. We also heard about success stories in transatlantic collaboration on the right to be forgotten.

What we have appreciated most from these calls is how the participant-led discussions during each meeting helped build the foundation for the next; at the end of each session, participants shared lessons learned and made plans for follow-up and collaboration with one another.

We look forward to seeing a follow-up to this call series and are exploring ways in which we can organise an in-person meeting to do more in-depth work on issues of mutual interest. If you have ideas for the thematic focus of potential future activities, please let us know!

Transatlantic call series: uniting digital rights litigators to challenge anti-encryption measures

By Jason Williams-Quarry, 11th October 2019

This week, we held our third transatlantic call, hosting a conversation between digital rights litigators in Europe and the US. Following a conversation about potential areas for transatlantic collaboration last week, this week’s call focused on encryption.

The American Civil Liberties Union (ACLU) and the Electronic Frontier Foundation (EFF) told participants about their work challenging anti-encryption measures. These organisations frequently collaborate to challenge anti-encryption measures, seeking protection of the rights of individuals (and their devices) under the Fifth Amendment. This protects people under investigation from being forced to open their devices to police or courts, as doing so may violate their right against self-incrimination.

EFF also shared experiences of their work in the Apple v. FBI case, which concerned a court order requiring Apple to engineer a ‘backdoor’ into its iPhone operating system. This would have created a security flaw leaving iPhone users vulnerable to hacking. In this case, EFF filed an amicus brief arguing that the government’s demands violated Apple’s First Amendment rights. An interesting discussion point came from reflections that, when it comes to challenging anti-encryption measures, sometimes maintaining the status quo can be considered a win. Under this theory, the longer anti-encryption measures can be held back, the more encryption will become the ‘norm’.

We also heard how US and European organisations have collaborated in the past, commonly through interventions by US organisations in European cases. EFF and the ACLU called for more European organisations to reach out about anti-encryption cases, so that cross-Atlantic collaboration can continue to grow in the future. Other topics touched upon during the conversation included attempts by the police in the UK to gain access to the phones of sexual abuse victims, and ongoing work in Russia, where the government is trying to force Telegram to provide access to users’ keys.

At the end of the session, participants shared lessons learnt and made plans for follow-up and collaboration. This further built on some of the work done during the preceding transatlantic call, where participants focused on ‘heat mapping thematic areas for transatlantic collaboration’. That exercise demonstrated an interest in exploring joint work on, amongst other issues, the Privacy Shield, facial recognition and the datafication of migrants’ and refugees’ data.

What we continue to appreciate from this transatlantic call series is how the participant-led discussions during each call build the foundation for the next. We look forward to seeing this process continue and are exploring ways in which we can organise an in-person meeting to do more in-depth work on issues of mutual interest.

Our next and penultimate call in the series will take place on Tuesday 15 October, focusing on ‘machine learning and human rights’. If you have not yet registered to join this call and you would like to, please let us know, so we can add you to the list!

Facial Recognition Virtual Design Jam: brainstorming litigation to challenge the use of facial recognition software

By Jason Williams-Quarry, 17th September 2019

What are the research or evidence gaps when it comes to the legal implications of facial recognition technology? What research or support is needed in the short term to litigate against the use of such technology by law enforcement? Which jurisdictions are most cost-effective for facial recognition litigation? What can law clinics do over the coming year to assist in these kinds of cases?

These questions were considered during DFF’s virtual design jam on facial recognition this week. The design jam continued from and built on group discussions that arose during DFF’s workshop on ‘connecting the field with academia’ earlier this year. During that workshop, participants were eager to explore connections between the field and academia on a number of subjects, and some of these conversations gravitated towards facial recognition. The virtual design jam was an opportunity to take a further look at the questions posed above and to identify potential lines of collaboration and research.

Recent events in the news have drawn attention to the use of facial recognition technology by both public and private actors. A recent case brought by Liberty sought to challenge the use of facial recognition cameras by the South Wales Police in the UK. In Sweden, the data protection authority (Datainspektionen) found that a school’s use of facial recognition technology to register student presence violated various aspects of the General Data Protection Regulation. Outside of Europe, pro-democracy protestors in Hong Kong have been recorded tearing down facial recognition towers in resistance to state surveillance.

During the jam, breakout groups mapped out the research and evidence that could be collected in the short and medium term to help build and develop strategic human rights litigation challenging the use of facial recognition technology. One room looked at the research and evidence that could assist litigation efforts to challenge the use of such technology by law enforcement. Another room examined what work could be done to help identify the ideal regulatory environments for taking cases against public and private actors’ use of such technology. Finally, another room considered more specifically what the role of law clinics might be in providing research or legal support for facial recognition cases.

At the end of the session, the breakout rooms reported back a number of specific research requests that could be taken on by academics, researchers or law clinics. For instance, one room sought more research on the “chilling effect” facial recognition technology has on rights such as freedom of association and freedom of expression. Another room identified the need for a comprehensive mapping of how judiciaries and legislators in different European jurisdictions regulate audio-visual surveillance, CCTV, and facial recognition technologies.

DFF will share these research requests with its network, including academics, researchers and law clinicians who joined the workshop on ‘connecting the field with academia’ earlier this year. We also welcome hearing from other academics or researchers who are willing to assist and collaborate with digital rights litigators on this important work. If that sounds like you, please do get in touch!

We hope that this design jam will be the starting point for collaborative research projects that can later inform and strengthen litigation challenging facial recognition software.