Facial Recognition Virtual Design Jam: brainstorming litigation to challenge use of facial recognition software
What are the research or evidence gaps when it comes to the legal implications of facial recognition technology? What research or support is needed in the short term to litigate against the use of such technology by law enforcement? Which jurisdictions are most cost-effective for facial recognition litigation? What can law clinics do over the coming year to assist in these kinds of cases?
These questions were considered during DFF’s virtual design jam on facial recognition this week. The design jam built on group discussions that arose during DFF’s workshop on ‘connecting the field with academia’ earlier this year. During that workshop, participants were eager to explore connections between the field and academia on a number of subjects, and some of these conversations gravitated towards facial recognition. The virtual design jam was an opportunity to take a further look at the questions posed above and to identify potential lines of collaboration and research.
Recent events in the news have drawn attention to the use of facial recognition technology by both public and private actors. A recent case brought by Liberty sought to challenge the use of facial recognition cameras by the South Wales Police in the UK. In Sweden, the data protection authority (Datainspektionen) found a school’s use of facial recognition technology to register student presence violated various aspects of the General Data Protection Regulation. Outside of Europe, pro-democracy protestors in Hong Kong have been recorded tearing down facial recognition towers, in resistance to state surveillance.
During the jam, breakout groups mapped out the research and evidence that could be collected in the short and medium term to help build and develop strategic human rights litigation challenging the use of facial recognition technology. One room looked at the research and evidence that could assist litigation efforts to challenge the use of such technology by law enforcement. Another room examined what work could be done to identify the most favourable regulatory environments for taking cases against public and private actors’ use of such technology. A third room considered more specifically what role law clinics might play in providing research or legal support for facial recognition cases.
At the end of the session, the breakout rooms reported out a number of specific research requests that could be taken on by academics, researchers or law clinics. For instance, one room sought more research on the “chilling effect” facial recognition technology has on rights, such as the rights to freedom of association and freedom of expression. Another room identified the need for a comprehensive mapping of how judiciaries and legislators in different European jurisdictions regulate audio-visual surveillance, CCTV, and facial recognition technologies.
DFF will share these research requests with its network, including the academics, researchers and law clinic members who joined the workshop on ‘connecting the field with academia’ earlier this year. We also welcome hearing from other academics or researchers who are willing to assist and collaborate with digital rights litigators on this important work. If that sounds like you, please do get in touch!
We hope that this design jam will be the starting point for collaborative research projects that can later inform and strengthen litigation challenging facial recognition software.