Articulating litigation goals for challenging the collection of biometric data
By Alan Dahi, 24th October 2019
There were many productive sessions at the two-day meeting in Berlin on unlocking the strategic litigation opportunities of the GDPR, hosted by DFF in September 2019. One of these was on articulating litigation goals to challenge the collection of biometric data.
The GDPR defines biometric data as “personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data” [emphasis added]. The GDPR treats biometric data as a special category of personal data, subject to particularly strict safeguards on its use.
At first glance, it is perhaps surprising that only personal data “resulting from” certain technical processing is covered by the definition, and not the underlying personal data itself. However, this approach can be understood against the background that photographs would otherwise qualify as biometric data. If they did, the special safeguards the GDPR places on the unlawful use of “special categories of personal data”, such as biometric data, would undermine the societally established – and typically accepted – widespread use of photographs.
While acknowledging that the GDPR permits the processing of biometric data, the session in September considered which uses of biometric data should be acceptable from a societal or moral perspective, independently of the current legislative reality. Opinions ranged from its use never being acceptable (at least until the societal ramifications are better understood and placed on a solid legal basis) to a more differentiated approach depending on the type of biometric data, the intended purpose, and the parties involved.
Ultimately, and in light of the current legislative framework that permits the limited use of biometric data, the group focused on evaluating the potential privacy harms that different types of biometric data may cause across various scenarios.
The session came to the conclusion that biometric data based on facial recognition deserves particular focus. This is because faces are typically readily visible, which makes their collection for biometric purposes very easy. Indeed, facial features can usually be collected without the affected individual’s knowledge or awareness, and there are few practical ways for an individual to prevent such collection. This contrasts with the more physically intrusive collection of fingerprints, which generally requires the individual’s co-operation and against which an individual is better positioned to protect themselves or to challenge the data processing.
Considering different scenarios surrounding the use of biometric data, the overall consensus was that the forced provision or collection of biometric data is generally unacceptable, particularly as a condition of access to government and private services, and especially where the biometric data of children is concerned.
The session, which consisted of participants from five different organisations, ended with a clearer understanding of how to articulate litigation goals to challenge the collection of biometric data and with a practical road map on how to put these goals into action.
About the author: Alan Dahi is a data protection lawyer at noyb.