Protecting Children’s Digital Rights in Schools

By Nani Jansen Reventlow, 30th July 2020

This post was written in association with The State of Data 2020, an event organised by defenddigitalme, an organisation that advocates for children’s data privacy and digital rights.

In recent years, cases have been decided across Europe that raise serious issues about the digital rights of children in schools.

Unsurprisingly, schools are not immune to the increased digitisation sweeping through our societies. Around Europe, many schools have already begun to introduce tools that collect and process students’ biometric data, such as facial images and fingerprints, as well as other sensitive personal data.

It sounds worrying – and it is. Luckily, however, children’s personal data has special protection under the GDPR.

Processing children’s personal data in the education context isn’t entirely off limits under this regulation. Schools may use such data where there is a good reason to do so: for example, where processing is in the interest of the child or the public. This could include monitoring immunisations, keeping attendance, or developing online learning platforms that further the education of the child.

But there are, of course, caveats. For one, the GDPR sets a higher standard of justification for processing people’s unique biometric data, given that this is an extremely sensitive and invasive form of data collection. And if schools fail to keep their students’ data secure, or collect more data than they actually need, they can end up breaching the GDPR.

As we’ll see below, real-life examples of such breaches have already begun to surface.

Taking Attendance through Facial Recognition

First we’ll go to Sweden, where the Secondary Education Board in the municipality of Skellefteå began using facial recognition in 2018 to monitor attendance at a secondary school.

In this case, 22 students were monitored by cameras as they entered the classroom every day. The purpose of this experiment was to see if automation could save time in registering students’ attendance. The school hoped that the time saved could then be used for teaching. The students’ facial images, as well as their first and last names, were processed and stored, with consent from their guardians.
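
To make concrete what this kind of processing involves, here is a minimal, hypothetical Python sketch of attendance-by-face-matching. It assumes the open-source face_recognition library and uses invented names and file paths; the system actually trialled in Skellefteå has not been published, so this is purely illustrative.

```python
# Purely illustrative sketch of attendance via face matching; the actual
# Skellefteå system is not public. Assumes the open-source `face_recognition`
# library (https://github.com/ageitgey/face_recognition).
import face_recognition

# Enrolment links each student's first and last name to a stored facial
# image: exactly the biometric/identity pairing the GDPR treats as sensitive.
ENROLMENT = {  # invented names and paths
    "Anna Andersson": "enrolment/anna.jpg",
    "Erik Eriksson": "enrolment/erik.jpg",
}
known = {
    name: face_recognition.face_encodings(face_recognition.load_image_file(path))[0]
    for name, path in ENROLMENT.items()
}

def register_attendance(classroom_photo: str) -> set:
    """Return the enrolled students whose faces appear in the photo."""
    image = face_recognition.load_image_file(classroom_photo)
    present = set()
    for encoding in face_recognition.face_encodings(image):
        for name, reference in known.items():
            # compare_faces returns [True] when the two faces are close enough
            if face_recognition.compare_faces([reference], encoding)[0]:
                present.add(name)
    return present

print(register_attendance("camera/monday_morning.jpg"))  # invented path
```

Even this toy version makes the data protection issue visible: to save a few minutes of roll call, a permanent store of facial templates tied to named children has to be created and kept secure.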

Similarly, in France in 2019, a regional authority launched an experimental security project in two high schools – one in Marseille and one in Nice – installing facial recognition gates at the entrance of the schools. The gates identified students using facial images, and also scanned for unidentified visitors. The students whose data would be processed had given their consent.

The outcome? Both the Swedish and French Data Protection Authorities found that using facial recognition technology to monitor students breached the GDPR, specifically Article 5 and Article 9.

Under Article 5, personal data processing must be “adequate, relevant, and limited” to what is necessary to carry out a specific purpose.

Article 9, on the other hand, makes clear that processing biometric data is permissible only under strict conditions. One of those conditions is “explicit consent” – something that both schools used to justify their experiments. However, due to the imbalance of power between the data subject (the students) and the data controller (the school), both data protection authorities rejected the argument that consent had been given freely.

As well as this, both authorities found that facial recognition technology infringed upon the personal integrity and data protection rights of students to a degree that was disproportionate to the purposes being pursued.

The Swedish data protection authority went further still, finding that, given how little is known about the potential risks and consequences of deploying facial recognition, the school should have consulted the authority for advice before using it.

Fingerprint Scanning for Student Meals

Another key case occurred in Poland, and concerned not facial recognition, but fingerprint collection.

Starting in 2015, Primary School No. 2 in Gdansk processed the fingerprints of 680 children as a method of verifying meal payment at the school canteen.
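
To illustrate the kind of processing at stake, here is a minimal, hypothetical Python sketch of fingerprint-verified meal payment. The data layout and the templates_match function are invented; real systems compare fingerprint templates using specialised matching algorithms, and the school’s actual implementation is not public.

```python
# Purely illustrative sketch of fingerprint-verified meal payment; the Gdansk
# school's actual system is not public. `templates_match` stands in for a real
# fingerprint matcher (real systems use specialised template-matching algorithms).
from dataclasses import dataclass

@dataclass
class Student:
    name: str
    fingerprint_template: bytes  # biometric data uniquely identifying the child
    meals_paid: int

ENROLLED = [  # invented data
    Student("Jan Kowalski", b"template-1", meals_paid=5),
    Student("Maria Nowak", b"template-2", meals_paid=0),
]

def templates_match(a: bytes, b: bytes) -> bool:
    return a == b  # placeholder for a real fingerprint-matching algorithm

def verify_meal(scanned: bytes) -> str:
    """Check the scanner reading against every enrolled child's stored template."""
    for student in ENROLLED:
        if templates_match(student.fingerprint_template, scanned):
            if student.meals_paid > 0:
                student.meals_paid -= 1
                return f"{student.name}: meal approved"
            return f"{student.name}: no meals left on account"
    return "unknown fingerprint"

print(verify_meal(b"template-1"))  # -> "Jan Kowalski: meal approved"
```

Note that nothing about the payment check itself requires biometrics: swapping the template for a card number or PIN would verify the same account with no sensitive data involved.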

Once again, the school had obtained consent from the guardians of students whose biometric data was being processed. Four children opted out, and paid using an alternative method. However, these students had to wait at the back of the lunch queue until all the students using biometric data verification had passed through the canteen.

Just as the Swedish and French schools had done, the Polish school argued that it had obtained “explicit consent.” But, once again, the Polish data protection authority rejected this argument. This time, the authority found that, because students who didn’t consent had to wait at the back of the line, the school was treating them unequally – thereby placing pressure on students to consent.

As well as this, the authority found, once again, that biometric data collection wasn’t necessary in this case, given that there were alternative, less invasive methods for students to verify their meal payments.

Insecure Apps for Parents, Students and Teachers

Moving on to Norway, the Education Agency of the Municipality of Oslo launched an app in 2019 that allowed parents and students to send messages to school staff.

Soon after the app launched, Aftenposten, a widely read Norwegian news outlet, broke the news that the app contained a major security vulnerability: anyone who could log in to the portal could potentially gain access to every communication that had been sent through the app – as well as the personal information of 63,000 schoolchildren in Oslo.
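
Aftenposten’s description – any logged-in user could potentially read everyone’s messages – suggests a classic broken access control flaw. The Python sketch below is an assumption about that general failure mode, not the Oslo app’s actual code: the endpoint checks that a user is authenticated, but never checks that the requested message belongs to them.

```python
# Illustrative sketch of the *class* of flaw reported (broken access control),
# not the Oslo app's actual code. Minimal Flask endpoints for comparison.
from flask import Flask, abort, jsonify

app = Flask(__name__)

MESSAGES = {  # invented data: message_id -> (owner_id, text)
    1: ("parent-17", "Emma is home sick today"),
    2: ("parent-42", "Can we discuss Jonas's homework?"),
}

def current_user_id() -> str:
    return "parent-17"  # stand-in for real session-based authentication

# Vulnerable pattern: being logged in is treated as sufficient.
@app.route("/messages/<int:message_id>")
def get_message(message_id):
    owner, text = MESSAGES.get(message_id, (None, None))
    if owner is None:
        abort(404)
    # Missing authorisation check: any authenticated user can walk through
    # message IDs and read every conversation in the system.
    return jsonify({"text": text})

# Fixed pattern: authorisation on every record, not just authentication.
@app.route("/v2/messages/<int:message_id>")
def get_message_authorised(message_id):
    owner, text = MESSAGES.get(message_id, (None, None))
    if owner is None:
        abort(404)
    if owner != current_user_id():
        abort(403)  # logged in, but not allowed to see this record
    return jsonify({"text": text})
```

This is exactly the kind of defect that pre-launch security testing – the step the Norwegian authority found missing – is meant to catch.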

The Norwegian data protection authority found that the Education Agency had breached the GDPR – Article 5(1)(f) – by launching the app without properly testing it to ensure the information on it would be stored securely.

The authority also found that the Education Agency breached Article 32 of the GDPR, which requires data controllers to ensure a level of security appropriate to the risk. In light of the special vulnerability of children, the Education Agency did not take sufficient measures to ensure the confidentiality and integrity of its messaging system.

As a result of these findings, the data protection authority imposed a fine on the Education Agency. It also imposed a similar fine on the Municipality of Bergen, where data security shortcomings exposed students’ personal data by allowing unauthorised users to access the schools’ administrative system, as well as the digital learning platform where students’ classwork, evaluations, and personal data were stored.

Recurring Patterns

In considering whether a school violated the data protection rights of students, the data protection authorities of Sweden, France, Poland, and Norway all drew attention to the relationship between a school and its students. Schools are in a position of authority when they monitor students’ data, and students are in a position of dependency. The power imbalance in this relationship will most often invalidate consent as a lawful condition for biometric data processing.

As well as this, the particularly sensitive and vulnerable nature of children’s data processed in a school context – often relating to issues such as health, identity and development – requires schools to protect students’ personal data with particularly secure technical measures.

Educational authorities carry out many tasks that require collecting personal data from students. Technologies, including biometric technologies like facial recognition and fingerprinting, present new opportunities for schools to expedite processes like attendance monitoring, school security, and lunch payment.

But the recent GDPR fines imposed on schools for implementing such programs underscore that schools cannot sacrifice the privacy of their students for expediency.

Whatever measures schools adopt, they must safeguard children’s rights and be necessary and proportionate to the purpose pursued. If there are alternative measures that do not require the processing of sensitive personal data, and those measures can achieve the same results, then the processing of sensitive personal data is not likely to be justified under the GDPR.