Digital Rights Are Children’s Rights: World Children’s Day
On 20 November, we mark World Children’s Day and celebrate the work that children and young people have been doing to call out injustice and challenge human rights violations across the globe.
From tackling climate change to fighting food poverty, children and young people have been at the forefront of human rights activism, campaigning, and advocacy. They have even leveraged the opportunities presented by strategic litigation to push for change. Last year, for example, a ground-breaking case was taken to the European Court of Human Rights by six children and young people calling on thirty-three European governments to take urgent action on climate change.
Children and young people have lived their entire lives in the “digital age” and are often the first adopters of new technologies. These technologies can offer great opportunities for the full realisation of their rights, from the rights to education and freedom of expression to the right to equal participation in political and public affairs.
Nonetheless, digital technologies can pose distinct and serious risks of infringing these rights, and these risks are often further exacerbated by the huge disparity and inequality that exists between children in access to technologies, digital literacy, privacy, and safety. Many of these risks flow from the “datafication” and “dataveillance” that children and young people experience in their daily lives.
For instance, children’s data has been exploited by the processing practices that underpin the surveillance capitalist business models of social media and other online platforms. Not only have these practices interfered with the private lives and data protection rights of children and young people on a global scale, but they have also fuelled the optimisation and targeting of content that has caused them distress, anxiety, and other psychological harms. And this is just one way in which the internet poses a risk to a young person’s safety and security.
Educational contexts have been transformed by digital technologies. Schools have demonstrated an increasing reliance on systems that process sensitive biometric data, including facial maps and fingerprints, for the purposes of monitoring attendance, processing lunch payments and heightening school security. Big tech companies are dominating EdTech (Education Technology) markets, and their products are often rolled out in environments where children’s digital rights are poorly enforced and where children have no meaningful oversight, autonomy or control over how their data is being used.
Furthermore, these products can often rely on statistical modelling, algorithmic profiling, and automated decision-making to allocate resources, develop curricula, “detect” exam cheating and “predict” academic performance. These unreliable and inscrutable systems can perpetuate discriminatory and exclusionary measures in a person’s formative years, such as denying them access to educational opportunities or supporting inappropriate and unjustified interventions by immigration and other public authorities.
Schools and other educational establishments have become hives for children’s data, which raises serious concerns about the necessity for this data, how it is secured, and how it is shared. With data processing taking place by both public and private entities in educational contexts, threats have emerged around the potential for children’s data to be unlawfully exploited for commercial, law enforcement, or immigration purposes.
The datafication of children also takes place outside of schools. Children and young people are exposed to the Internet of Things (IoT) in their family, home, and social life. Of particular concern is the emergence of connected toys and similar devices that are aimed at enabling play and learning but also use data-driven technologies to build detailed user profiles from an early age.
Our increasingly connected world has also facilitated a parental panopticon infrastructure, with apps offering tools ranging from geofencing and speed monitoring to remote activation of phone microphones to help parents spy on their children. Some even offer a “stealth mode” so that the child will “never find out that their parents are tracking them.” These infrastructures and tools, even though aimed at offering parents peace of mind, risk habituating children to a culture of constant surveillance.
These are just some of the threats posed to children’s rights in digital contexts. However, there is much work being done to push back against some of these trends.
Last year saw students protest and threaten legal action against the use of an algorithm to grade “A-level” exams in the UK. After hundreds of students stood in front of the Department for Education chanting “F**k the algorithm,” the government retracted the algorithmically determined grades. Two years before this, in France, the National Union of Students took legal action over an algorithmic process that was used to allocate high school students and other candidates to undergraduate places.
Legal action has also been taken to protect children’s digital rights in other educational contexts. Litigation taken last year by the Good Law Project, during a time when many schools were closed to children due to the pandemic, put pressure on the UK government to facilitate access to education for those without access to digital devices and/or an internet connection at home. defenddigitalme continues its legal fight against the UK Department for Education in relation to the National Pupil Database, a database built on the school records of some 23 million pupils, and its sharing of school data with third parties.
Action is also being taken against the role of big tech in violating children’s digital rights. Last year, the former Children’s Commissioner for England, Anne Longfield, launched a legal action on behalf of millions of young people against TikTok. This case claims that children’s data, including their phone numbers, videos, location, and biometric data, is being processed by the platform without sufficient warning, transparency, or lawful consent.
A case has also been taken against TikTok by a group of parents in the Netherlands. Similarly, cases have been taken in a number of countries against YouTube, Google, Facebook/Meta, and gaming app creators for failing to respect and protect children’s privacy. In the US, litigation has also sought to hold big tech companies Apple, Google, Tesla, Dell, and Microsoft to account for the child labour that takes place in the cobalt mining operations in their supply chains.
This year has also seen welcome developments on the policy front, with the UN Committee on the Rights of the Child adopting a general comment on children’s rights in relation to the digital environment, and Germany passing legislation that adopts many aspects of this general comment, including the promotion of children’s engagement and participation in digital policy.
Nonetheless, there is still much work to be done to ensure that children’s rights are fully respected, protected, and promoted in digital and networked environments. As noted above, technologies and digital platforms that children are regularly exposed to, including those designed for and targeted at them, too often fail to prioritise the best interests of the child.
Furthermore, those involved in designing, developing, or implementing such technologies must provide children with the opportunity to participate in these processes. This also goes for the regulation of these technologies, including through the courts. Children and young people should not face barriers or obstacles to accessing justice and participating in litigation that seeks to vindicate their human rights. Digital rights are also children’s rights, and DFF is open to supporting the strategic litigation work that might be taken by child activists and their allies in the future.