Access to justice in data protection: the more things change, the more they (seem to continue to) stay the same

By Cesar Manso Sayao, 25th May 2023

Artwork by Cynthia Alonso for the Digital Freedom Fund

As today marks the fifth anniversary of the General Data Protection Regulation’s (GDPR) entry into application, it seemed apropos to also publish the second part of our “Access to justice in data protection” blog series. The first part of this series came in the wake of Data Protection Day on the 28th of January, after a workshop on access to justice in data protection that we held in Dublin in December of last year.

In this previous instalment, we focused on some of the topics discussed in our workshop, including major bottlenecks for GDPR enforcement such as data protection authority (DPA) inertia and the lack of harmonised procedures in cross-border cases, as well as some noteworthy events that occurred during that time, such as the European Commission’s publication of a draft adequacy decision for transatlantic data flows between Europe and the United States, and the Irish Data Protection Commission’s (DPC) decisions on Meta’s unlawful personalised advertising.

One recurring theme throughout that blogpost, which we will now revisit in this one, is probably best described by French satirist Jean-Baptiste Alphonse Karr’s famous aphorism “plus ça change, plus c’est la même chose” (the more things change, the more they stay the same), which rings particularly true when thinking about access to justice in the data protection space.

Let’s take a look, for example, at some recent developments regarding the topics outlined above, two of which were announced precisely to coincide with the GDPR’s fifth anniversary. The first has to do with the lack of harmonised procedures in cross-border cases. In a statement published yesterday ahead of the GDPR’s anniversary, the European Commission announced that it will soon propose new legislation to address these shortcomings. Although this could help resolve some aspects of this particular enforcement choke-point, it remains to be seen whether more regulation will actually lead to better enforcement. We might just end up with more ineffective regulation and no significant change in the data protection enforcement landscape, as has been the case with the GDPR until now.

The second is the DPC’s decision and record-breaking EUR 1.2 billion fine on Meta over its transfers of personal data between Europe and the United States, which was strategically delayed until this week and issued just a few days ago. Although this fine might be the highest in GDPR enforcement history, it still represents only around 1% of Meta’s annual revenue, a far cry from the maximum fine of 4% of global annual turnover applicable under the regulation. Furthermore, considering what it has taken to finally arrive at this decision, it is more an example of enforcement not working, and does nothing to change the fact that the GDPR, once heralded as a gold standard, has in the end turned out to be more of an enforcement flop.
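To put those percentages in perspective, here is a rough back-of-the-envelope calculation. The inputs are our own assumptions (Meta’s publicly reported 2022 revenue of roughly USD 116.6 billion, converted at an assumed rate of about 1.08 USD per euro); the exact turnover figure regulators would use may differ:

```python
# Rough comparison of the EUR 1.2 billion fine against the GDPR's
# ceiling of 4% of total worldwide annual turnover (Art. 83(5)).
# Assumed inputs: Meta's reported 2022 revenue of ~USD 116.6 billion,
# converted at an assumed exchange rate of ~1.08 USD per EUR.

fine_eur = 1.2e9                         # the DPC's record fine
revenue_usd = 116.6e9                    # assumed annual revenue
usd_per_eur = 1.08                       # assumed exchange rate
revenue_eur = revenue_usd / usd_per_eur  # ~EUR 108 billion

max_fine_eur = 0.04 * revenue_eur        # theoretical 4% ceiling

print(f"Fine as share of revenue: {fine_eur / revenue_eur:.1%}")    # ~1.1%
print(f"Theoretical 4% ceiling:  EUR {max_fine_eur / 1e9:.1f} bn")  # ~EUR 4.3 bn
```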

Meanwhile, Meta has announced that it will appeal the decision, which would buy it time to eventually base future data transfers on the upcoming EU-U.S. Data Privacy Framework. However, the adequacy decision on which this data transfer regime is based has recently been heavily criticised by the European Parliament, and if there is no significant shift in U.S. mass surveillance powers, it is likely to eventually be invalidated by the Court of Justice of the European Union (CJEU), just like the two previous frameworks, in what is now starting to seem like a prelude to a Groundhog Day-esque tragicomedy.

In the end, despite these recent developments and the many that have preceded them in the GDPR’s five-year traverse, there has been no significant change in the protection of personal data transferred from the EU to the U.S. vis-à-vis the latter’s surveillance infrastructure and powers, nor any level of enforcement that has led to systemic change in Big Tech’s intrusive and extractive data practices, let alone a threat to its business model.

However, with the transposition of the EU Collective Redress Directive, which recently entered into force, representative actions will offer a new legal avenue that might finally unlock meaningful GDPR enforcement by harnessing collective power before the courts, instead of through mostly ineffectual decisions and fines issued by DPAs. During the remainder of this year and throughout next year, collective redress will be one of the main focus areas of our events, so watch this space if the topic interests you.

Artwork by Noa Snir

Broader scope, similar dynamics

Although the GDPR enforcement issues outlined above are evidently very prominent in discussions around access to justice in data protection, they are not the only ones. During our workshop at the end of last year, we also sought to broaden the scope and look at this topic in a different light, with a stronger focus on how historically oppressed and marginalised communities are disproportionately affected by violations of their data protection rights. To that end, we held two peer-led sessions within the event: one on social protection and one on gender-based violence.

On the topic of social protection and the digital welfare state, our workshop included a presentation by Privacy International (PI) on a recent case in the UK involving the National Fraud Initiative (NFI). In August 2022, after a public consultation process, the Government decided not to extend the NFI’s existing data matching powers to new purposes, thereby limiting the state’s power to carry out risk profiling to detect fraud in social services. This was initially met with a positive response by the digital rights community. However, this ephemeral sign of progress has since been undermined. Last December, a report by PI – one of the four public interest groups which responded to the aforementioned consultation – highlighted the fact that shortly after the decision not to extend the NFI’s matching powers, the government launched the Public Sector Fraud Authority (PSFA), a brand new “fraud squad” that will introduce a National Counter Fraud Data Analytics Service (NCFDAS).

As PI concludes in their report: “Subjecting communities in vulnerable situations to ever-evolving, poorly scrutinised and little publicised digital tools to facilitate their surveillance amidst a cost of living crisis is far from a vision of the economy that aims to support those most in need.” We wholeheartedly agree.

This is reminiscent of the SyRI case in the Netherlands, where a landmark ruling struck down the Dutch government’s use of the welfare fraud risk-scoring algorithm “SyRI” on the grounds that it violated the right to privacy and unfairly targeted people based on the neighbourhoods they lived in and their socio-economic background. Nevertheless, despite this important legal precedent (and the political landscape-shattering childcare benefits scandal which preceded it), predictive policing through risk scoring is once again rearing its ugly head with a vengeance in the Netherlands, through an even more invasive system dubbed “Super SyRI”.

As mentioned above, our workshop also included a session on gender-based online violence, focusing on its intersection with privacy and data protection rights in areas such as image-based sexual abuse, doxing, impersonation, hacking and cyber-stalking, where access to justice also remains mostly elusive. One of the issues addressed in the session was the addition, in June of last year, of “revenge porn” to the Greek Criminal Code. To some, this might seem like a pivotal legal development to combat online image-based violence and help pave the road to justice for victims/survivors.

However, according to DATAWO – a Greek NGO working in this area – the narrow conceptualisation of “revenge porn” in the law overlooks the more common, everyday forms of image-based sexual abuse that actually occur in or originate from digital spaces. Furthermore, this type of measure also disregards the recommendations of specialised international bodies on these matters, which call for a multisectoral approach instead of a merely punitive one when it comes to gender-based online violence.

DFF Access to Justice in Data Protection workshop in Dublin, 2022

For example, a submission from the Due Diligence Project to the Office of the High Commissioner for Human Rights clearly lays out what access to justice on this issue should look like, stating that “merely criminalizing online violence does not necessarily provide the remedy required by online violence victims/survivors. Experience has shown that women’s access to justice should be a mix of criminal, civil and administrative processes and include the areas of all the 5Ps, namely in prevention of online violence; protection of victims/survivors; prosecution and punishment of perpetrators and provision of redress and reparation for the victims/survivors.”

The UN Special Rapporteur on violence against women and girls also issued a very comprehensive report in 2018 on online gender-based violence from a human rights perspective, with recommendations for States – which echoed those proposed above by the Due Diligence Project – as well as for internet intermediaries. Nevertheless, although the issue has received this type of high-level recognition, appropriate measures and safeguards are still missing in jurisdictions across the board.

Risk profiling in the context of the digital welfare state and gender-based online violence are not only examples of how issues around access to justice in data protection can directly affect specific marginalised communities, but also of how they do so intersectionally.

For example, in the SyRI risk scoring case, the Dutch court concluded that by targeting poor neighbourhoods, the system could also be discriminatory on the basis of race, socio-economic or migrant status. And when it comes to gender-based online violence, recent studies show the disproportionate negative impact that these dynamics have on, for example, Black women or LGBTQI+ communities.

We feel it is crucial to highlight these very tangible harms to historically oppressed or marginalised communities in order to make them visible and paint a more complete picture of what a lack of access to justice in data protection looks like. At the same time, by bringing these issues to the fore in our workshop, we hope the digital rights community can increasingly engage in solidarity with those affected by these egregious data-driven harms, while mobilising to achieve justice by embracing more community-centred approaches.

Ultimately, the end goal should be to generate greater collective awareness and community power in order to spearhead data protection enforcement and access to justice, which can in turn materialise into significant societal change, especially when it comes to the data processing practices of Big Tech and governments. After five years of mostly flawed GDPR enforcement, we have yet to see any of this meaningfully occur.