Access to justice in data protection: the more things change, the more they (can also) stay the same.
This blog post is part of a three-part series on access to justice in data protection and a recent DFF workshop held in December 2022 which focused on this topic. You can check back here in the following weeks for the next instalments.
As Data Protection Day rolled around again this past 28th of January, the topic enjoyed its customary surge in visibility in the wake of its commemorative date. For a few days, its trending levels might even have rivalled the steamrolling hype machine around AI and ChatGPT making the rounds lately.
By some accounts, and based on some recent developments outlined below, 2023 might be a momentous year for data protection. However, as you read this, many of you might immediately think to yourselves that you have heard similar predictions thrown around many times before, perhaps every year around this date since 2016, when the General Data Protection Regulation (GDPR) entered into force.
Conversely, as the GDPR has gradually shown signs of not delivering on its original promise, another perennial feeling that repeatedly emerges in the data protection space paints a different picture.
This could probably be best described by French satirist Jean-Baptiste Alphonse Karr’s aphorism “plus ça change, plus c’est la même chose” (the more things change, the more they stay the same). This rings particularly true when thinking about access to justice in this field.
To be sure, in the year ahead there are certainly some interesting prospects and undeniable potential for significant progress in the data protection landscape. However, there is also plenty of room for healthy scepticism that any of this will generate meaningful shifts when it comes to the abusive, intrusive, and extractive data practices of Big Tech companies, or systemic changes that might result in effective protection of our fundamental right to privacy and the many other human rights that intersect with it.
For example, new and upcoming EU legal frameworks such as the European Declaration on Digital Rights and Principles for the Digital Decade, the Digital Services Act (DSA), the Digital Markets Act (DMA), the Collective Redress Directive and the Artificial Intelligence Act (AI Act), to name a few, could mean that new avenues to access justice in data protection are indeed on the horizon in 2023 and beyond. Additionally, the ripple effects of a series of decisions and events that took place in 2022 – which ended with quite a bang! – are still being felt and will probably shape further developments that unfold throughout the year.
On the flipside, it’s also true that many of the enforcement shortcomings and the myriad obstacles to access to justice that have haunted the GDPR throughout its existence remain firmly in place, with little to no end in sight. The same goes for the structural and material conditions at the root of most data-driven societal harms, especially those that disproportionately affect marginalised communities.
There are many examples of a recurring scenario in which what appear to be seismic events end up being more akin to a flash in the pan, while the status quo perpetuates itself pretty much unfettered, and structural power imbalances are further entrenched, reproduced or exacerbated through data processing practices.
Take for example what is probably the most paradigmatic case, and one of the most noteworthy developments to come about during the eventful last days of 2022. On the 13th of December, the European Commission issued its draft adequacy decision for transatlantic data flows, paving the way for a new EU-US Data Privacy Framework.
After two previous attempts were struck down by the Court of Justice of the European Union (CJEU), many civil society organisations in Europe and the US have already expressed doubts and concerns about the level of data protection offered by the Executive Order on which the agreement is based, vis-à-vis US intelligence agencies’ known and covert bulk data collection methods. Moreover, there are substantial differences between the two jurisdictions around essential concepts such as what constitutes a “court”, or the meaning of “proportionality”. Hence, it’s quite likely that, just like its two predecessors, this iteration of a data transfer regime will end up being challenged before the CJEU as well, in what has now seemingly become a Sisyphean endeavour.
The beginning of the end for surveillance capitalism, or just new beginnings for different means to the same end?
Another notable event that caused a commotion as 2022 came to an end was the news that the European Data Protection Board (EDPB) had issued decisions against Meta Platforms Inc. over its data processing practices for personalised ads. The decisions have now been published and impose fines totalling EUR 390 million between Facebook and Instagram.
Although these are among the highest GDPR fines ever imposed – and significantly higher than those set out by the Irish Data Protection Commission (DPC) in its initial draft decisions – they amount to little more than slaps on the wrist compared to the revenue Meta has obtained through its – now deemed illegal – personalised ads over the past four and a half years. Had the fines been based on the 4% global turnover threshold established by the GDPR, it has been calculated that they could have amounted to over EUR 4.36 billion. That would have constituted a truly “effective, proportionate and dissuasive” fine in line with the spirit of Article 83 of the GDPR.
Surprisingly – or maybe not so much so – the DPC did not take Meta’s revenue into consideration, claiming it was “unable to ascertain an estimation of the matters”, despite the fact that this information is publicly available, and that the DPC in any case has the power to obtain it under Article 58(1) of the GDPR.
The notion that this soft-touch approach by the DPC is the underlying factor behind Big Tech’s forum shopping when choosing Ireland as their main establishment in the EU is of course nothing new, and this case only reinforces it.
However, what sets this case apart from others in which Big Tech companies have racked up millions of euros in fines is that the decision also directs Meta to bring its practices into compliance with the GDPR within three months, specifically by allowing users to opt in or out of personalised ads, striking a major blow to its business model. Some have gone as far as to claim that this is effectively a slow death sentence for surveillance capitalism.
On the other hand, others are less optimistic and predict that this might actually cause a spike in ads, and at the same time force companies to devise more innovative ways to spy on us. This could result in an increased reliance on inferences based on machine learning, which can sometimes be alarmingly accurate – even with access to limited data and privacy protections in place – while also generating proxy discrimination.
A test case to zoom out and focus on broader access to justice issues
The Meta case also raises broader access to justice issues that tend to be prominent in the data protection field as a whole. To begin with, it’s worth noting that the initial complaints in this case were filed on the 25th of May 2018, and it took more than four years for the DPC to finally issue a decision. This is compounded by the fact that digital rights organisation noyb – which filed the complaints – contends that the case itself was “about a rather simple legal question.”
Data protection authority (DPA) inertia is one of the main bottlenecks in GDPR enforcement, and a predicament faced by complainants in many jurisdictions.
A case in point is noyb’s litigation in Luxembourg to try to tackle this issue. However, the DPC is particularly notorious for underperforming when it comes to handling and deciding cases, as civil society and government reports have previously shown.
To remedy this situation, and to support the DPC with its growing workload, the Irish Minister for Justice announced in July 2022 that two additional commissioners would be appointed in early 2023. Although the current commissioner, Helen Dixon, will remain as Chairperson, this appears to be a crucial opportunity to inject fresh blood into the organisation, which could in turn generate some changes in enforcement culture within the DPC. We highly encourage lawyers, activists, civil society organisations and journalists to watch that space.
Additionally, during the last days of 2022, another noteworthy decision was issued by the European Ombudsman in response to a complaint filed by the Irish Council for Civil Liberties (ICCL) calling on the European Commission to monitor the progress of GDPR cases. The Ombudsman issued a set of recommendations that were well received by the Commission, which recently announced that it will monitor all large-scale cross-border GDPR cases across EU jurisdictions. According to a statement by the ICCL, this “heralds the beginning of true enforcement of the GDPR, and of serious European enforcement against Big Tech.”
Coming back to the Meta case, it’s also deeply troubling that the DPC repeatedly denied noyb many of its party rights during the course of the proceedings on grounds of “confidentiality”.
The topic of unclear and unharmonised procedural rights is another access to justice chokepoint that must be urgently addressed, as the EDPB itself has recognised in a recent letter to the EU Commission.
And although it seems that winds of change are starting to blow in Sweden when it comes to the recognition of party rights under the GDPR, it remains to be seen whether this will have a spillover effect on other neighbouring DPAs and courts across the North Sea.
Additionally, at first glance the Meta decision might seem to offer a glimpse of light at the end of the tunnel in terms of functional cross-border cooperation between DPAs. However, upon closer examination, it becomes evident that what the EDPB’s decision actually reveals is a fracture between the DPC and its fellow DPAs across Europe.
To begin with, the DPC’s initial draft decisions deviated considerably from the EDPB’s decision, both in the amount of the fines and, most importantly, on the question of Meta’s reliance on its terms and conditions – rather than consent – as a legal basis for its processing activities.
Furthermore, the DPC claimed in its press release announcing the decision that the EDPB had overreached by directing it to “engage in open-ended and speculative investigation” spanning all of Facebook’s and Instagram’s data processing operations, and announced that it would seek to annul the EDPB’s decision before the CJEU.
If this challenge succeeds, the result could be that we go from what has been lauded as a landmark decision back to square one, with things mostly carrying on as they have been – which, by now, has become an overarching theme if you have read this far.
Therefore, it’s important to acknowledge that in order to really confront Big Tech’s business model and the underlying extractive surveillance capitalist ideology that drives it, other strategic work and activism must be carried out in parallel, and on many different fronts.
The adoption of community-driven practices like movement lawyering to harness grassroots power around these issues, mobilising around radical empowerment ideals, or using frameworks such as the commons as viable alternatives are just a few examples of how we can leverage the momentum generated by successful legal challenges like this one if we aim to effect meaningful and long-lasting systemic change – assuming, of course, that this is actually the end game the digital rights community should ultimately be pursuing. Otherwise, these opportunities will likely become little more than exercises in ineffectual technocratic fixes and adaptive compliance practices devised by Big Tech and those who wield disproportionate power or influence in our digitised society.
Let us not forget that this is precisely what got us here in the first place, when Meta bypassed consent through its terms and conditions as a deliberate manoeuvre to comply with the GDPR as soon as it became enforceable. While the legal basis Meta formally relied upon shifted, its processing activities did not, and its business model has, until now, remained unscathed. Which, in the end, is proof that the more things change, the more they can also stay the same.
DFF recently held a workshop in Dublin titled “Access to Justice in Data Protection” where civil society organisations convened to share information and strategise around the issues outlined above, as well as a wider range of important topics in the field. You can learn more about what transpired during the workshop in the next instalments of this blog series, as well as by checking out this video.
César Manso-Sayao is the Legal Officer at Digital Freedom Fund. He is a law graduate from the University of Costa Rica, and holds two Master’s degrees from the University of Barcelona, one in Human Rights and the other in Sociology.