CJEU says upload filters must respect user rights – but what if they don’t?
Two years after the adoption of the EU Directive on Copyright in the Digital Single Market (DSM Directive), the EU’s highest court has ruled on the compatibility of its most controversial provision with the Charter of Fundamental Rights.
In its judgment in case C-401/19, the European Court of Justice finds that Article 17 of the DSM Directive, which requires certain online platforms to use upload filters to prevent copyright infringements by their users, can be reconciled with users’ right to freedom of expression under Article 11 of the EU Charter of Fundamental Rights. To meet the requirements of the Charter, however, the Member States and national courts must interpret the provision in a manner that prevents legal uploads, such as uses under copyright exceptions, from being blocked.
This is easier said than done: years of experience with the voluntary use of upload filters, such as Audible Magic, YouTube’s Content ID, or Facebook’s Rights Manager, have shown that copyright enforcement algorithms routinely block legal forms of expression, such as parodies or quotations, users’ recordings of public-domain classical music, or original works that are subject to false copyright claims. In some cases, these vulnerabilities have even been deliberately exploited by law enforcement officers trying to suppress live streams of police operations by playing protected music in the background. Article 17 has long been criticized for making these error-prone upload filters mandatory, which triggered the Polish government’s challenge of the provision before the CJEU on freedom of expression grounds. This article examines the opportunities and challenges that arise from the judgment for strategic fundamental rights litigation.
Platforms can’t take on the role of judges
While the Court upheld Article 17 in its recent judgment delivered on April 26, 2022, it made it very clear that it is the job of the EU Member States to ensure that the use of upload filters is limited to situations where the risk of blocking legal content is minimal. Platforms must be able to fulfil their obligations under the law without having to make independent assessments of the legality of user uploads – otherwise, the national implementations of Article 17 would violate the ban on general monitoring obligations.
Consequently, of the roughly half of the EU Member States that have transposed Article 17 into national law to date, very few – if any – meet the requirements set by the Court. Germany and Austria are the only Member States that have defined any ex-ante safeguards against the automated blocking of legal user uploads. Most Member States have simply restated the provisions of Article 17 in their national laws, which the CJEU appears to deem insufficient to protect users’ right to freedom of expression. Some Member States, like Italy and Spain, have even deviated from the wording of Article 17 to the disadvantage of platform users, by requiring that content remain blocked even while a user complaint about a potentially wrongful blocking decision is under investigation.
Bringing national laws in line with freedom of expression
The CJEU judgment is to be welcomed for clearly ruling out blocking requirements that restrict legal forms of expression by platform users. However, it has also created an untenable situation: most national implementations of Article 17 are clearly in violation of the right to freedom of expression as interpreted by the Court, yet the judgment does not automatically render those national laws inapplicable – it merely requires national courts to interpret them in line with the Article 17 judgment. And most user rights violations never have their day in court. Instead, when users are confronted with opaque blocking decisions by platforms, they are often unaware of their rights or lack the resources and motivation to take legal action.
The task of ensuring that national laws are brought back in line with the Charter then falls to civil society. Similar to civil society’s strategic litigation efforts against national data retention laws, it will be necessary to take legal action in individual Member States whose upload filter provisions lead to the blocking of legal uploads. The following strategies can be employed to ensure that the standards set by the CJEU for the protection of user rights will prevent infringements of freedom of expression in practice.
Strategic litigation against national upload filter laws
The most obvious action against a national law that violates fundamental rights is a constitutional complaint. However, the admissibility requirements for a constitutional complaint vary depending on the Member State in question. All Member States have an equivalent of the right to freedom of expression and information enshrined in their national constitutions, and some constitutional courts also accept complaints based on violations of the Charter. The complaint will have to be brought by an entity that is directly affected by the law, such as a user whose legal upload has been blocked, or a platform operator that is required to use upload filters or to independently assess the legality of user uploads.
Before filing a constitutional complaint, national procedural law may require the affected party to first seek redress through the civil courts. Such civil law proceedings also open the opportunity for preliminary questions to the European Court of Justice. When the outcome of a national court case depends on the interpretation of EU law, the national court of last instance is required to refer the question to the European Court of Justice for a preliminary ruling. This may be a useful strategy to encourage the CJEU to expand on the nature of the obligations that the Member States have when transposing Article 17 into national law, as it is far from obvious how they are supposed to prevent the blocking of legal uses by upload filters in practice.
Beyond constitutional complaints
The opportunities for redress through the civil courts vary depending on the affected party who is initiating the litigation. A platform operator who considers that the obligations placed on it by national transpositions of Article 17 are disproportionate may have few options available to it other than refusing to implement a blocking obligation that fails to meet the requirements of the Charter at the risk of being targeted by litigation from rightsholders. Given the drastic consequences of a failure to comply with Article 17 obligations – direct liability for users’ copyright infringements – and the comparatively high claims for damages in copyright law, this is a risky strategy for platform operators, especially smaller ones. Depending on the national legal system, platform operators may have options for preventive legal protection at their disposal.
In most cases, civil society is more likely to approach the issue of overblocking in partnership with affected platform users, rather than platform operators. Article 17 itself opens the possibility for users to take legal action against platform operators in cases of overblocking, by requiring platforms to guarantee in their terms and conditions users’ rights to exercise exceptions and limitations to copyright. Platform users affected by overblocking in any Member State that has transposed Article 17 into national law should therefore be able to sue a platform either for violating its own terms and conditions or for failing to design its terms and conditions in accordance with the law.
Action against false copyright claims
Article 17 is less prescriptive when it comes to legal action against false or overbroad copyright claims by (alleged) rightsholders. Both users who are adversely affected by overblocking and platforms that receive unsubstantiated blocking requests may have an interest in putting a stop to this practice. Unfortunately, Article 17 only abstractly requires that the cooperation between rightsholders and platforms may not result in the blocking of legal content, without defining enforcement mechanisms to achieve that goal. Article 17 further requires rightsholders to submit the “necessary and relevant” information to platforms.
Following the CJEU’s clarification that platforms may not be required to make an independent assessment of legality, it appears that rightsholders may have to do more to substantiate their blocking requests than simply identify the works in which they hold exclusive rights. Where rightsholders fail to meet those requirements, national civil law may offer those adversely affected by false copyright claims mechanisms to challenge the unjustified invocation of intellectual property rights in court. However, when the damage suffered by platform users is difficult to quantify in material terms – because it is primarily the exercise of their freedom of expression that is curtailed, rather than any economic activity – those civil law mechanisms may be found lacking. The main goal of invoking such mechanisms may therefore not be obtaining damages in the individual case, but rather bringing the important open questions over the compatibility of existing national Article 17 implementations before the CJEU for a preliminary ruling.
Making use of national Article 17 safeguards
While most Member States have failed to provide users with meaningful safeguards against overblocking, a few instruments can be found in the German implementation of Article 17 that may be useful to civil society in safeguarding user rights. The main goal of invoking those safeguards would be to ensure that the use of upload filters is limited in practice to meet the requirements of the Court, rather than to challenge the validity of the national copyright law as such. The most notable provision in that regard is a right to collective redress that user rights organisations can invoke against platforms in cases of systematic overblocking. Another important element is the right of researchers to obtain information on the content moderation practices of platforms – after all, whether the user rights safeguards are sufficient to prevent the blocking of legal content is an empirical question. While there is plenty of anecdotal evidence of overblocking, a systematic assessment of the extent of the problem requires platform data: only the platforms know how often user uploads are blocked before publication.
It will probably take years of litigation to ensure that users’ rights under the new EU copyright law are enforced in practice. At this point, the Court’s requirement that upload filters may not lead to the blocking of legal content appears largely aspirational. Nevertheless, by defining strict fundamental rights limits on filtering obligations, the judgment forms a basis for strategic litigation to protect freedom of expression against algorithmic overblocking.
Felix Reda is a copyright expert and project lead at Gesellschaft für Freiheitsrechte (GFF).