How Copyright Bots Are Governing Free Speech Online
Police officers playing pop music in an effort to interrupt live streams of police operations. Autocratic regimes using social media’s upload filters to block critical reporting. “Reputation management” companies making fake copyright claims to erase unflattering news reports about their clients from search results. Unidentified fraudsters blocking reporting on a political party’s event by posing as a TV station.
These are all real examples of state or private censorship based on automated copyright enforcement online.
Digital rights organisations are already alarmed by the increasingly common voluntary use of upload filters by social media companies to help copyright holders remove alleged infringements from their services more efficiently.
Due to the inherent inability of such filters to detect legal uses such as quotations or fair use, as well as the lack of authoritative information about who is a legitimate rightholder for which copyrighted work, the automated removal of legal expression by online platforms is a common occurrence.
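To make the structural problem concrete, here is a deliberately simplified sketch in Python. It is not any platform’s actual code: the names and the catalogue are hypothetical, and a cryptographic hash stands in for a real perceptual fingerprint, which matches similar rather than byte-identical content. The point it illustrates holds either way: such a filter can only answer “does this upload match a reference work?”, never “is this particular use legal?”.

```python
import hashlib


def fingerprint(clip: bytes) -> str:
    """Stand-in for a perceptual audio/video fingerprinting algorithm."""
    return hashlib.sha256(clip).hexdigest()[:12]


# Hypothetical reference catalogue supplied by rightsholders.
POP_SONG = b"...bytes of 'Pop Song'..."
CATALOGUE = {fingerprint(POP_SONG): "Pop Song (Label X)"}


def filter_upload(upload: bytes) -> str:
    """Block any upload whose fingerprint matches the catalogue.

    Note what is never checked: whether the matching material is a
    quotation, a parody, or incidental background music in a live
    stream. That legal context is invisible at this layer.
    """
    match = CATALOGUE.get(fingerprint(upload))
    return f"BLOCKED: matches {match!r}" if match else "ALLOWED"


# A live stream that merely contains the song is blocked exactly
# like a pirated copy would be:
print(filter_upload(POP_SONG))  # -> BLOCKED: matches 'Pop Song (Label X)'
```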
Digital rights NGO Electronic Frontier Foundation is cataloguing such (usually fully automated) wrongful removals in its Takedown Hall of Shame.
Now, a recent EU reform, the Directive on Copyright in the Digital Single Market, risks making the deployment of these technologies mandatory for a broad range of commercial platforms. This poses new challenges for fundamental rights activists and legal scholars, as traditional freedom of expression frameworks, which assume a direct state intervention causing the suppression of speech, are often too simplistic to deal with these multi-sided scenarios.
That’s why the Digital Freedom Fund devoted a workshop at its 2021 strategy meeting to the question: How can we use free speech frameworks to challenge censorship through copyright enforcement?
I had the pleasure of facilitating this workshop and sharing lessons learned from my own strategic litigation project “control ©”, hosted by German NGO Gesellschaft für Freiheitsrechte (GFF).
Conceptualising freedom of expression
Participants in the workshop examined the specific cases mentioned above to identify common threads and discuss potential avenues for litigation to protect the freedom of expression of internet users.
What all these cases have in common is that they take place within what Jack Balkin has termed the “free speech triangle”, that is, in multi-sided platform environments where online platforms act as a third player in freedom of expression issues next to the state and the individual. But in the copyright context, even this three-sided model is too simplistic, given that (real or purported) copyright holders, or even individuals playing copyright-protected music in a live-stream, now use the automated filters established for copyright protection to restrict internet users’ freedom of expression.
While the large number of actors involved in copyright censorship cases increases their complexity, it also opens up new potential avenues for litigation. Aside from constitutional complaints against laws that mandate the use of upload filters, litigation against private actors engaging in or facilitating private censorship through copyright enforcement tools is also conceivable.
Copyright law itself can be a basis for challenging wrongful removals of legal expression by upload filters. If the removal is based on a blocking request from an unauthorised third party, the real author of the work in question can invoke their right of attribution to take legal action against the impostor. However, this approach is only feasible if the impostor can be identified.
Copyright exceptions can also be invoked against the platform if the correct rightholder has requested the automated blocking of their work, but the copyright enforcement mechanism has failed to take into account legal uses such as quotation or parody. Unfortunately, even if the victim of unjustified blocking can show that their use is legal under copyright law, it is unclear whether a platform is required to protect it from being arbitrarily blocked by its upload filters.
For example, a platform may rely on its private terms and conditions, rather than the law, as a justification for removing content at the request of rightholders, and may ignore the existence of copyright exceptions in the process.
EU law also lacks a harmonised notice-and-action regime that could regulate platforms’ obligations to correct their blocking decisions, although the European Commission’s proposal for a Digital Services Act may soon change that.
Ironically, the otherwise rather problematic EU Directive on Copyright in the Digital Single Market (DSM Directive) will bring some much-needed clarity for users on this point, as it turns the copyright exceptions into enforceable users’ rights that platforms and rightholders must respect in their automated copyright enforcement systems.
Unfortunately, the directive fails to explain how this should work in practice, as upload filters are incapable of distinguishing between infringements and legal uses under exceptions. As our GFF study on the fundamental rights compliance of the DSM Directive concludes, simply stating that legal content must not be blocked, while leaving the design of these central fundamental rights safeguards to the EU Member States, does not meet the requirements of the Charter of Fundamental Rights.
The role of the state
For litigation purposes, it is important to identify whether state intervention has played a role in the blocking of content and to analyse whether such intervention – directly or indirectly – led to the suppression of speech, the traditional scenario of restrictions on the fundamental right to freedom of expression.
As long as laws such as the DSM Directive have not yet been applied and the use of upload filters by online platforms remains voluntary rather than a legal obligation, state intervention in these cases of suppression of speech is not self-evident.
Nevertheless, some of the specific cases examined in the workshop still involve state action. In the case of the exiled Turkish journalists whose YouTube channel was shut down as a consequence of multiple copyright strikes, the false copyright claims originated from the Turkish state TV company TRT.
The journalists suspect that the Erdoğan administration has been using state-owned TRT’s status as a large rightsholder, which gives it access to YouTube’s ContentID filtering system, to silence critical reporting on the government. In the multiple examples of police officers trying to interrupt live streams by playing pop music on their phones, participants found that the actions of those individual officers still constitute state action, even if the copyright enforcement system they try to exploit is used by social media platforms on a voluntary basis.
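The channel shutdown described above worked through the platform’s strike policy: YouTube, for instance, terminates channels that accumulate three copyright strikes. The toy model below (hypothetical names and simplified rules, not YouTube’s actual code) illustrates why such a system is attractive to a bad-faith claimant: a handful of accepted claims, founded or not, is enough to take an entire channel offline, because nothing in the escalation logic itself verifies that the claimant owns the rights it asserts.

```python
from collections import defaultdict

# Assumed threshold, modelled on YouTube's publicly documented
# three-strike policy; real rules (expiry windows, appeals) vary.
STRIKE_LIMIT = 3


class StrikeRegister:
    """Toy model of a platform's copyright strike system."""

    def __init__(self):
        self.strikes = defaultdict(list)  # channel -> list of claims

    def file_claim(self, claimant: str, channel: str, video: str) -> str:
        # Every accepted claim becomes a strike; ownership of the
        # asserted rights is taken on trust.
        self.strikes[channel].append((claimant, video))
        count = len(self.strikes[channel])
        if count >= STRIKE_LIMIT:
            return f"{channel}: TERMINATED after {count} strikes"
        return f"{channel}: strike {count} recorded"


register = StrikeRegister()
# A single large rightsholder can escalate a channel to termination
# with just a few claims:
for video in ("report-1", "report-2", "report-3"):
    print(register.file_claim("StateTV", "exile-journalists", video))
```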
Even when no state action is involved, social media platforms could still have indirect fundamental rights obligations toward their users.
Polish NGO Panoptykon is already engaged in strategic litigation against Facebook over the arbitrary blocking of an NGO’s Facebook page. Recognising the increasing importance of social media for the exercise of freedom of expression online, the European Commission’s proposal for a Digital Services Act includes an obligation on online platforms to pay due regard to the fundamental rights of users when enforcing their private terms and conditions (Art. 12(2)).
This new provision could strengthen strategic litigation against the blocking of lawful expression by automated copyright filters employed by platforms on a voluntary basis. For this provision to be effective, it must be coupled with strong transparency obligations and provisions on collective redress that allow users’ rights organisations to take legal action against the structural overblocking of legal content by overzealous copyright filters.
Why awareness is key
A central outcome of the workshop was the realisation that all participants were aware of examples of copyright censorship in their own countries, but that those cases rarely gained international prominence.
In order to raise the awareness of policy-makers and the leadership of platform companies about the danger that copyright filters pose to freedom of expression, it is crucial to build stronger narratives around the frequent mistakes and abuses of upload filters.
Building on the experience of other excellent mapping efforts in the digital rights space, such as the Austrian privacy NGO noyb’s knowledge wiki GDPRhub and EFF’s Takedown Hall of Shame, participants envisioned a complementary mapping of copyright censorship cases in the EU.
Such a database could form the basis for an awareness-raising campaign and also make it easier for the victims of copyright censorship to connect with fundamental rights NGOs that may help them seek justice.
Felix Reda is a copyright expert and project lead at Gesellschaft für Freiheitsrechte (GFF).