Fighting for Free Speech Online: SIN vs Facebook

By Dorota Głowacka, 27th August 2021

A lawsuit from Polish civil-society organisation SIN is challenging Facebook’s opaque and arbitrary content moderation practices to ensure social media users are protected from private censorship and have procedural safeguards online.

Earlier this year, we explored a range of digital rights topics in video format as part of the Digital Freedom Fund’s Digital Rights Around the World Series. We are now revisiting some of those topics with updates from discussions that happened during our 2021 Strategy Meeting and additional developments since then.

Facebook and other online platforms have a huge influence over our online environment, and are increasingly behaving in ways that amount to private and arbitrary censorship. Their content moderation decisions are also often invisible, which makes it difficult for users to contest the removal of their content.

Our team at Panoptykon is supporting civil-society organisation SIN to fight back against Facebook, and force them to respect social media users’ rights to free speech and due process. 

SIN is a Polish non-profit organisation that provides drug education and support to drug users, warning about the harmful effects of psychoactive substances. SIN’s focus on harm reduction, rather than drug prohibition, is a drug-prevention strategy supported by institutions including the United Nations, the European Union, and the Red Cross. SIN relies on its social media activity to reach young people who might otherwise be hesitant to take advice from experts and teachers.

In 2018, Facebook removed SIN’s Facebook page and private Facebook group without warning, on the grounds that they violated Facebook’s “Community Standards.” Facebook gave SIN neither a reason for this action nor any meaningful way to contest the removal.

In January 2019, one of SIN’s accounts on Instagram, a Facebook subsidiary, was also removed. The loss of these accounts made it very difficult for SIN to communicate with their audience and conduct their online outreach work, undermining their ability to carry out their educational activities and other statutory tasks.

While SIN has since set up a new Facebook page, they have had to rebuild their entire community of users. Because Facebook never explained the original removal, SIN has operated under constant uncertainty, knowing that the new page could be taken down just as abruptly.

With the help of Panoptykon and the law firm Wardyński and Partners (working pro bono on this case), SIN sued Facebook in May 2019 to challenge this censorship. SIN want their page and accounts reinstated, along with a public apology from Facebook.

The case is still ongoing, but we have already had some success: the District Court in Warsaw temporarily prohibited Facebook from removing new posts, fan pages, and groups run by SIN on their current account. The court has also obliged Facebook to retain all of SIN’s removed content so that – if SIN wins the case on the merits – that content, along with followers and comments by other users, can be restored.

Facebook appealed this decision, but in May 2021 an appellate court affirmed Facebook’s obligation to allow SIN to continue publishing new posts without the risk of removal for the duration of the trial, and to store SIN’s original accounts and content. The decision is now final. Even though this interim measures ruling does not prejudge the final outcome of the case, it is an important first step towards holding the social media company accountable for excessive and opaque content removal practices.

We are also using this case to push for wider changes to content removal practices and regulations. At DFF’s 2021 strategy meeting, participants discussed similar cases of opaque or indiscriminate content removal, as well as successful legal challenges against them. Governments and regulators need to put power back in the hands of users by requiring platforms both to explain the reasoning behind specific content removal decisions and to provide a means to appeal them.

Facebook established its Oversight Board last year in an effort to give users transparency on, and an appeals process for, the company’s content moderation decisions. But the Oversight Board has faced criticism over its limited mandate and its legitimacy as a Facebook-funded, Facebook-appointed entity. This has even prompted a group of journalists, academics, and civil rights activists to form “The Real Facebook Oversight Board” to challenge Facebook’s practices.

We hope that the impact of our lawsuit against Facebook, and other lawsuits like it, will be felt by social media platforms and will prompt them to make their moderation policies more transparent and fair. In the long run, we also hope to contribute to regulatory changes that give users better control over what they see and share online.

Dorota Głowacka is a lawyer at the Panoptykon Foundation, specialising in human rights in the context of new technologies and responsible for coordinating strategic litigation activities.