The gig economy and “platformisation” of labour have become intertwined with core digital rights issues. How can the field best support workers on the ground, who are struggling against increased surveillance, data collection and algorithmic management?
At the Digital Freedom Fund’s online annual strategy meeting in February, DFF expanded its usual scope of attendees to also include communities outside of the digital rights field.
Activists working on civil and economic rights, as well as gig workers themselves, joined the various discussions, highlighting how technology now shapes the advocacy of a much broader range of organisations.
In a session on the gig economy and platform-worker relations which I had the opportunity and privilege to facilitate, our discussions focused on (a) algorithmic management and the increasing use of surveillance technologies on workers in the gig economy, (b) the need to engage with communities directly impacted by those technologies and to cultivate reflexivity within civil society, researchers and academics, and (c) building worker power through control over their own data.
The session took place just days before the UK Supreme Court ruling that upheld an earlier employment tribunal finding that, under UK law, Uber drivers are “limb (b)” workers rather than self-employed contractors.
In practice, this means that drivers will now have basic protections such as the minimum wage, holiday pay, pension contributions, protection against discrimination, union rights and whistleblower safeguards, but there is still no protection against unfair dismissal. Additionally, enforcement of and compliance with the ruling have yet to be determined.
In parallel, cases brought by gig workers exercising their General Data Protection Regulation (GDPR) rights against Uber and Ola were pending before the Amsterdam District Court.
The questions raised here include access to data and transparency in algorithmic management related to performance and “robo-firing”. Drivers want access to their data in order to check whether they are paid the minimum wage, prove an employment relationship, detect discrimination, support collective bargaining and advocacy, and, ultimately, create a data trust that gives them more power over their working conditions.
These cases highlight that as workplaces are becoming increasingly augmented by the use of surveillance technologies, it becomes pertinent to ensure that people are protected from harms caused by the use of these technologies. Some of the issues raised in the gig economy can also be a lesson on how we can begin to address some of the complexities of workplace surveillance in general.
This is all the more important as, in the recent draft EU Regulation on Artificial Intelligence, workplace protections are at risk of being watered down. Indeed, across many tech regulations, protection for workers tends to be downplayed or ignored.
In the ride-hailing and delivery sectors, there has been a particular increase in the use of surveillance technologies, including facial recognition technology for real-time ID checks.
The deployment of these technologies is often driven by regulators seeking to address road safety and security concerns.
For example, in the case of Uber UK, Uber’s commercial licence was reinstated by the regulator, Transport for London (TfL), in 2020, after the company promised to roll out facial recognition technology – enabled by Microsoft’s face-matching software – to carry out safety checks on drivers.
The licence had been cancelled because of issues related to fraud, insurance and safety. However, many drivers have experienced the rollout of these technologies as disproportionate, racist and discriminatory. When drivers – often people of colour – fail their facial recognition checks, they are reported to TfL and may lose their licence to drive. The App Drivers and Couriers Union (ADCU) has identified seven cases in which failed checks led to drivers losing their licences and their jobs.
In general, the algorithmic management and surveillance technologies deployed by gig companies on their workforces lack transparency and offer no form of redress, even though their dispatch systems decide who gets to work, how jobs are allocated, how wages are calculated, and how workers are deactivated or fired for assumed fraudulent activity.
In many cases, worker surveillance technologies have proven unreliable. The lack of transparency, however, often means that drivers have no idea why they have been deactivated. These technologies are frequently discriminatory and cause tangible harm to drivers, who usually do not understand the technology behind them and often lack any form of recourse.
While private tech companies and their use of surveillance technologies pose big risks for workers, state and government authorities have also been complicit in these problems.
In the UK, the majority of full-time Uber drivers belong to Black, Asian, or Minority Ethnic (BAME) communities, who often bear the brunt of racist and discriminatory actions inflicted by regulators and exacerbated by private companies that deploy these seemingly “neutral” technologies.
There is a history of TfL adopting discriminatory rules and procedures that primarily affect marginalised workforces. For instance, when drivers submitted a freedom of information request during the period when private hire drivers were demonstrating against the congestion charge in London (a charge that falls mainly on private hire drivers, who come overwhelmingly from BAME backgrounds), they found out that their social media accounts were being monitored.
This highlights the existence of institutional problems and the state’s inability to protect the most precarious workers. Regulators and other public bodies also often introduce new problems through technology-mediated systems.
A broader coalition
If the state is unable to protect its most precarious workers and citizens, how can civil society, academics, and researchers help?
At DFF’s strategy meeting, participants highlighted that digital rights organisations, strategic litigators, and researchers tend not to have access to the communities directly impacted by algorithmic management and surveillance technologies.
Much of the discussion during the session therefore focused on the question of how we as digital rights organisations, academics, or researchers understand class struggle. In this context, Yaseen Aslam, one of the lead claimants in the UK Supreme Court case, shared his and fellow drivers’ experiences of working for tech companies like Uber, and his collaboration with James Farrar of the App Drivers and Couriers Union (ADCU) and Workers Info Exchange (WIE).
Yaseen particularly emphasised the need to have perspectives beyond the digital rights and strategic litigation field, thus ensuring that those that have been directly impacted by the actions and business models of tech companies and the technologies they build are involved in the action.
Equally, participants stressed the need to cultivate greater reflexivity in our own efforts to help. Yaseen highlighted that civil society should focus on empowering direct action within affected communities and take care not to hijack or exploit those communities and movements.
Change needs to come from within those worker communities, and the value that civil society can provide is to build trust. This inevitably takes time, and should encourage us to think of ways to establish the channels and capacity to help when and where needed.
Empowerment through data
There is much potential for workers to harness their digital rights in order to address power asymmetries that arise from the increasing algorithmic power of tech companies over workers in the gig economy and beyond.
Workers Info Exchange, a non-profit that helps gig workers obtain, and gain insight into, the data collected about them by private tech companies, is one key organisation fighting to reclaim workers’ agency by building a data trust for workers.
With the recent judgments at the Amsterdam District Court on workers exercising their GDPR rights, there is now momentum to harness digital rights as avenues to explore legal challenges, as well as to use digital rights as a springboard for collective action and organising.
Activists, researchers and groups in the digital rights sphere are well placed to help sustain this momentum.
As digital rights organisations, we can take several steps to broaden our opportunities for collaboration. In particular, we should:
- gain a better understanding of how workers experience algorithmic control and related discrimination, and develop legal (and non-legal) practices to help address these issues;
- identify common ground related to the use of technologies between different sectors within the gig economy;
- centre the struggles and issues of affected communities by taking the time to invest in and build trust, so as to develop the capacity to empower direct action from the worker communities themselves;
- expand digital rights into other areas and connect with communities on the ground; and
- contribute to the efforts of grassroots organisations and unions working on data rights, and do a better job of explaining digital rights to workers in accessible ways and language.
Ultimately, the protection of workers’ rights should not solely fall upon members of already marginalised communities, who, at any rate, are often the proverbial canary in the coal mine.
While laws exist to protect (some) individuals, enforcement of many of these laws is failing. It is therefore important to think about the limits of the law, and about how the law can support existing efforts to build collective action.
Individuals and communities in the digital rights space have the knowledge, networks and skills to be part of building worker power in the gig economy. We should use these to collaborate with impacted communities and to protect them, and everyone else, from unequal power dynamics.
Jill Toh is a PhD candidate at the Institute for Information Law, University of Amsterdam.