Freedom of Information Law in the Age of Opaque Machines

By Divij Joshi, 7th May 2021

In the last few decades, landmark freedom of information or right to information (FOI / RTI) and public records laws around the world have radically transformed the role of the citizen.

The citizen has gone from a passive receiver of “official” statements to an active agent able to interrogate the claims and decisions of her government, by requesting files, records, and information, or by inspecting government archives.

However, developments in the era of “digital government” may be unravelling the progress made by FOI laws and movements.

Evidence from across the world indicates that the increasing use of automated decision-making systems to determine important matters of public policy, ranging from policing to urban planning, is encroaching upon FOI law’s mandate of transparency.

At DFF’s annual strategy meeting in March 2021, a group of concerned activists and academics, including myself, gathered to consider how strategic litigation and advocacy might advance the cause of freedom of information laws and the public values of transparency and accountability in automated decision-making.

So what should civil society be aware of regarding the impact of automated decisions on FOI laws, and what can it do to ensure the integrity of this tool of trust and transparency?

Part of the reason FOI laws are proving inadequate lies in the nature of the underlying technologies used in automated decision-making. Consider, for example, the complex machine learning (ML) systems that government agencies now deploy in functions ranging from policing (e.g. facial recognition systems) to the adjudication of welfare claims.

In many cases, the material documentation (i.e. what is recorded) of an ML system may not adequately capture how the system actually functions. Moreover, many of these systems possess an inherent opacity – their precise logics are unknowable even to the people who designed them. The opacity of contemporary algorithmic systems used by governments has been well documented, and poses a substantial barrier to FOI processes.
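To make this opacity concrete, consider a minimal sketch of my own (the dataset and classifier below are hypothetical stand-ins, not any system described in this piece). Even a designer with complete access to a trained model’s parameters is left with matrices of numbers rather than an account of its reasoning:

```python
# Illustrative sketch only: full access to a trained model's parameters
# does not amount to an explanation of its decisions.
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

# Hypothetical stand-in for, say, a welfare-claim triage classifier.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
model = MLPClassifier(hidden_layer_sizes=(16, 16), max_iter=500, random_state=0)
model.fit(X, y)

# "Full transparency": every learned parameter is available for inspection...
for i, layer in enumerate(model.coefs_):
    print(f"Layer {i} weights, shape {layer.shape}:")
    print(layer)

# ...yet these arrays of floating-point numbers do not say *why* any
# individual case was flagged. Disclosing the raw model is not, by itself,
# an account of its decision logic.
```

This is part of why a records request that yields source code or model files alone may still leave a requester no wiser about how decisions are actually made.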

Another reason is the manner in which these systems are being integrated into government agencies. Many of them are procured from private contractors, who zealously guard the very aspects crucial to understanding these ‘black box’ systems: the data used, the logic of the algorithms they rely upon, and the other routine calculations that go into automated decision-making.

Most FOI laws around the world contain exemptions for the protection of trade secrets or intellectual property, and governments and companies routinely rely upon these exemptions to deny requests for information and records. In the UK, for example, recent statistics show that four of the most important government departments fully granted only one in five FOI requests. Overall, of the “resolvable” requests received by all departments during the survey period, almost as many were rejected on the basis of an exemption or exception as were fully granted.

Moreover, digital governance infrastructures are often disaggregated, with different actors and agencies responsible for different elements of a technological system, which can itself obstruct information requests. FOI requests are often rejected because relevant documentation about an automated decision-making system was simply never produced by its designers, or because custody of the documents was never handed over to the agency responsible for using the technology.

Finally, FOI requests for information needed to audit and research automated decision-making systems are, in various circumstances, denied on grounds of defence or national security, or of privacy (which may be implicated when interrogating the data on which algorithms have been trained or on which they operate).

As automated decision-making extends further into public policy and administrative decision-making, it is imperative that processes of public transparency like FOI are not debilitated. This requires policymakers, technologists and FOI activists to design mechanisms ensuring that technologies integrated into public administration generate and share appropriate public documentation while balancing the countervailing interests of confidentiality, security and privacy.
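Purely as an illustration, here is one hypothetical shape such public documentation could take: a machine-readable record that states a system’s purpose, vendor and decision logic in plain language, describes (rather than releases) personal training data, and logs what has been withheld and on what grounds. The field names below are my own assumptions, not an existing standard:

```python
# Hypothetical sketch of a machine-readable public disclosure record for an
# automated decision-making system. Field names are illustrative assumptions.
from dataclasses import dataclass, field, asdict
import json

@dataclass
class SystemDisclosure:
    system_name: str
    operating_agency: str
    purpose: str
    vendor: str
    training_data_description: str  # data described, not released, to protect privacy
    decision_logic_summary: str     # plain-language account of how outputs are produced
    human_review_process: str
    redacted_fields: list = field(default_factory=list)  # items withheld, with stated grounds

# All values below are invented for illustration.
record = SystemDisclosure(
    system_name="Example Welfare Claims Triage",
    operating_agency="Example Benefits Agency",
    purpose="Prioritise welfare claims for manual review",
    vendor="Example Contractor Ltd",
    training_data_description="Historical claims 2015-2019; personal data excluded from disclosure",
    decision_logic_summary="Gradient-boosted model scoring claims by estimated error risk",
    human_review_process="All flagged claims reviewed by a caseworker before any decision",
    redacted_fields=["model weights (trade secret claim, under appeal)"],
)

print(json.dumps(asdict(record), indent=2))
```

A record along these lines would give requesters something meaningful to inspect and appeal against, while leaving genuinely protected material to be withheld explicitly and accountably rather than by default.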

Public agencies should be impelled to substantiate their grounds for refusing FOI requests through the administrative and legal appellate mechanisms available. Careful and strategic litigation efforts in particular, such as the one undertaken by the ePaństwo Foundation in Poland, could yield interpretations of FOI law that cover automated decisions and algorithmic systems, and set important precedents for other legal systems to follow.

Finally, there is a need for deeper collaboration between the communities of activists working on FOI and those working towards digital rights and responsible technology, particularly in order to use existing FOI laws and documentation mechanisms tactically, in specific contexts, in ways that can generate public transparency.

Divij Joshi is an independent lawyer and researcher exploring intersections of law, technology and society.