Fighting for Algorithmic Transparency in Spain
Civio is an independent, non-profit journalism organisation based in Madrid. We monitor public authorities, report to citizens, and promote true and effective transparency within public institutions.
After years of fighting for access to information and reporting to a general audience through our journalism, we have recently reached a new conclusion: requesting, accessing and explaining data to a general audience is no longer enough to oversee the public realm.
The use of secret algorithms by Spanish public institutions pushed us to take a step further. One example is BOSCO, software created by the Spanish Ministry for Ecological Transition to decide who is entitled to the so-called Bono Social de Electricidad, a discount on energy bills for at-risk citizens.
At first, Civio teamed up with the National Commission on Markets and Competition (CNMC) to create an app to ease the process of applying for this discount, since the complexity of the process and the lack of information were preventing disadvantaged groups from applying.
Fighting for Transparency
After dozens of calls from wrongly dismissed applicants, our team requested information about BOSCO, including its source code. From the documents shared with us, Civio found that the software was turning down eligible applicants. Unfortunately, the administration never replied to our request for BOSCO's code.
The government, and subsequently the Council of Transparency and Good Governance, denied Civio access to the code, arguing that sharing it would result in a copyright violation. However, under the Spanish Transparency Law and intellectual property law, work carried out by public administrations is not entitled to any copyright protection.
“Being ruled through a secret source code or algorithm should never be allowed in a democratic country under the rule of law”
Civio believes it is necessary to make this kind of code public for a simple reason – as our lawyer and trustee Javier de la Cueva puts it, “being ruled through a secret source code or algorithm should never be allowed in a democratic country under the rule of law”.
Software systems like BOSCO behave, in practice, like laws, and therefore we believe they should be public. Only then can civil society have the tools to effectively monitor decisions taken by our governments. This is the reason why we have appealed the refusal of the Transparency Council in court.
This is our first incursion into algorithmic transparency, and we are fully aware that it will take a long time. We believe BOSCO is a good case to start with because it is not as complex as machine learning 'black box' systems, whose biases may be very difficult to unpick and identify. This first battle will reveal the limits of the Spanish Access to Information Law and will allow us to prepare for more complex cases.
To get ready for the future, it is very useful for us to learn about cases others have fought in the past, including in other contexts. While we were aware of ProPublica's investigation into biased risk-assessment software in US courts, we also discovered many other instances at the Digital Freedom Fund and AI Now workshop in Berlin last year. This made it very clear to us that the problem of algorithmic accountability is vast in scope. It was particularly valuable to learn about cases in Europe, where the legal framework on privacy and data protection is closer to Spain's.
Fighting Obscure Algorithms
Given the relevance of the issue for the well-being of civil society, our goal is to start developing the technical, organisational and legal skills needed to assess and control the increasing use of automated systems in public administrations.
In the words of Maciej Cegłowski: “machine learning is like money laundering for bias. It’s a clean, mathematical apparatus that gives the status quo the aura of logical inevitability. The numbers don’t lie.”
…it is through algorithms that many decisions concerning the public are being taken
Algorithms, therefore, can be used to obscure major policy changes adopted by administrations or the political class, such as public policies that implement dramatic cuts in access to welfare services. So even as we continue to fight for transparency and our right to information, we should not ignore that many decisions concerning the public are now taken through algorithms.