DFF is piloting an idea for a workshop series. We want to help digital rights litigators and activists to unpack and deconstruct some of the technologies that are prominent in their work.
Our first virtual workshop will take place on 7 December 2021.
Technology – what it does, how it is made, how it works, whom it serves, whom it harms – is central to digital rights litigation.
By answering these technical questions, lawyers can begin to fully comprehend the kinds of individuals who can bring a case relating to the technology, the types of legal arguments they are entitled to make, whom they can hold responsible for the technology's impact, and the kinds of evidence they would need to support those arguments.
In short, if digital rights advocates are able to make sense of the underlying technology in their cases, they will be in a stronger position to assess and challenge its harmful effects on society.
However, for many of us lawyers, making sense of technology isn’t always easy. We often can’t help feeling that coming to grips with it requires, at best, skills we don’t possess and, at worst, a mindset different from the one we were trained to employ. It is a prime example of the “known unknowns”: something we know exists, but that we don’t really understand.
The perception that lawyers and technologists have a different mindset is worth interrogating. Back in law school, the students in my year had a joke that the legal concept of iudex non calculat (the principle that a judge should base their decision on the quality of the advocate’s legal argument, rather than the number of arguments brought forward) should be translated as “lawyers can’t do maths”. If this were true, it would create quite an obstacle between technology lawyers and the technology they are seeking to regulate or whose regulation they are trying to enforce. But is it actually true?
It is undoubtedly the case that law and technology operate in different spaces, cultures and contexts. Where law is about the things that you should or shouldn’t do, technology and code are about creating environments or infrastructures where you either can or cannot do something. As Lessig highlighted in his seminal book “Code 2.0”, they are both rule systems, but they work in different ways and are based on different assumptions.
Most importantly, while legal rules provide for a certain amount of discretion depending on the specific facts of a case – allowing lawyers to argue about the scope of that discretion and the circumstances under which it should be exercised – code is largely binary. As such, it can reinforce legal rules, but it can also obstruct their application.
This means that technology can enable people, companies and government organisations to do things that we lawyers think they should not be able to do. Similarly, it can prevent all of those actors from doing things that we feel they have the right to do.
For example, if an algorithm makes a decision on the basis of a person’s race or religion in a manner that ultimately affects their ability to access credit, that is clearly discriminatory and thus violates a host of legal provisions put in place specifically to protect their human rights. If YouTube’s Content ID system takes down a person’s videos because its algorithm has identified part of the video as infringing somebody else’s copyright, that system may overlook the fact that their use of that content is covered by copyright exceptions aimed at protecting free expression.
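For readers curious what this binary quality looks like in practice, here is a deliberately simplified, hypothetical sketch (not the actual Content ID system, whose workings are far more complex and not public): a takedown rule that fires on any match, with no way to express the contextual judgement a copyright exception requires.

```python
# Hypothetical sketch of a naive automated takedown rule.
# The function, names and data below are invented for illustration;
# the point is that the rule is binary: it sees only the match,
# never the purpose of the use.

def should_take_down(video_fingerprints, copyrighted_fingerprints):
    """Remove the video if any segment matches a copyrighted work."""
    return any(f in copyrighted_fingerprints for f in video_fingerprints)

# Fingerprints of known copyrighted clips (invented identifiers).
protected = {"film_clip_01", "film_clip_07", "film_clip_12"}

# A critique video quoting ten seconds of a film for commentary...
critique = {"film_clip_07"}
# ...and a full pirated upload...
pirated = {"film_clip_01", "film_clip_07", "film_clip_12"}

# ...are treated identically: both are taken down.
print(should_take_down(critique, protected))  # True
print(should_take_down(pirated, protected))   # True
```

Nothing in this rule can represent "the use is covered by a quotation or parody exception" — that judgement has to come from a human, which is exactly the oversight gap the paragraph above describes.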
So, all technology lawyers know that, compared to the law, technology and code are relatively blunt instruments, at least in their current form. Difficult decisions – like how resources are to be allocated, which data should be collected about individuals and what it should be used for, or what can be said online and what should be deleted because it violates the rights of others or undermines other legitimate aims – are often beyond the capability of technology and require human input or oversight. This is where strategic litigation comes in.
But in order to enable us to push for better, stronger and fairer oversight, we must first understand how the technology works. We cannot hide behind a perception that “we can’t do maths” or argue that understanding the technology is somebody else’s job. Rather we should perceive it as a language – one that we may not yet speak but are willing and able to learn.
To support our network in using the law to its best effect in cases involving technology, DFF will hold a half-day virtual workshop on 7 December 2021: the first, we hope, in a series of events designed to “demystify” a range of different technologies.
Each event will consist of a 2-hour training session, delivered by an expert in the relevant technology, followed by a 2-hour hands-on session in which participants can try out what they’ve learned in a practical “sandpit” setting.
This first workshop will focus on automated content moderation. In addition to providing participants with useful skills and information, it will also serve as a proof of concept for future events.
Participants in this event must work for an organisation working on digital rights within Council of Europe jurisdictions and should ideally be involved in or looking to work on strategic litigation concerning online content moderation.
If you are interested in joining us for this pilot event, please send an email to firstname.lastname@example.org with “DFF Understanding Tech Workshop” in the subject line and an outline of how you meet the above criteria.
Because this is a pilot workshop, we also expect participants to contribute feedback on its structure and to suggest other technologies for us to focus on in future workshops.
To be considered for the event, please email us by 30 November 2021. To keep the workshop interactive and participatory, places are limited and will be allocated on a first-come, first-served basis.
If you have any questions about the event or the feedback process, please feel free to contact us at the address above.
We hope to see you there in December!