Digital Rights are Charter Rights

DFF’s fourth Speaker Series explores concrete cases where the EU Charter of Fundamental Rights was used to protect digital rights at the national, cross-jurisdictional, or EU level.

Monday 18 Sept 2023, 17:00 – 18:20 CEST

In this session, we discussed how digital technologies are developed using race as a factor to systematically discriminate against and oppress individuals and populations. We highlighted the collective harm and structural inequality suffered, which unevenly distribute vulnerability, and explored how the EU Charter has been used to challenge these practices.

The conversation was guided by the following questions: 

1. What types of examples showcase the discrimination and oppression suffered by racialized individuals or groups through the use of digital technologies?

2. How does the State perpetuate these practices?

3. What challenges do racialized people face in countering the discriminatory effects ensuing from the use of digital technologies?

4. What is the impact on a collective level for racialized populations? What are the legal, judicial, and other challenges in demonstrating, preventing, or addressing systemic harm in these cases?

The conversation was moderated by Alexandra Giannopoulou, digiRISE project coordinator at DFF, and included an audience Q&A.

Dr. Nawal Mustafa

Legal officer at Public Interest Litigation Project (PILP)

Jill Toh

Ph.D. student at Institute for Information Law (IViR) and co-founder of the Racism and Technology Center 

Wednesday 31 May 2023, 17:00 – 18:20 CEST

In this session, we explored how the EU Charter right to non-discrimination can be (and has been) used to fight back against discriminatory e-proctoring systems.

The conversation was guided by the following questions: 

1. How do e-proctoring systems work, and how do they reinforce, exacerbate, and automate discriminatory practices?

2. What are the discriminatory elements in such algorithmic systems, and why is it difficult to prove the discrimination?

3. What challenges do AI systems pose to existing discrimination laws, judicial and other legal recourse processes, and independent actors such as National Human Rights Institutions?

4. What is the responsibility of the actors, in this case the university, taking the decision to implement AI systems?

The conversation was moderated by Alexandra Giannopoulou, digiRISE project manager at DFF, and included an audience Q&A.
Naomi Appleman

Ph.D. researcher at the University of Amsterdam and the co-founder of the Racism and Technology Center.

Robin Pocornie

Master’s student in Bioinformatics at VU Amsterdam.