Automated proctoring software: a threat to students’ privacy and IT security

By David Werdermann and Victoria Stillig, 15th June 2022

Infection risks during the pandemic have caused universities around the world to offer remote testing instead of carrying out exams on campus. To do this, they had to find ways to guarantee equal examination conditions. At first glance, automated proctoring software that relies on artificial intelligence (AI) algorithms to indicate potential cheating attempts in online exams seemed to offer a convenient and modern alternative. Unsurprisingly, a multitude of software solutions has gained popularity in educational institutions around the globe. However, as many of these systems rely on sensitive personal data that can usually only be processed under strict legal conditions (for example, under the GDPR), their use entails a high risk of violating students’ rights.

Criticism of the use of self-learning algorithms is well known. Since AI started to become commonplace in everyday life, it has continuously raised both enthusiasm and concern – automated proctoring is no exception. Enthusiasm over prospective efficiency gains in numerous fields of application is accompanied by concerns over the opacity of these systems and their reliance on the extensive collection and subsequent processing of personal data.

At the same time, the insufficient quality of data sets is one of the causes of the error-proneness of AI algorithms, which may result in the reproduction of biases and the perpetuation of discriminatory structures. This has already been acknowledged in several other areas, most notably in the detection and prosecution of crime, where the United Nations High Commissioner for Human Rights (UNHCHR) has noted that decisions of AI systems based on faulty data can lead to grave human rights violations. For example, an individual can be falsely suspected of having committed a crime – or of committing one in the future. Because of the intensity of the infringement in question, the use of AI for profiling and forecasting in law enforcement, national security, criminal justice, and border management is still considered particularly questionable.

But these downsides of AI systems may become manifest in any area in which they are implemented, especially when vulnerable groups are concerned. This holds true not only for highly sensitive, security-relevant fields of application. The case of automated proctoring demonstrates perfectly how AI is used as a matter of course for seemingly mundane tasks that concern large parts of the population.

Functions of proctoring software

Numerous companies provide automated proctoring applications, offering combinations of different functions. Which data is collected and how it is processed varies significantly depending on the software in use. What all remote automated proctoring software has in common is the use of AI technology to monitor students while they take an online exam, in order to help the supervisor detect cheating. Usually, the final decision on whether cheating was attempted is left to the person in charge. However, the software flags individuals whom the machine-learning algorithm identifies as suspects of attempted cheating; the probability of attempted cheating is calculated automatically from the students’ captured behavior.
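
To make this division of labor concrete, the following minimal sketch shows how such a flagging pipeline might look in principle. It is purely illustrative and not the code of any actual product: the event types, weights, and threshold are invented for the example.

# Illustrative sketch of an automated flagging pipeline. All event types,
# weights, and the threshold below are assumptions made for this example.
from dataclasses import dataclass, field

# Hypothetical per-event suspicion weights, standing in for the output of
# a trained model.
EVENT_WEIGHTS = {
    "gaze_away": 0.15,
    "second_face_detected": 0.40,
    "window_focus_lost": 0.25,
    "voices_detected": 0.30,
}

FLAG_THRESHOLD = 0.5  # assumed cut-off above which a student is flagged

@dataclass
class ExamSession:
    student_id: str
    events: list[str] = field(default_factory=list)  # captured behavior

def suspicion_score(session: ExamSession) -> float:
    """Aggregate event weights into a pseudo-probability of cheating."""
    return min(sum(EVENT_WEIGHTS.get(e, 0.0) for e in session.events), 1.0)

def flag_for_review(sessions: list[ExamSession]) -> list[tuple[str, float]]:
    """Return the students the algorithm flags as suspects; the final
    decision is deliberately left to the human supervisor."""
    return [(s.student_id, suspicion_score(s))
            for s in sessions
            if suspicion_score(s) >= FLAG_THRESHOLD]

The essential point of the sketch is the architecture: the algorithm produces only a score and a flag, while the decision about consequences remains with the person in charge.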

For instance, the software WISEflow uses Amazon’s face recognition algorithm to verify the identity of the student throughout the exam. At the beginning, a reference photo is taken and biometric values are extracted from it. During the exam, further photos are taken at irregular intervals, and their extracted biometric values are compared to those of the reference photo. The determined degree of resemblance serves as an indicator for the supervisor, who can access the photos after the exam and use them as a basis for decision-making.
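
The following sketch illustrates this compare-against-reference pattern. It is a toy model under stated assumptions: the embedding function is a simplistic placeholder (the actual extraction is performed by a proprietary service), and the similarity threshold is invented.

# Toy illustration of periodic identity verification against a reference
# photo. The embedding function and threshold are assumptions, not the
# vendor's actual algorithm.
import numpy as np

SIMILARITY_THRESHOLD = 0.8  # assumed value; real systems tune this

def extract_embedding(photo: np.ndarray) -> np.ndarray:
    """Stand-in for a learned face-embedding model: a normalized pixel
    vector. Real products use proprietary models instead."""
    v = photo.astype(float).ravel()
    return v / (np.linalg.norm(v) or 1.0)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Since the embeddings are normalized, the dot product is the
    # cosine similarity.
    return float(np.dot(a, b))

def verify_identity(reference: np.ndarray,
                    exam_photos: list[np.ndarray]) -> list[float]:
    """Compare each photo taken during the exam to the reference photo.
    Low scores are surfaced to the supervisor as an indicator only."""
    ref = extract_embedding(reference)
    return [cosine_similarity(ref, extract_embedding(p)) for p in exam_photos]

A score below the threshold does not end the exam; as described above, it merely marks the photos for the supervisor’s later review.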

Another software, Proctorio, enables customized automated proctoring: it offers video and audio recording for facial or gaze detection, screen and web traffic recording, as well as room recording and periodic desk scans. For Proctorio to collect this data, students must grant various permissions to its browser add-on. Thus, students are virtually obliged to install spyware on their personal devices. As an IT expert report commissioned by the Gesellschaft für Freiheitsrechte (GFF, Society for Civil Liberties) shows, this endangers students’ data security and privacy. And these risks are not purely hypothetical: in 2021, Sector 7, a group of ethical hackers, found a critical vulnerability in Proctorio that attackers could even have used to activate students’ cameras.
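
To give an idea of what “various permissions” means in practice, the sketch below lists, as a Python dict for consistency with the examples above, the kind of WebExtension permissions such an add-on typically has to request. The permission names are real, but their selection here is an assumption and not Proctorio’s actual manifest.

# Hypothetical permission set of a proctoring browser add-on (modeled on
# a WebExtension manifest); the selection is assumed for illustration.
PROCTORING_ADDON_PERMISSIONS = {
    "permissions": [
        "tabs",            # enumerate and monitor open tabs
        "webRequest",      # observe web traffic during the exam
        "desktopCapture",  # record the contents of the screen
        "<all_urls>",      # act on every site the browser visits
    ],
    # Camera and microphone access for video, audio, and room recording is
    # requested at runtime via getUserMedia(), not through the manifest.
}

Permissions of this breadth give the add-on visibility into essentially everything the browser does, which is why the characterization as spyware above is not far-fetched.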

Worldwide usage and responses

During the pandemic, automated proctoring has gained popularity at many universities but has provoked opposition at the same time. In the US, students at several universities have petitioned against different applications and expressed concerns about accessibility for students with disabilities and those who lack the necessary technology. Members of the US Senate have addressed a letter to the three leading proctoring enterprises, criticizing that people of color and students with disabilities are subject to discriminatory biases and that student privacy is disregarded. Unequal access to technology, especially for marginalized sections of society, also led the Supreme Court of India to determine a violation of students’ constitutional right to equality.

In Europe, action taken before courts and data protection authorities has led to a variety of outcomes. The Amsterdam District Court found the use of automated proctoring via the software Proctorio to be lawful based on the specific circumstances of the case, while the Italian Data Protection Authority considered the use of Respondus at Luigi Bocconi University to violate the General Data Protection Regulation (GDPR).

In the United Kingdom, proctoring software from the US company Pearson VUE was used for the bar exam in 2020. Following criticism from candidates and submissions from the Open Knowledge Justice Program, the Bar Standards Board (BSB) published an independent review. The report recommends that a data protection impact assessment be conducted for any subsequent remote proctored exam and that “the BSB should ensure candidates are aware of how their data will be processed and to ensure systems are GDPR compliant”.

In Germany, there is no uniform practice regarding remote exams but rather a wide range of solutions applied by universities. In some federal states, automated proctoring is prohibited or allowed only under strict conditions. Still, several universities decided to use automated proctoring software, thereby raising concerns from data protection authorities. To set a precedent against excessive surveillance, the GFF (Society for Civil Liberties) is planning strategic lawsuits against universities using proctoring software.

Legal problems and concerns

The use of automated proctoring software has a high potential to infringe human rights guaranteed by various international and national legal frameworks, such as the right to privacy and the principle of non-discrimination. In the European Union, the GDPR regulates data protection and privacy. Processing of personal data is lawful only if and to the extent that at least one of the legal grounds listed in Art 6 (1) GDPR applies.

According to Art 6 (1) lit. (a) GDPR, data processing is lawful if the data subject has given consent to the processing of his or her personal data for one or more specific purposes. However, according to Art 7 (4) GDPR, consent must be given freely. In light of Recitals 42 and 43 of the GDPR, free consent cannot be assumed if the affected person must expect negative consequences for refusing it. Although every situation needs to be assessed individually, the existence of a relationship of subordination indicates that consent is not free. The Italian Data Protection Authority has assumed that such a relationship of subordination exists between students and universities. On this basis, Art 6 (1) lit. (a) GDPR cannot be invoked as a legal ground for the processing of personal data by proctoring software.

Instead, a legal ground could be found in Art 6 (1) lit. (e) GDPR. According to this provision, the processing is lawful only if it is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller. Additionally, the data processing must be adequate, relevant, and limited to what is necessary for the purposes of processing (Art 5 (1) lit. (c) GDPR). Arguably, however, the use of automated proctoring software is not necessary, since there are less invasive measures to prevent cheating in online exams, such as open-book exams or non-automated proctoring via regular video conferencing software.

Additionally, where special categories of personal data such as biometric data are processed, the stricter requirements of Art 9 GDPR must be fulfilled. Whether biometric data is processed depends on the specific software and the modalities of its use. The Amsterdam District Court did not apply Art 9 GDPR in its case because Proctorio did not process biometric data for facial recognition. Regarding the facial recognition algorithm used by Respondus, however, which relies on the processing of biometric data, the Italian Data Protection Authority found that the requirements of Art 9 GDPR must be met. Thus, the legal basis for the processing needs to be “proportionate and guarantee specific and adequate measures that protect the rights and interests of the individual”. Such a legal basis may be found in university or federal state regulations. However, while automated proctoring in the field of higher education clearly serves a public interest, it is not apparent why that interest should be considered substantial, as required by Art 9 (2) lit. (g) GDPR. A substantial interest can be confirmed, for example, when the protection of public security is at stake. This can hardly be assumed in the case of a student who attempts to cheat in an exam. Even if the question of a legal basis is answered in the affirmative, the processing of special categories of personal data is at least not proportionate.

Finally, the use of proctoring software also raises questions of IT security. According to Art 5 (1) lit. (f) GDPR, appropriate security of personal data must be ensured, including protection against unauthorized or unlawful processing. If students are required to install proctoring software on their private computers, they necessarily also endanger the highly sensitive data stored on them. As the Proctorio vulnerability found by Sector 7 shows, proctoring apps always carry the risk of external attackers exploiting security flaws.

Outlook: Online exams after the pandemic

Universities will increasingly rely on online examinations in the future, even after the pandemic. This also brings advantages: it enables the integration of digital technology, and students can take part in exams no matter where they are located. However, students’ fundamental rights must not be undermined in this process. Universities are therefore called upon to think about alternatives to proctored examinations. Take-home exams or open-book exams are not only more privacy-friendly but often also more valuable from a didactic perspective. Seen in this light, the pandemic is also an opportunity to improve higher education.


David Werdermann works as a lawyer at the Gesellschaft für Freiheitsrechte (GFF, Society for Civil Liberties) on the topics of whistleblowing, surveillance and police. He coordinates a project funded by the Digital Freedom Fund on the use of proctoring software at universities.

Victoria Stillig studies law at the Humboldt University of Berlin. She was an intern at GFF in 2021.