Eye-tracking, facial recognition, video and audio surveillance — a new report on remote test-taking warns of abuse, bias, and privacy infringement of students taking exams at home.
The report, from the Surveillance Technology Oversight Project (STOP), a New York-based nonprofit advocacy group that works to end discriminatory surveillance, examines the increasingly expansive access that tech platforms have to students’ personal data and calls on educational institutions to stop using online monitors.
Remote proctoring programs are currently used at three-quarters of universities and K-12 schools in the United States. “With the transition to remote education, there has been this really rapid expanse of remote proctoring services that bring some of the worst forms of surveillance abuses into children’s bedrooms,” said Albert Fox Cahn, executive director of STOP. “And it’s really disturbing just how rapidly this technology is normalizing types of automated surveillance that would have once been unthinkable.”
Covid-19 school closures have affected over 850 million youth, half of the world’s student population, leaving students more reliant on education apps than ever before. More than 90% of countries have implemented some form of at-home learning policy, according to UNICEF. To administer exams remotely and prevent cheating, many educational institutions have relied on live online proctors in tandem with anti-cheating software.
Platforms such as ProctorU, Proctorio, ExamSoft, Examity, and Verificient each use a combination of eye-tracking, facial recognition, video and audio surveillance, and remote access to students’ computers to prevent academic dishonesty.
Remote access — used to monitor whether a test-taker copies and pastes or opens a new tab — is common to many online exam platforms. Though most of the software requires consent before it can access the student’s desktop, Fox Cahn says it’s not always clear to users that those permissions also grant access to local files, including private folders and photos. Proctors can also remotely track a student’s keystrokes and mouse clicks, leaving more data vulnerable to exploitation by third parties.
“It’s essentially allowing this school-mandated spyware to have complete control over your local computer,” said Fox Cahn.
The report, “Snooping Where We Sleep: The Invasiveness and Bias of Remote Proctoring,” points out that privacy policies are often opaque and do not clearly state how long personally identifiable information will be held. According to ExamSoft’s privacy policy: “ExamSoft retains personal data so long as we have an ongoing legitimate business need to retain it.” The company, which collects biometric data to “detect irregular behavior,” reserves the right to sell user data for marketing purposes.
Remote exam proctoring companies maintain they have safeguards in place to prevent misuse of information while ensuring the integrity of remote tests. ExamSoft, which serves approximately 2,000 programs across 33 countries, said in a statement that the company “takes privacy very seriously. We are trusted with delivering exams of all types and sizes, and we are humbled by that responsibility. Part of our commitment to all users – both on the faculty/exam-maker and student/exam-taker side – is to handle their information and data with care.”
In addition to privacy concerns, the report highlights the dangers of bias along race, disability, and class lines.
ProctorU uses artificial intelligence-enabled facial recognition — which has been found to be less accurate at reading images of dark-skinned students — to match a student’s face to the one on their ID card, creating an unfair obstacle to identity verification, the first step in taking an online test, according to the report.
Scott McFarland, ProctorU’s chief executive, responded that privacy is a priority at the company. “During Covid-19 especially, cheating and test misconduct has increased dramatically and represents a substantial threat to academia overall and specifically to online study,” he wrote in response to the allegations in the report. “There are numerous and recent independent academic studies that confirm this. And while we absolutely understand the unease that comes with using technology for the first time, and the stress that testing and remote schooling already carries, ensuring a fair and safe test environment is essential to fundamental equity of and access to education.”
The report also highlights how video surveillance discriminates against students with disabilities. Involuntary eye or body movements, whether due to a physical impairment or test anxiety, can be flagged as evidence of cheating.
For low-income students, especially those who live in crowded spaces or act as caretakers for family members, the stakes are even higher. In some cases, background noise is treated as an immediate indication of academic dishonesty.
“We’re effectively penalizing students for poverty,” Fox Cahn said. “If you’re a student who shares a smaller apartment with relatives, who doesn’t have a private bedroom in which to take this exam, simply having your family in the vicinity — simply sharing a home with them — could be something that gets marked as suspicious activity.”
Such activity is then flagged for review by a proctor or school administrator.
Typically, live remote proctors can see the test-taker, but the student is unable to see the proctor. This power imbalance can leave students feeling spied on in their own homes, a sentiment many have posted about on TikTok.
“There is a sense of blurred boundaries,” said Leah Plunkett, a faculty associate of Youth and Media at the Berkman Klein Center for Internet & Society at Harvard University, and a professor at the University of New Hampshire School of Law. “It’s just adding one more additional set of eyes to what has traditionally been a protected domestic space.”