Advocacy groups urge school administrators to ban eproctoring
A coalition of 19 advocacy groups is urging school administrators to ban the use of “eproctoring” apps over concerns that the systems are invasive and can be harmful to students.
Eproctoring software uses AI-powered systems to monitor students as they take tests, often through required webcam recordings and facial recognition technology.
In an open letter published Thursday, the groups liken the tracking software to “spyware” and argue it raises significant issues that perpetuate racism and ableism while failing to serve its purpose to prevent academic dishonesty.
“They also treat students as if they are guilty until proven innocent, which is a disrespectful and harmful stance for any academic institution to take,” the letter reads.
The 19 signatories include human rights and youth advocacy organizations such as Fight for the Future, Access Now, Encode Justice, Parents Together and Media Alliance.
The open letter follows a scorecard Fight for the Future launched in June detailing how prominent colleges and universities plan to use eproctoring in the upcoming school year. According to the scorecard, most of the schools surveyed said they are using or “might be” using such software.
The letter urges administrators in both higher education and K-12 schools to stop using the programs.
“Eproctoring technology is clearly not ready for prime time. Based on complaints FairTest has received from test-takers and teachers, the software often mislabels totally normal behaviors as ‘cheating.’ The false positive warnings appear to be more common for African American and LatinX students,” Bob Schaeffer, executive director at National Center for Fair & Open Testing (FairTest), said in a statement. “From both the equity and education quality perspectives, eproctoring products should not be used for administering either K-12 or college exams.”
Activists and students have long spoken out about problems with eproctoring, particularly the failure of its facial recognition software to detect nonwhite students.
Concerns about such programs were amplified over the past year as more students shifted to virtual schooling during the coronavirus pandemic.
The advocacy groups also write that the software is harmful to students with testing anxiety and students with disabilities.