The most controversial new tech tool for colleges since the start of the pandemic is automated proctoring, which aims to detect cheating on online exams by using algorithms that watch students via their webcam and look for suspicious patterns of behavior—often sending clips of questionable moments to professors for later review.
Just in the past few months, a law student sued an automated proctoring company, students have complained about the tools in student newspaper editorials, and professors have compared them to Big Brother.
Those complaints are on top of previous pushback that included petition campaigns that have drawn tens of thousands of student signatures against the approach, a statement by the University of Michigan at Dearborn that the institution would not use automated proctoring tools, and even a retreat by one proctoring company, ProctorU, which has decided not to sell software that uses algorithms to detect cheating—though it still sells services that employ remote human proctors to do the job.
Despite all that opposition and the fact that colleges are returning to in-person teaching, sales of proctoring software have been robust. A recent Educause study found that 63 percent of colleges and universities in the U.S. and Canada mention the use of remote proctoring on their websites.
And some analysts watching the tech space expect colleges to continue to sign up for the services to make them an option for professors to use.
“As far as I know business is holding up,” says Trace Urdan, a managing director at Tyton Partners, an investment banking and strategy consulting firm. “The story with a lot of edtech is that the pandemic catalyzed a lot of growth, and the adoption holds even once ground-based [teaching] goes back.”
One reason colleges are holding onto proctoring tools, Urdan adds, is that many colleges plan to expand their online course offerings even after campus activities return to normal. And the pandemic also saw rapid growth of another tech trend: students using websites to cheat on exams.
“There is a lot of concern in higher ed about Chegg and Course Hero,” Urdan says.
Officials for Chegg and Course Hero, for their part, argue that their services are not intended as cheating tools, and they point to acceptable use policies and other efforts that discourage cheating. But the companies’ marketing language promises struggling students easy answers, and many students say they have a reputation as cheating aids. Many professors, on the other hand, blame these companies for starting an arms race that created the market for automated proctoring in the first place.
Rethinking the Test
Those opposed to automated proctoring cite several objections.
Some say the systems often lead to false positives, add stress to the test-taking process and invade privacy. And darker skin tones can prove especially tricky for algorithms, raising equity concerns about the tech. Still others have pointed out that savvy students can find ways to get around the snooping software anyway.
The controversy has led some professors to advocate for designing assignments whose answers are harder to find online—like project-based work. And others have worked to protect academic integrity without using proctoring tools.
Professors at the University of Maryland at Baltimore County presented one such idea at the recent Educause tech conference in Philadelphia.
They used a feature of the Blackboard learning management system to randomize questions for an exam in an introductory chemistry course.
“We randomly put students into four groups,” says Tara Carpenter, a lecturer at UMBC who taught the course. “We used settings in Blackboard to say group 1 is going to start with [questions in] group A,” she adds, noting that they had four groups of questions and that questions in each group were delivered in random order.
“We were trying to do everything we could so that if two students sat down together thinking they were going to take the exam at the same time, it wouldn’t help them at all,” she adds.
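The scheme the instructors describe—randomly assigning each student to one of four groups, giving each group its own pool of questions, and delivering those questions in random order—was configured through Blackboard settings, but the underlying logic can be sketched in a few lines. This is a hypothetical illustration, not UMBC's actual configuration; the student ID, pool names, and function are invented for the example.

```python
import random

def build_exam(student_id, pools):
    """Return (group, shuffled questions) for one student.

    Seeding the RNG with the student ID makes the assignment
    deterministic: the same student always gets the same exam,
    while two students sitting together almost certainly don't.
    """
    rng = random.Random(student_id)
    group = rng.randrange(len(pools))   # random group assignment
    questions = list(pools[group])      # that group's question pool
    rng.shuffle(questions)              # random delivery order
    return group, questions

# Four illustrative question pools (groups A-D in the article's terms)
pools = [
    ["A1", "A2", "A3"],
    ["B1", "B2", "B3"],
    ["C1", "C2", "C3"],
    ["D1", "D2", "D3"],
]

group, exam = build_exam(student_id=42, pools=pools)
```

In practice a learning management system handles the bookkeeping, but the effect is the same: neighboring students see different questions in different orders, so copying a nearby screen yields little.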
Despite all these efforts, a few students did use Chegg to cheat, posting questions from the test to the site and having a paid expert give an answer (the site guarantees answers in half an hour, according to Carpenter).
“After every exam, we were checking Chegg to see if anyone posted,” she says, and when they found a couple, they filed a request with Chegg to unmask the identity of the students who posted the questions. “Getting the info from Chegg requires a waiting period,” she adds. But she says they could often figure out who the cheater was simply by seeing which question was posted at that time. “We often found out who the cheater was before Chegg got back to us.”
Most of the students who used Chegg to cheat did so out of “desperation” because they were not passing the class going into the final, says Sarah Bass, another UMBC lecturer who helped develop the randomized chemistry exam. She stresses that most students are honest, but that the instructors still want to make the process as fair as possible.
Carpenter agrees. “There’s a mindset of some faculty who think that the default is that students want to cheat,” she says. “In reality, it’s a very small fraction of students who intend to cheat based on my experience.”
The professors originally tried to use remote proctoring software, adopting a system made by Respondus that monitors students' activity and lets instructors lock down the browsers of remote students so they can't open other windows.
But they abandoned the approach when they discovered that many students could not use the software because it wasn’t compatible with Chromebooks. And some students complained about putting the software on their computers. “Students rightfully have their own concerns about having to download and use these software on their personal devices,” says Bass.
The professors decided it was worth the extra effort to avoid the proctoring software. “One of the things we’re pretty passionate about is equity for the students,” says Carpenter.
One question is whether other professors will make those efforts or choose the often easier answer of remote software.
At the University of Wisconsin at Madison, officials renewed their contract with an automated proctor provider, even after more than 2,000 people on campus signed a petition calling to ban the technology on campus. A university spokesman told the student newspaper that the number of professors using the tool has “drastically decreased” since the spring term.
Correction and clarification: An earlier version misstated a quote by Trace Urdan. He was referencing concerns about Chegg and Course Hero stated by others. He clarifies that while he acknowledges the concern, he believes the services “fill a market need for student support created by a corresponding inattention from institutions.”