Online video proctoring software raises ethical concerns that instructors can avoid by closely examining how they design online assessments, experts said during an online Educause panel on Wednesday.
Software using artificial intelligence to scan video for cheating behavior offered a quick way to ensure academic integrity during the coronavirus pandemic, but its use by thousands of students highlighted problems such as potential bias and threats to student privacy. As online and hybrid learning continues, academic leaders and instructors can incorporate appropriate use of video proctoring into a broader discussion on designing online assessments, panelists said during the event.
At the University of Utah, instructors felt compelled to monitor students taking tests online during the pandemic because it’s what they did in the classroom, Associate Vice President Deborah Keyek-Franssen said. Academic leaders at her institution are now working closely with faculty, she said, to determine whether students need to take a test at the same time and whether an assessment needs to be timed and monitored, such as when it serves as preparation for a professional exam setting.
“That [conversation is] really hard because it gets into course design and assessment design but when you do that, it also addresses some of the privacy and the bias concerns because you’re not turning the cameras on,” Keyek-Franssen said.
Opening up that dialogue can not only help navigate ethical issues associated with video proctoring, University of Miami Chief Academic Technology Officer Allan Gyorke said, but can support a better testing approach generally.
“One of the things that I recommend is you always should consider all authentic assessment first, as opposed to any kind of high-stakes exam because if I’m writing a paper or doing a presentation, I’m going to learn through that experience,” he said. “But if I’m just taking an exam or I’m just giving short answer, that’s just a recall check. It’s better to have students focus on something that they actually learn from experience.”
Proctoring tools can treat students unfairly when they lack reliable access to a device, the internet or a quiet testing space, Gyorke said. If a proctored exam requires video, but a student can only access Wi-Fi from their car outside a building, the software can’t account for that setting, he said during the panel.
“[Artificial intelligence] can’t, on the fly, give an accommodation, or they can’t say, ‘I can see that your laptop’s having some problems, why don’t we adjust it in this way,'” Gyorke said. “There’s no mercy. You give them a set of rules, and they will obey those rules efficiently.”
David Thomas, executive director of online programs for the University of Colorado, said higher education institutions need to offer more support and resources for faculty and staff so they have the time to carefully consider how they are testing students online. That could include creating working groups for online learning or employing teachers’ assistants to free up time, he said.
“We keep saying, hey, we’re administrators, we’re going to buy you this software and that’s how we’re going to solve the problem,” Thomas said. “That’s not the solution. The solution is to bring other resources in addition to technology to the table.”
Colleges and universities are now reassessing the tools they’re using to monitor cheating in online courses as some students return to in-person or hybrid classes. The University of Wisconsin-Madison renewed a contract with the software company Honorlock for another year, the Wisconsin State Journal reported this month. A university spokesperson told the Journal that despite criticism, the software offers features that can be helpful in online and in-person settings. Ohio University, meanwhile, is partnering with its Center for Consumer Affairs to gauge faculty and student attitudes toward anti-plagiarism software for written content, hoping to refresh its recommended software portfolio.