Promoted as a technology to create a safer and more secure world, facial-recognition software, which can identify a person from a digital image or a video frame, is beginning to pop up at colleges across the country. But the digital rights advocacy group Fight for the Future hopes to pressure schools to scrap the technology with a new campaign, announced Tuesday, aimed at informing people of the potential harm facial recognition can inflict on students.
“We see facial recognition as sitting on a short list of technologies, like nuclear or biological weapons, where the harm that it poses, the threat that it poses, to human society and liberty broadly outweighs any of the potential benefits,” said Evan Greer, deputy director of Fight for the Future.
Step one of FFTF’s plan to stop the implementation of facial recognition on college campuses is to figure out exactly which institutions are using the technology, Greer said. The group says it will contact 40 major university administrations, including Stanford, Harvard, and Northwestern, to clarify whether they are using this technology.
Students are also being urged to sign petitions and introduce resolutions in their student government associations that call for campus-wide bans of facial recognition and increased transparency from administrators regarding campus surveillance.
Institutions that have implemented these high-tech surveillance systems say that facial recognition is an effective tool in keeping campus communities safe.
“But the reality is just that a lot of their claims are just simply not backed up by evidence,” Greer said.
Facial recognition on college campuses could provide school administrators, and potentially law enforcement and government agencies, not only with sensitive biometric data on students and faculty, but also with personal information about individuals’ behaviors and beliefs.
“I think if there is one thing that we have learned over the last decade, it’s that technology is profoundly powerful. It can democratize our society, it can give more people a voice than ever before, but it can also be used to trample our rights at a mass scale and violate people’s humanity in ways that we didn’t even know was possible a decade ago,” Greer said. “We see it as a real threat to academic freedom, to the ability to have robust and important academic inquiry and debate on our college campuses, and also something that actually puts students in danger.”
In addition to the potentially harmful effects that pervasive surveillance could have on students’ free speech and expression, Fight for the Future is also voicing concern over the way biometric data is secured.
“Information, once it’s collected, is vulnerable,” Greer said. “We’ve seen even the U.S. government unable to protect biometric databases,” she said, referencing a U.S. Customs and Border Protection data breach that compromised an unspecified number of images of travelers and license plates in June.
And college databases are just as, if not more, vulnerable.
“[Students] are relying on or trusting their administrations to protect this hugely sensitive data … and we just know that not only are college campuses not particularly good at collecting data, but neither are some of the large corporations that they rely on,” Greer said.
Additionally, the current technology used in facial-recognition systems can lead to further discrimination against minority groups. In a study of some of the world’s top facial-recognition algorithms, the National Institute of Standards and Technology found that while the highest accuracy rates were generally seen in identifying middle-aged white men, Asian and African American people were misidentified as much as 100 times more often.
Greer said she thinks the campaign against facial recognition will succeed in grabbing the attention of college administrators, getting them to listen to the concerns of students, faculty, and civil rights groups, and effecting meaningful change.
“I think we’re at a pivotal moment as a society, and certainly as an academic community, where we need to make some serious decisions about what types of technology we allow and how we use it,” Greer said. “We need to approach these types of technologies, not just as a purchasing decision, but as an ethical decision, as a decision that impacts the fabric of our communities.”
This story was updated after publication to remove reference to a university that no longer runs a facial-recognition program.