Data privacy advocates warn of school surveillance technology shortcomings
Student data privacy experts this week urged school leaders to be wary of edtech companies using “deceptive marketing practices” while pitching surveillance technologies, particularly those that promise to improve school safety and student well-being.
With growing concerns around the rise of school shootings and student suicide, many schools have turned to technology in search of solutions. But Chad Marlow, a senior policy counsel at the American Civil Liberties Union who focuses on privacy, surveillance and technology, urged school leaders not to let fear drive their decision-making.
“As hard as it is to set aside your fear, do that,” Marlow said while speaking at the South by Southwest EDU conference in Austin, Texas, on Monday. “Rely on data, proven facts. The sort of things that schools teach their kids to do when they’re doing research and writing reports, you should do yourself.”
Marlow admitted that even he finds it difficult not to be swayed by edtech surveillance marketing that casts school shootings as a high risk for schools. Although the annual number of school shootings in the United States reached an all-time high of 306 by November 2023, risk communication consultant David Ropeik found that the likelihood of a K-12 student being shot and killed at school on any given day was about one in 614 million.
“That is twice as unlikely as winning the Powerball or Mega Millions jackpot,” Marlow said. “I understand the fear; it is a worst-case scenario. But the marketing that suggests your school is a moment away from one of these things happening is not exactly contextualized in the most accurate way.”
Marlow said schools usually adopt these technologies with good intentions, but deploying them can have unintended consequences that harm marginalized groups. Content monitoring and filtering technologies can lack nuance and miss the context of what students are searching for on school-issued devices.
“When we have these situations, like a kid who was searching for information about LGBTQ+ topics getting flagged and sent to the principal’s office, we should look at that not as an example of justice but as an example of technology working badly,” said Meredith Broussard, a data journalist and research director at the New York University Alliance for Public Interest Technology.
Marlow said students with disabilities can be flagged for aggressive behavior by surveillance cameras and aggression-detection technology designed to flag behavior that deviates from the norm.
“What that means is that completely normal behaviors for a disabled or neurodivergent student are likely to be flagged as problematic,” Marlow said. “For example, a student with ADHD may look around or fidget more during remote learning or test taking. That doesn’t mean they aren’t paying attention or that they’re cheating, but these programs are going to flag them for doing that.”
Many speakers at the conference argued that human problems require human solutions — not technology. Marlow said that school shootings, bombings and student suicides have been prevented by concerned students reporting warning signs to school officials. Amelia Vance, president of the Public Interest Privacy Center, agreed that building trust between students and adults in schools is the best line of defense.
“Honestly, school administrators need to invest more in real solutions, instead of apps, like hiring a mental health therapist who’s actually licensed,” said Shreya Sampath, a freshman at George Washington University and chapter projects director at Encode Justice.