New York school district pauses on facial recognition system



A school district in upstate New York that had planned on Monday to begin operating a recently installed facial recognition system will now postpone testing at the request of state officials concerned about student privacy.

Lockport City School District, a K-12 district of eight schools with about 4,400 students, announced on its website last week that it had finished installing video cameras backed by facial recognition software called Aegis, which is sold by a Canadian firm called SN Technologies.

The district cited dismay at the school shootings reported in recent years and a need for “preventative measures.” It spent $3.8 million of a $4.2 million Smart Schools Bond Act state grant on security upgrades, including the facial recognition system along with physical security enhancements to building entrances, such as bullet-proof greeter windows, panic buttons, a new visitor badging system and computer equipment. The district also hired new counselors, social workers and behavior intervention specialists.

Though the district claims on its website that the facial recognition system — one of the first deployed in a K-12 school district in the United States — was approved by the New York State Education Department in November 2017, the department is now investigating the district’s privacy policies. Meanwhile, the district has agreed not to use the facial recognition component of the system, following a recommendation from the education department.

In an email to EdScoop, the education department said it has not concluded the district has the necessary framework in place to protect student privacy or to properly secure any data collected. The department says it’s now finalizing privacy and security regulations the district would be able to adopt.

According to the district’s website, the facial recognition system works by comparing faces and objects detected in video footage against a database of flagged items, such as guns, and unwelcome visitors. Those people include level 2 or level 3 sex offenders, students currently suspended from school, anyone previously barred from school property, or anyone otherwise deemed a threat to school safety “based on credible information,” according to the district.

The district says that when the system generates a match, it’s forwarded to an administrator, who makes the final call on how the school should respond. If the system detects a gun, police are notified automatically and the school is placed in lockdown.
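The alert flow the district describes can be sketched as a simple routing rule. This is a hypothetical illustration only; the class, category, and action names below are assumptions for clarity, not details of the actual Aegis software.

```python
# Hypothetical sketch of the alert routing the district describes.
# Names ("gun", "flagged_person", action strings) are illustrative
# assumptions, not part of the real Aegis system.
from dataclasses import dataclass


@dataclass
class Match:
    category: str      # what the system believes it detected
    confidence: float  # similarity score from the recognition model


def route_alert(match: Match) -> list[str]:
    """Return the response actions for a system-generated match."""
    if match.category == "gun":
        # Per the district, weapon detections skip human review:
        # police are notified and the school is locked down.
        return ["notify_police", "lockdown"]
    # All other matches go to an administrator, who makes the
    # final call on how the school should respond.
    return ["forward_to_administrator"]
```

The key design point reported by the district is the asymmetry: only weapon detections trigger an automatic response, while matches against the person database are always mediated by a human administrator.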

“Early detection of a threat to our schools allows for a quicker and more effective response,” the district’s website reads.

The district says it’s confident in its privacy policies, noting on its website that the Aegis system complies “with all applicable privacy laws.”

“The information databases are secured and accessible by trained administrators with security privileges. All management of alerts is maintained by trained District employees. The database is periodically audited and updated to ensure its accuracy,” the district website reads.

But local activists and the New York Civil Liberties Union have remained concerned with the technology’s use in schools since the district held a single meeting last August to inform the public of its planned security upgrades. The district’s sparse public engagement elicited censure from the civil liberties group, which said the district had “barreled ahead” with the “invasive” technology without first engaging community stakeholders in a more systematic way and exploring the broad concerns many researchers have with the technology’s application in public spaces.

Jim Shultz, a local resident and father of one of the district’s students, started a petition last year that was signed by about 100 other residents, calling for the district to answer detailed questions about the security and privacy implications of the system before proceeding with installation. The New York Civil Liberties Union sent a letter last summer to the district and the state’s education department making a similar request for more information on what policies would govern the technology’s use.

“Schools should be safe places for students to learn, not spaces where they are constantly surveilled,” the NYCLU wrote.

The group raised concerns that the information might be shared with law enforcement agencies or immigration authorities and pointed out that facial recognition systems are notoriously inaccurate when it comes to identifying women, children or non-whites.

In addition to claims that the district abdicated its public-notice responsibilities as delineated by the Smart Schools Bond Act, the NYCLU also claimed the district’s decision to move ahead with the facial recognition system followed encouragement from a consultant — Tony Olivo with Corporate Screening and Investigative Group — whose company, the NYCLU says, holds a licensing agreement for the Aegis software installed at the schools. CSI Group did not respond to a request for comment.

The NYCLU’s lengthy list of demands also calls for the district to train its staff on how to use the system, particularly in cases where the software makes a false identification. For now, privacy advocates appear placated, but the conflict may resume if the district is allowed to use its facial recognition system after adopting the state’s recommended privacy framework.

“We are glad to see that NYSED is reining in the district, but it is crystal clear that the state must step in and ensure that inaccurate, biased, and potentially dangerous technology is not imposed on students, teachers, and parents without due consideration of its effects,” NYCLU education counsel Stefanie Coyle said in a statement. “This technology does not belong in schools.”
