Schools are using facial recognition to try to stop shootings. Here’s why they should think twice.

Facial recognition is just one of several AI-powered security tools showing up at schools.

For years, the Denver public school system worked with Video Insight, a Houston-based video management software company that centralized the storage of video footage used across its campuses. So when Panasonic acquired Video Insight, school officials simply transferred the job of updating and expanding their security system to the Japanese electronics giant. That meant new digital HD cameras and access to more powerful analytics software, including Panasonic’s facial recognition, a tool the public school system’s safety department is now exploring.

Denver, where some activists are pushing for a ban on government use of facial recognition, is not alone. Mass shootings have put school administrators across the country on edge, and they’re understandably looking at anything that might prevent another tragedy.

Safety concerns have led some schools to consider artificial intelligence-enabled tools, including facial recognition software; AI that can scan video feeds for signs of brandished weapons; and even analytics tools that warn when there’s been suspicious movement in a usually empty hallway. Recode has identified about 20 companies that have sold or have expressed interest in selling such technology to educational institutions.

On its face, facial recognition seems like it might help keep kids safe; in a promotional video by Panasonic, a Denver public school official argues that the company’s AI could be used to prevent potentially dangerous people — like students expelled for bringing a weapon to school or someone barred by a restraining order — from entering a school campus (though the district has not yet implemented the tool). Most schools appear to be thinking about facial recognition as a way to regulate entry onto a campus, creating databases of people who have previously been flagged.

But facial recognition and similar software have also been suggested for more routine tasks at school, like taking attendance and investigating code of conduct violations. Critics add that it’s not apparent this software works as advertised, and that, with relatively few trials in schools, there’s no real guarantee it will actually make students safer.

 RJ Sangosti/The Denver Post via Getty Images
Students at Prairie View High School, in Henderson, Colorado, crowd the hallways between classes on January 15, 2014.

High-tech security software could make students feel policed and surveilled, and research has already demonstrated that facial recognition can be inaccurate, especially for people of color and women, as well as other groups. (Those findings were confirmed by a National Institute of Standards and Technology report released Thursday.) Meanwhile, legislation explicitly regulating the use of these tools remains scant, and some critics worry that the sensitive data that facial recognition systems create could ultimately be shared with law enforcement, or a federal agency such as Immigration and Customs Enforcement (ICE).

“Facial recognition is biased, broken, and it gets it wrong. It’s going to put a lot of students in danger, especially students of color,” warns Albert Fox Cahn, the executive director and founder of a legal nonprofit called the Surveillance Technology Oversight Project. “We know that this technology will get it wrong quite a bit, and we also have no evidence to show that it has any public safety benefit whatsoever, especially in the grandiose scenarios that proponents put forward.”

Companies market facial recognition as a safety tool

Here’s how the tech could work in a school setting. Facial recognition technology compares images or video of people entering or moving within a school building against a database of already-known individuals, near-instantly confirming their identity, usually for the purpose of alerting security staff or automatically admitting someone into an area.

That database could include a school’s current staff and parents who have been approved to enter a school; it might also include particular individuals a school does not want on its premises, such as expelled students, former employees, registered sex offenders (or those listed on other court-administered databases), or other people school officials might decide to deem suspicious (and have an image of).

That means that to use these tools to preemptively stop a violent event, school staff would have to already know that a person was potentially dangerous and unwelcome on campus — and flag them in the system.
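For illustration, here’s a minimal sketch of how this kind of watchlist matching generally works, not any vendor’s actual implementation: the system reduces a face seen at an entrance to a numeric “embedding” and compares it against stored embeddings of flagged individuals. The embed_face function below is a hypothetical stand-in for a proprietary model.

```python
import numpy as np

def embed_face(face_image: np.ndarray) -> np.ndarray:
    # Hypothetical stand-in for a vendor's face-embedding model,
    # which maps a cropped face image to a fixed-length vector.
    raise NotImplementedError("placeholder for a proprietary model")

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_against_watchlist(face_image, watchlist, threshold=0.8):
    """Return the flagged identity whose stored embedding best matches
    the face at the door, or None if nothing clears the threshold.

    watchlist: dict mapping a name or ID to a precomputed embedding.
    threshold: similarity cutoff; real systems tune this to trade off
    false alarms against missed matches.
    """
    embedding = embed_face(face_image)
    best_name, best_score = None, threshold
    for name, stored in watchlist.items():
        score = cosine_similarity(embedding, stored)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name  # a hit would still go to security staff for review
```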

It’s important to note that school shooters often haven’t been previously banned by school staff. Systems like these, though, could theoretically allow a school official to flag a student for any reason, or no reason at all (regulation of these tools isn’t clear; more on that later).

Schools appear to be more apprehensive about entering sensitive data on the entire student body into a facial recognition database than about flagging expelled students or other people deemed threats. That’s part of why, when the Lockport School District in New York announced that it would install a facial recognition tool, the state’s department of education called for the plan to be put on hold. The school district and state education officials have since been going back and forth over rules for implementing the system to ensure the database will only be used on flagged non-students, not on students themselves.

Ultimately, we don’t know how many schools have made use of a facial recognition-based tool (Wired found eight public school systems), but it’s not clear that any system deployed at a school has yet stopped a violent event. Mike Vance, a senior director of product management at RealNetworks — which currently provides schools with facial recognition software for free or at a discounted rate — said he’s aware of one school that set up an alert on a person who had expressed general plans for a school shooting in the surrounding area (that person didn’t show).

Another school, Vance said, set up an alert on a student who the school administrators had reason to believe would threaten the school’s safety (that student didn’t show, either). He added that, generally, RealNetworks’ system is not used for recognizing students.

While Vance says that RealNetworks directs schools to its guidelines for best practices, he emphasizes that the company can’t control, and can’t access or see, how schools are actually using the tool.

Importantly, there’s no one type of company that might bring facial recognition to a school. While some systems require installing new, higher-quality surveillance cameras, others can function as a software extension of a school’s existing security infrastructure. And the companies selling these systems don’t always produce the facial recognition software themselves. For instance, the Oklahoma City-based security firm TriCorps has deployed a Panasonic facial recognition tool, called FacePro, at a school in Missouri. Panasonic also appears to offer facial recognition to schools directly, as in the case of Denver Public Schools. (Panasonic did not respond to a request for comment.)

“Facial recognition is becoming more broadly available and often as a new function in already established CCTV/[s]ecurity products,” explained Daniel Schwarz, a privacy and technology strategist at the New York Civil Liberties Union, in an email to Recode. “School districts could unwittingly purchase face surveillance tools without even knowing about it.”

He said the NYCLU had spoken to at least one school district that bought a biometric tool but “wasn’t aware of the functionality.”

Beware of “mission creep”

Facial recognition could do more than notify officials when people suspected to be dangerous enter schools. For school officials, that might seem like more bang for their buck, but critics worry that excessive use of the tool could turn into surveillance of students. “We don’t have a single example of a costly and invasive surveillance tool that’s deployed that’s only used for the thing we’re told it will be,” Cahn said.

Mike Vance, RealNetworks’s senior director of product management, says that schools are using facial recognition to preemptively enforce child custody agreements. He gave examples of schools that have set alerts in their facial recognition systems on birth parents who have been barred by court order — or other legal processes — from making contact with a child. (He’s not aware of any cases in which a school has caught a parent this way.) Wired reported that a facial recognition system was even used to check whether a student believed to have run away from home had shown up at school.

There’s particular worry that facial recognition tools could be used to police and investigate student behavior. The superintendent of one New York school district considering the technology floated the idea of using it to enforce codes of conduct, according to the Buffalo News. That’s concerning to critics, who point out that facial recognition can be especially inaccurate when applied to people of color, and to women with darker skin in particular, and could worsen the school-to-prison pipeline.

“[F]acial recognition technology will necessarily mean Black and brown students, who are already more likely to be punished for perceived misbehavior, are more commonly misidentified, reinforcing the criminalization of Black and brown people,” wrote NYCLU organizer Toni Smith-Thompson last year. “That will happen even as facial recognition algorithms get better at correctly recognizing people’s faces.”

 Joshua Lott for the Washington Post
Students walk through the hallway after classes were dismissed at Senn High School in Chicago on May 10, 2017.

Facial recognition is already being used to take attendance, an application that would presumably require a database of identifiable information on every student at a school. In the US, at least one company, Face Six, sells attendance-taking facial recognition to educational institutions. The technology is in about two dozen educational institutions in the US and elsewhere, a number its CEO says reflects a “mix” of private and public schools, as well as universities.

Facial recognition-as-attendance is also popping up abroad, including in China (though its use there may be curbed). Orahi, a startup that works in India, appears to be using Amazon’s controversial facial recognition tool Rekognition to automatically take the roll on school buses and in schools. (Amazon did not respond to a request for comment on whether it has sold Rekognition to other startups that work with students or at schools.) In Sweden earlier this year, a municipality was fined after a local school tested using facial recognition to track student attendance, in violation of the General Data Protection Regulation (GDPR); a similar tool in Australia also sparked backlash.

Facial recognition isn’t the only AI-based security tool schools are using

Another increasingly popular application of AI is weapon detection. The idea is to train AI on images of weapons (like a handgun or an assault rifle), and then alert school staff anytime a corresponding item is recognized in a security video feed. “At a very simple level, we are going out and sourcing images and videos of guns [and] of guns being pulled in a variety of scenarios, [and] different types of weapons, like knives, guns, and rifles. And we’re just collecting as many data points as we can about what a gun looks like or what a weapon looks [like],” explains Trueface CEO Shaun Moore.

Moore says he isn’t aware of a violent event that his software has stopped yet, though he emphasizes that it’s early. But the technology is growing more widespread. Another company, Actuate, says its system is in use at “almost a dozen private schools and school districts.” ZeroEyes, another gun-detection service, says its tool is being used at eight locations and is closing contracts with 30 more, most of which are schools.

“The way to think about how this type of AI works is that it can recognize the shape of a gun in the same way that a human can, but it can’t understand the context,” explained Actuate’s chief product officer and cofounder, Ben Ziomek, in an email. “If an object looks like a weapon to a human in a few frames, our system will mark it as a weapon.” The system could theoretically flag prop firearms used for a school play, or certain replica toy weapons.
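As a rough sketch of that frame-based logic (illustrative assumptions only, not Actuate’s actual system), a detector scores each frame, and an alert fires only once an object has looked like a weapon in several recent frames; detect_weapons and notify_security are hypothetical functions.

```python
from collections import deque

WINDOW = 10        # how many recent frames to consider
MIN_HITS = 3       # "a few frames" before alerting
CONFIDENCE = 0.9   # per-frame detector confidence cutoff (assumed value)

def monitor_feed(frames, detect_weapons, notify_security):
    """Flag a weapon only after several confident detections in a
    sliding window of recent frames, to cut down on one-off false
    positives. Both callbacks are hypothetical stand-ins."""
    recent_hits = deque(maxlen=WINDOW)
    for frame in frames:
        detections = detect_weapons(frame)  # e.g. [(label, confidence, box), ...]
        hit = any(conf >= CONFIDENCE for _, conf, _ in detections)
        recent_hits.append(hit)
        if sum(recent_hits) >= MIN_HITS:
            # The system only flags; on-site staff judge the context
            # (say, a prop gun for a school play) before involving police.
            notify_security(frame, detections)
            recent_hits.clear()  # avoid re-alerting on the same event
```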

But while the technology is sometimes sold in conjunction with facial recognition, it comes with risks of its own. SN Technologies, which is offering weapons detection in addition to facial recognition to Lockport public schools, said that during one test its system falsely flagged a walkie-talkie pointed like a handgun. Several of the companies admit that their systems could produce false positives — while also claiming high accuracy rates — and emphasize that school security staff are responsible for checking that the software has flagged a real weapon.

“We just want to help that person make that decision faster,” Moore said. “It’s very difficult to monitor that many camera feeds in real time.”

“[W]e’ve had incidents where students were brandishing mock weapons used for a school play,” explained Ziomek. “An off-site team would have called law enforcement because the weapon looked completely real, but the security staff on-site knew the context of the situation and gave the students a firm talking-to rather than calling the police.”

But critics say these systems shouldn’t necessarily be trusted. “When you add on all of the visual noise of being in a school with hundreds of people moving around — and all these things in motion — and no static background, there are a lot of different everyday objects that will end up setting it off,” Cahn said.

There are also other, AI-based tools that schools can purchase, like a “self-learning analytics” feature sold by the Canadian security-technology firm Avigilon. The company explains that its AI studies video feeds collected by cameras throughout a school and learns normal patterns of traffic. That means it can flag unusual activity — like a lot of motion in a hallway at a time when it’s normally deserted.
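Avigilon doesn’t disclose how its analytics work, but a minimal sketch of the general idea, assuming simple per-camera, per-hour motion statistics rather than the company’s actual method, might look like this:

```python
import numpy as np

class MotionBaseline:
    """Learn typical motion levels per camera and hour, and flag
    large deviations, such as lots of movement in a hallway that is
    normally deserted at that time."""

    def __init__(self, z_threshold=3.0, min_history=30):
        self.history = {}  # (camera_id, hour) -> list of motion scores
        self.z_threshold = z_threshold
        self.min_history = min_history

    def observe(self, camera_id, hour, motion_score):
        self.history.setdefault((camera_id, hour), []).append(motion_score)

    def is_unusual(self, camera_id, hour, motion_score):
        past = self.history.get((camera_id, hour), [])
        if len(past) < self.min_history:
            return False  # not enough history to judge yet
        mean, std = np.mean(past), np.std(past)
        if std == 0:
            return motion_score > mean
        return (motion_score - mean) / std > self.z_threshold
```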

The company also sells an “appearance search” feature, which is on track to be used by Florida’s Broward County School District, including at Marjory Stoneman Douglas High School (it’s already in other schools). For instance, a school safety official could observe video of a person in a classroom at 2 pm, then search to see where else that person has appeared on the school’s video feeds, based on characteristics of their face, their clothing, and gender, among other factors.
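In outline, an appearance search amounts to indexing every detected person with a descriptor vector and ranking stored detections by similarity to a query. The sketch below assumes a hypothetical describe_person feature extractor in place of Avigilon’s proprietary one.

```python
import numpy as np

detections = []  # list of (camera_id, timestamp, descriptor) tuples

def index_detection(camera_id, timestamp, person_crop, describe_person):
    # describe_person is a hypothetical feature extractor that turns an
    # image crop into a vector of face/clothing/other attributes.
    detections.append((camera_id, timestamp, describe_person(person_crop)))

def appearance_search(query_descriptor, top_k=10):
    """Rank all stored detections by similarity to the query appearance,
    so an official can see where else a person showed up on camera."""
    def similarity(entry):
        _, _, desc = entry
        return float(np.dot(query_descriptor, desc) /
                     (np.linalg.norm(query_descriptor) * np.linalg.norm(desc)))
    return sorted(detections, key=similarity, reverse=True)[:top_k]
```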

The US has been slow to regulate facial recognition systems

Facial recognition requires creating databases of sensitive and personally identifiable data — immutable information about our faces — that we may not want schools to possess. For one thing, the Surveillance Technology Oversight Project’s Cahn is doubtful school officials are prepared to keep such information secure and protected from hackers.

But, like other critics, he’s also worried about whether these systems will be used to target undocumented students and students of color. “Many school districts have a history of working hand-in-hand with law enforcement to create the school-to-prison pipeline, so we certainly can’t trust that schools will push back against a request from law enforcement,” Cahn said. “But even if these schools were to oppose [law enforcement], they simply don’t have a legal mechanism to block the government from getting a court order to obtain this data.”

According to the agency’s website, ICE considers schools “sensitive locations,” meaning that they’re not supposed to be targeted for enforcement activities unless officers are led to the location by other “law enforcement actions,” there are “exigent circumstances,” or prior approval is obtained from a “designated supervisory official.”

It’s worth noting that some schools already have agreements with police departments to share access to their live video feeds. For instance, the Suffolk County Police Department in Long Island, New York, operates a program called “Sharing to Help Access Remote Entry” (SHARE), through which officers can remotely access school security video feeds. The system is meant to be used in an emergency (like an active shooter situation), and it already uses a license plate identification system that can also determine the make and color of a car that’s parked on campus.

The county’s police chief, Stuart Cameron, told Recode that the department is exploring facial recognition technology (as it would explore any tool). If it does choose to adopt facial recognition, he says there’s no reason why the department wouldn’t use the tool in conjunction with the SHARE program.

Meanwhile, laws that clearly apply to these tools are few and far between. Federal regulation governing facial recognition nationally doesn’t exist yet, though it’s possible existing privacy or education laws — like the Family Educational Rights and Privacy Act — could be applied to certain applications of the technology. Still, the US Department of Education told Recode that it hasn’t issued any specific guidance regarding facial recognition.

On the state level, Illinois and Texas have both passed biometric information privacy laws that appear to require consent before using facial recognition. RealNetworks’s Vance says his company notifies schools of this legislation.

Vance also points to a 2014 Florida state law that explicitly bans the collection of biometric information from students. Moore, of Trueface, told Recode that his company has held off on deploying its facial recognition technology in schools because it wants to wait for more clarity regarding regulation.

Meanwhile, at the local level, a city spokesperson for Somerville, Massachusetts — one of the first US cities to ban facial recognition — says that law includes the city’s public schools. However, a legislative aide who worked on San Francisco’s facial recognition ban told Recode the law would not directly apply to public schools (meaning schools could technically buy a facial recognition tool), but that the city’s ban means that San Francisco police couldn’t use or receive information the system might collect.

This overall legal patchwork has left many, including the very companies selling these technologies, desperate for clearer regulation. “Having federal guidelines or federal regulations around facial recognition would be a really good thing for the industry to make sure we’re all playing by the same set of rules,” Vance, of RealNetworks, said.


Open Sourced is made possible by Omidyar Network. All Open Sourced content is editorially independent and produced by our journalists.

Author: Rebecca Heilweil
