A Blog by Jonathan Low

 

Jan 27, 2020

AI-Enabled Surveillance Increasingly Deployed In US Schools

Safety may be the goal, but increased control is the inevitable byproduct. JL

Rebecca Heilweil reports in Re/code:

School administrators across the US are turning to artificially intelligent surveillance tools to beef up school safety. But systems that allow schools to track people on campus have some worried about the impact on student privacy. Analytic surveillance cameras come with new, AI-based software that can review where a person has traveled throughout campus using data the system collects about clothing, shape, size, and facial characteristics. The software also allows security officials to search camera feeds using physical descriptions like age, gender, and hair color.
As mass shootings at US schools increase in frequency while our country’s gun control laws remain weaker than those in any other developed nation, more school administrators across the US are turning to artificially intelligent surveillance tools in an attempt to beef up school safety. But systems that allow schools to easily track people on campus have left some worried about the impact on student privacy.
Recode has identified at least nine US public school districts — including the district home to Marjory Stoneman Douglas High School (MSD) in Parkland, Florida, which in 2018 experienced one of the deadliest school shootings in US history — that have acquired analytic surveillance cameras that come with new, AI-based software, including one tool called Appearance Search.
Appearance Search can find people based on their age, gender, clothing, and facial characteristics, and it scans through videos like facial recognition tech — though the company that makes it, Avigilon, says it doesn’t technically count as a full-fledged facial recognition tool.
Even so, privacy experts told Recode that, for students, the distinction doesn’t necessarily matter. Appearance Search allows school administrators to review where a person has traveled throughout campus — anywhere there’s a camera — using data the system collects about that person’s clothing, shape, size, and potentially their facial characteristics, among other factors. It also allows security officials to search through camera feeds using certain physical descriptions, like a person’s age, gender, and hair color. So while the tool can’t say who the person is, it can find where else they’ve likely been. For some, this raises big concerns.
“People don’t behave the same when they’re being watched,” warns Brenda Leong, the director of AI and ethics at the Future of Privacy Forum. “Do we really want both young students and high schoolers, and anybody else, feeling like they’re operating in that environment all the time?”
Adding to privacy concerns surrounding a tool like Appearance Search is the fact that it’s not exclusively being used to address violence in schools. School administrators are already using the system to try to intercept bullying, to deter code of conduct violations, and to assist in investigations of school employees.
As Kai Koerber, a recent graduate of MSD, told Recode about the technology: “I don’t think [students] should have to — by going to school — volunteer to accept this kind of new social contract where you’re going to be recorded and traced through your every move. I do think people have the right to be able to walk to the next class without being identified.”

Here’s how Appearance Search works

Imagine you’re a school safety officer monitoring live video-camera feeds on campus. You see a young person you don’t recognize doing something suspicious in a hallway. From your computer, you click on that person’s body. Based on details about that student, like their gender, age, clothing, hair, and potentially what Avigilon calls facial characteristics, you can use Avigilon’s artificial intelligence to mine through video footage collected from cameras all over the school, looking for all instances where someone who resembles that person appears. (Avigilon also sells an AI-based system that detects unusual motion, which, for instance, could notice a student moving through a normally deserted hallway.)
Avigilon says Appearance Search isn’t facial recognition. The images aren’t being tied to a particular person’s name or identity, and while the tool can use facial characteristics, it also relies on other aspects of a person’s body to do its job. In an email, a spokesperson says its Appearance Search software “only help[s] measure and rank how similar a pair of images are and [does] not associate the signature with the identity or name of a specific person.”
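Avigilon has not published technical details of how that signature works, but appearance-based re-identification systems of this general kind typically reduce each camera detection to a numeric "appearance signature" (an embedding) and then rank stored detections by how similar their signatures are to the selected person's. The sketch below is only a minimal illustration of that idea, not Avigilon's implementation; the appearance_search function, the attribute fields, and the signature values are all hypothetical placeholders.

```python
import numpy as np

# Hypothetical records of person detections from different cameras.
# In a real system the "signature" would come from a neural network that
# embeds a cropped image of the person; here it is just a short vector.
# Fields like camera_id, age, and hair are illustrative, not Avigilon's schema.
detections = [
    {"camera_id": "hall_3", "timestamp": "12:01", "age": "teen",
     "hair": "brown", "signature": np.array([0.12, 0.88, 0.45, 0.31])},
    {"camera_id": "gym_1", "timestamp": "12:14", "age": "teen",
     "hair": "brown", "signature": np.array([0.10, 0.85, 0.47, 0.30])},
    {"camera_id": "cafeteria", "timestamp": "12:20", "age": "adult",
     "hair": "black", "signature": np.array([0.80, 0.10, 0.05, 0.60])},
]

def cosine_similarity(a, b):
    """Measure how alike two appearance signatures are (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def appearance_search(query, detections, age=None, hair=None, top_k=5):
    """Rank stored detections by similarity to the selected person's signature,
    optionally pre-filtering by coarse attributes, the way an officer might
    search by age, gender, or hair color."""
    candidates = [
        d for d in detections
        if (age is None or d["age"] == age) and (hair is None or d["hair"] == hair)
    ]
    ranked = sorted(
        candidates,
        key=lambda d: cosine_similarity(query["signature"], d["signature"]),
        reverse=True,
    )
    return ranked[:top_k]

# A safety officer clicks on the unknown person seen in hallway 3.
# The query detection itself ranks first, followed by likely sightings elsewhere.
query = detections[0]
for match in appearance_search(query, detections, age="teen"):
    print(match["camera_id"], match["timestamp"],
          round(cosine_similarity(query["signature"], match["signature"]), 3))
```

Notably, nothing in this toy version attaches a name to a signature, which mirrors Avigilon's argument that the tool isn't facial recognition; yet ranking by similarity is still enough to reconstruct where one particular person has likely been, which is the capability privacy experts flag.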
Avigilon’s surveillance tool exists in a gray area: Even privacy experts are conflicted over whether it would be accurate to call the system facial recognition. After looking at publicly available content about Avigilon, Leong said it would be fairer to call the system an advanced form of characterization, meaning the system makes judgments about a person’s attributes, like their clothing or hair, but doesn’t actually claim to know their identity.
Varoon Mathur, a technology fellow who studies machine learning at the AI Now Institute, told Recode he considers Appearance Search to offer “object detection.” And Logan Koepke, of the digital rights nonprofit Upturn, called it “person recognition” because the system seems a bit more abstract, focusing on other aspects of someone’s body and not exclusively their face.
But John Davisson, an attorney at the Electronic Privacy Information Center, told Recode Avigilon’s Appearance Search tool is “unequivocally facial recognition,” pointing to how the Federal Trade Commission has defined the technology. And Brian Hofer, who has pushed for several California cities’ facial recognition bans, says Appearance Search appears to meet the Berkeley law’s definition of the controversial technology.
The point is, it’s not actually clear what, exactly, we should call the tool. Despite not identifying people by name like a standard facial recognition tool does, Leong says, “it does do a very similar thing in being able to access a very particular person across time, location, and the environment, and make conclusions about them in equally concerning ways.”
An advertising pamphlet from Avigilon even describes the tool working in this way: One school used Appearance Search to track when a girl entered and exited the bathroom during lunch hours. That allowed a principal to find out she was eating in the bathroom because she was being bullied and then intervene.
Mathur adds that Appearance Search could easily be used in conjunction with standard facial recognition tech. In fact, Avigilon is also rolling out a tool it does call facial recognition — Appearance Alerts — that it’s also selling to schools, though the company won’t reveal how many schools are using this product.

Avigilon’s Appearance Search raises surveillance concerns

Avigilon would not share how many schools are using Appearance Search. While Recode identified at least nine public school districts that have acquired or have access to the software, it’s likely many more schools are using the tool.
For instance, the New York Civil Liberties Union says that more than a dozen school districts in New York State have purchased Avigilon equipment. While the NYCLU doesn’t know for certain how many have access to or have used the Appearance Search tool, technology strategist Daniel Schwarz said in an email that “given its inclusion into the main [Avigilon Control Center] software it is likely that a high percentage of schools will have the feature at their fingertips.”
At the schools that have gotten the tool, we already have a sense of how it can be used. In Fulton County, a school district in Georgia, more than 1,400 cameras work with the Avigilon Appearance Search software, and there are plans to install even more. (School officials there appeared in a promotional video for the company.)
In an interview with Recode, Shannon Flounnory, the district’s executive director of safety and security, said Appearance Search has been used to locate children lost in schools, to investigate complaints against staff, and to deter violations of codes of conduct. He says the software has also made the school security staff aware of disciplinary infractions they otherwise would not have known about.
The tool has also been acquired by the Billings Public School District in Montana, Wilson County Schools in Tennessee, and, more recently, Broward County Public Schools, which includes Marjory Stoneman Douglas High School in Florida.
Students have mixed feelings about the tech being rolled out in their schools. One current student at MSD told Recode that Avigilon’s tool could be useful. “I feel like this generation consciously knows we’re always being watched. We have our social media, and everything is caught on video nowadays. So I think it just adds an extra layer,” the student said of privacy concerns. Both Safe Havens International, the nonprofit firm that consulted with Broward County following the shooting at MSD, and officials for the school district declined Recode’s request for comment.
But former MSD student Koerber, who is now a student at the University of California, Berkeley, told Recode that Appearance Search seems invasive. “Yes, it may work in terms of, ‘we can identify people who don’t belong on the campus.’ At the same time, we are invading the privacy of each and every student,” he said.
Koerber’s concerns are echoed by student privacy advocates, who say the tool could be used to track and surveil students. “It is surveillance technology, and it is tracking technology, and any school implementing any variation of those is potentially creating more harms and risks than they’re solving,” said Leong.
The “opaque nature of how these tools track people’s movement and behavior is alarming,” Schwarz told Recode. He pointed to how the NYCLU has already convinced one New York school district not to use the tool.
Avigilon did not respond to a request for comment about potential privacy concerns posed by Appearance Search. The company would also not comment on its comparative accuracy rates across demographic groups for either Appearance Search or its facial recognition-based Appearance Alerts feature, nor would it say what, if any, image databases its tools might be trained on.
Still, school administrators in districts that have used Appearance Search told Recode they hadn’t heard any privacy complaints about the tool. “Right now, we haven’t seen any concerns about how these analytics are utilized. We haven’t gotten concerns from our community about invasions of privacy based upon these analytics,” said Flounnory, arguing that parents are more concerned about keeping their kids safe.
Koerber doesn’t agree: “You really have to ask, ‘What are you willing to sacrifice to play a game of what-if?’” he said. “A school shooting could happen at any time and any place. We know that. But do we need to invade the privacy of every person who enrolls in a particular school to prevent that? I don’t think that’s the case.”
