Schools increasingly turn to technology to keep students safe
Lockport, N.Y., July 31, 2018. It was only a matter of time.
As technology increasingly inhabits most aspects of our lives, its latest advances were bound to become part of the anxious debate over school security. It’s not happening on Long Island, or in some big urban center, or a community reeling from a horrific shooting, like the ones that wracked Parkland, Florida, and Santa Fe, Texas. It’s obviously not happening in Congress.
It’s a school district in a small town in western New York that’s pushing the boundaries of what’s acceptable in the effort to protect students and teachers.
Lockport, an Erie Canal community just east of Niagara Falls, is going to use facial recognition software in its eight schools. It’s a $1.4 million project that involves installing 300 cameras to watch students and staff as they come and go. Depending on which side of the fence you’re on, it’s comforting or creepy.
Supporters say the software would have identified, for example, Nikolas Cruz, the former student accused of killing 17 people at Marjory Stoneman Douglas High School in Parkland.
Lockport’s system compares the faces it scans with a database of expelled students, sex offenders, gang members, disgruntled former employees and other potential troublemakers. So, yes, it likely would have nailed Cruz, who had set off alarm bells all over Parkland.
But what about Dimitrios Pagourtzis, the student and alleged killer of 10 people in Santa Fe, who was not on any official radar?
Opponents point to the slippery slope. On what basis are people placed in the database? How disgruntled does a fired worker have to be? Who decides who has the potential to cause trouble? When do cameras start feeling less like protection and more like Big Brother? And what happens if, or when, the cameras are hacked?
Lockport is the first school system to go this route, but it won’t be the only one. A district in Arkansas is spending $300,000 on 200 cameras in two schools. A Texas system is considering it.
And other communities will have to decide whether facial recognition software is just another tool to help identify potential bad actors, no different from extra sets of security guards’ eyes, or an unwarranted intrusion on the privacy of students and staff.
Districts using the systems will have to deal with inevitable cases of mistaken identification. Facial recognition is notoriously error-prone, especially with people of color and women; error rates reached 47 percent for the darkest-skinned women in a test by MIT and Stanford University researchers. The American Civil Liberties Union reported last week that an experiment it ran with Amazon’s facial recognition software falsely matched 28 members of Congress to people who had been arrested for a crime. The misidentifications were disproportionately of people of color, though Long Island’s Rep. Lee Zeldin was among those wrongly fingered.
It’s a question of balance — the stress or embarrassment felt by someone misidentified as a threat vs. the possibility of stopping the bloodshed caused by someone like Nikolas Cruz.
Technology always improves, so facial recognition software will get more accurate. At some point — experts say in three to five years — the false-positive argument will start to fade away. The Big Brother problems will remain.
The New York Civil Liberties Union asked the state Education Department to stop Lockport from using the technology, but the state said there is no legal basis to do so. It also said it is reviewing policies and practices to balance privacy and safety.
“We shake our heads that we’re having to deal with and talk about these kinds of security issues,” one Lockport official told The Associated Press, as the district moves forward.
There’s going to be a lot more head-shaking going on.
Newsday