Facial Recognition Software Prompts Privacy, Racism Concerns in Cities and States

Fabian Rogers was none too pleased when the landlord of his rent-stabilized Brooklyn high-rise announced plans to swap out key fobs for a facial recognition system.

He had so many questions: What would happen if he didn’t comply? Would he be evicted? And as a young black man, he worried that his biometric data would end up in a police lineup without him ever being arrested. Most of the building’s tenants are people of color, he said, and they are already concerned about overpolicing in their New York neighborhood.

“There’s a lot of scariness that comes with this,” said Rogers, 24, who along with other tenants is trying to legally block his management company from installing the technology.

“You feel like a guinea pig,” Rogers said. “A test subject for this technology.”

Amid privacy concerns and recent research showing racial disparities in the accuracy of facial recognition technology, some city and state officials are proposing to limit its use.

Law enforcement officials say facial recognition software can be an effective crime-fighting tool, and some landlords say it could enhance security in their buildings. But civil liberties activists worry that vulnerable populations, such as residents of public housing or rent-stabilized apartments, are at risk of law enforcement overreach.

“This is a very dangerous technology,” said Reema Singh Guliani, senior legislative counsel for the American Civil Liberties Union. “Facial recognition is different from other technologies. You can identify someone from afar. They may never know. And you can do it on a massive scale.”

The earliest forms of facial recognition technology originated in the 1990s, and local law enforcement began using it in 2009. Today, its use has expanded to companies such as Facebook and Apple.

Such software uses biometrics to map the geometry of a face found in a photograph or video and compare it against a database of other facial images to find a match. It’s used to verify personal identity — the FBI, for example, has access to 412 million facial images.
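In simplified terms, the matching step works by reducing each face to a list of numbers (an "embedding") describing its geometry, then searching a database for the enrolled face whose numbers are closest. The sketch below is a hypothetical illustration of that idea only; the names, vectors, and distance threshold are invented for this example and do not reflect any real system’s data or accuracy.

```python
import math

def euclidean_distance(a, b):
    # Straight-line distance between two embedding vectors
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def find_match(probe, database, threshold=0.6):
    """Return the label of the closest enrolled face, or None if no
    enrolled embedding is within the match threshold."""
    best_label, best_dist = None, float("inf")
    for label, embedding in database.items():
        d = euclidean_distance(probe, embedding)
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label if best_dist <= threshold else None

# Toy database of enrolled embeddings (made up for illustration)
db = {"alice": [0.1, 0.9, 0.3], "bob": [0.8, 0.2, 0.5]}
print(find_match([0.12, 0.88, 0.31], db))  # close to "alice"
print(find_match([0.4, 0.5, 0.9], db))     # no enrolled face close enough
```

The threshold is where accuracy concerns enter: set it loosely and the system returns false matches; set it tightly and it misses true ones, and research cited above has found those error rates differ across demographic groups.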

“Our industry certainly needs to do a better job of helping educate the public how the technology works and how it’s used,” said Jake Parker, senior director of government relations for the Security Industry Association, a trade association based in Silver Spring, Maryland.

“Any technology has the potential to be misused,” Parker said. “But in the United States, we have a number of constitutional protections that limit what the government can do.”
