Editorials

Facial recognition scanners mistook legislators for criminals. Are you next?

Massachusetts Institute of Technology facial recognition researcher Joy Buolamwini stands for a portrait behind a mask at the school, in Cambridge, Mass., in 2013. Buolamwini’s research has uncovered racial and gender bias in facial analysis tools sold by companies such as Amazon that have a hard time recognizing certain faces, especially darker-skinned women. Buolamwini holds a white mask she had to use so that software could detect her face. The Associated Press

When can a case of mistaken identity turn into a dangerous situation? When a facial recognition scanner falsely identifies you as a wanted criminal to a police officer.

If a police officer thinks they’re dealing with a wanted or dangerous criminal instead of an innocent citizen, the situation will likely become tense. Even if nothing violent occurs, no American wants to be stripped of their rights or dignity by being handcuffed and detained when they’ve done nothing wrong.

That’s why the California State Legislature would be wise to ban the use of facial recognition technology programs in body-worn police cameras. Facial recognition technology, still in the early stages of development, is too unreliable and error-prone to deploy as a law enforcement tool.

Case in point: A facial recognition software test conducted recently by the American Civil Liberties Union identified 26 individuals as matches for criminals in mug shots. In reality, the wrongly identified individuals were members of the Legislature.

Assemblymember Phil Ting, D-San Francisco, is one of the legislators falsely matched to a mug shot. He’s also the author of Assembly Bill 1215, which would ban California police departments from using facial recognition technology with their officer-worn body cameras. Cities like Oakland and San Francisco are already moving to ban the use of facial recognition, and the ACLU’s high-profile publicity stunt has no doubt captured the attention of Ting’s colleagues.


“This experiment reinforces the fact that facial recognition software is not ready for prime time - let alone for use in body cameras worn by law enforcement,” said Ting. “I could see innocent Californians subjected to perpetual police line ups because of false matches. We must not allow this to happen.”

Technology has provided police with useful tools to keep our communities safe. The use of familial DNA, which can pinpoint a suspect based on the DNA of their relatives, has solved several high-profile cold cases. Police tracked down Golden State Killer Joseph DeAngelo using such technology. Familial DNA also resulted in the capture of the suspected NorCal Rapist, who terrorized women for 15 years beginning in 1991. Nobody would want those guilty of such heinous crimes to remain untraceable for lack of technology.

License plate readers have also become more common as police tools over the past few years. They allow police to scan thousands of license plates an hour and receive a notification when they hit on the plate of a wanted person. In 2015, a license plate reader allowed Virginia police to locate Vester Flanagan, a man who murdered two reporters on live television. Last week, the technology enabled Elk Grove police to track down a man wanted for murder.

These technologies have the potential to make us safer. At the same time, they pose an unprecedented threat to our civil rights. The idea of a society under constant technological surveillance resembles an Orwellian dystopia more than the United States of America. We must make careful, deliberate choices about the degree to which we accept blanket surveillance in return for the promise of safety.

We must also recognize the fact that some of these technologies have the potential to make the world less safe for certain people. Facial recognition technologies are more likely to wrongly identify people of color, women and young people.

“Too often, minorities are confused for others,” said Assemblymember Reginald Jones-Sawyer, D-Los Angeles. “I’ve heard of far too many cases of mistaken identity leading to arrests, in the worst cases death. This is without technology, which threatens to automate mistaken identity and risk the health and safety of countless people of color.”

Ting is right when he says facial recognition is “not ready for prime time.” Two dozen members of the Legislature just got poignant proof of this. They should act to protect the rights and safety of all Californians by supporting AB 1215.

