Most recent news stories focus on facial recognition technology’s mistakes. For example, the ACLU ran a test of Amazon’s new “Rekognition” software, and it misidentified 28 members of Congress as people with previous criminal records.
For the roughly one in three Americans who have some sort of criminal record, such errors are a serious issue. Soon law enforcement will be able to instantly identify returning citizens, and they “will inevitably be targeted, despite having served their time,” the Brookings Institution warned. “Even a perfect facial recognition tool in imperfect hands can lead to unjust outcomes.”
“Privacy is shorthand for…self-development,” writes Julie Cohen, a Georgetown University law professor. Such privacy is “vital for individuals returning to society with a criminal record,” the Brookings blog states.
That privacy is uniquely harmed when biometric information, such as facial data, becomes instantly searchable.
Much activist attention focuses on the danger of a world where innocent people are identified as guilty by a flaw in a new technology. A much bigger risk, Brookings argues, is “a world where a guilty person can never be anything but a criminal.”
“The few states that have enacted biometric privacy laws have made exceptions for law enforcement,” according to Brookings. Only a few cities have addressed the risk of law enforcement surveillance.
Given the growing efficiency of new biometric technologies like facial recognition, a counterbalancing legal privacy right could aid returning citizens both in finding employment and in reintegrating into their communities.