While some of this may be motivated by civil liberties and privacy concerns, there also appears to be a recognition that the technology is imperfect enough that the credibility of its findings may be in doubt. JL
Ali Breland reports in The Hill:
The CEO of Kairos wrote that even if one accepted the premise that government surveillance was acceptable, the use of “commercial facial recognition in law enforcement is irresponsible.” Amazon employees have pressed CEO Jeff Bezos to stop selling the company’s facial recognition technology to law enforcement. Experts have called it an “imperfect biometric,” and staffers at other facial recognition companies have said that, at the very least, the technology should not be used as the sole factor in trying criminal suspects.
The CEO of a facial recognition software company on Tuesday said that such technology is not yet fit for use by law enforcement.
“In a social climate wracked with protests and angst around disproportionate prison populations and police misconduct, engaging software that is clearly not ready for civil use in law enforcement activities does not serve citizens, and will only lead to further unrest,” Brian Brackeen, the CEO of Kairos, wrote in a TechCrunch op-ed.
Brackeen added that even if one accepted the premise that government surveillance was acceptable, the use of “commercial facial recognition in law enforcement is irresponsible and dangerous.”
He argued that it can be “dehumanizing” for the public, and particularly harmful to people of color, who are often unfairly targeted by facial recognition technology.
Other experts, like Georgetown researcher Clare Garvie, have called it an “imperfect biometric,” and staffers at other facial recognition technology companies, like Roger Rodriguez of Vigilant Solutions, have said that, at the very least, the technology should not be used as the sole factor in trying criminal suspects.
Brackeen’s column comes as a number of Amazon employees have pressed CEO Jeff Bezos to stop selling the company’s facial recognition technology, Rekognition, to law enforcement, as reported by The Hill.
Those employees similarly argue that the “historic militarization of police, renewed targeting of Black activists, and the growth of a federal deportation force currently engaged in human rights abuses,” could lead to Rekognition “harm[ing] the most marginalized.”
Amazon has defended its software, saying that simply because a technology can be abused doesn’t mean it should be rejected.