Congress Ponders New Laws as Groups Fight Racist Systems
When a facial recognition algorithm selected Robert Julian-Borchak Williams as the likely perpetrator of the theft of $3,800 worth of Shinola watches, Detroit police never questioned its accuracy, despite warnings that secondary confirmation was needed before acting on the AI’s ID, according to an NPR.org story. Worse, the photo bore little resemblance to the man they ultimately arrested and held for more than 30 hours.
A New York Times story about the case described the fear, shock and dismay Williams experienced; he not only had an alibi but immediately pointed out to officers that the headshot from the computer ID didn’t even look like him.
“When I look at the picture of the guy, I just see a big Black guy. I don’t see a resemblance. I don’t think he looks like me at all,” Williams said in an interview with NPR. “[The detective] flips the third page over and says, ‘So I guess the computer got it wrong, too.’ And I said, ‘Well, that’s me,’ pointing at a picture of my previous driver’s license, ‘But that guy’s not me,’ ” he said, referring to the other photographs.
“I picked it up and held it to my face and told him, ‘I hope you don’t think all Black people look alike,’ ” Williams said.
Williams’ arrest last January, in front of his sobbing children, ages 2 and 5, is the first documented case of a wrongful arrest stemming from a false ID by facial recognition software. In response to a request from the New York Times, the Wayne County prosecutor’s office said that Williams could have the case and his fingerprint data expunged. “We apologize,” the prosecutor, Kym L. Worthy, said in a statement, adding, “This does not in any way make up for the hours that Mr. Williams spent in jail.”
Despite recent decisions by Amazon, Microsoft and IBM against selling facial recognition programs to police, the Times story points out that several smaller companies cater to law enforcement with no oversight, including Vigilant Solutions, Cognitec, NEC, Rank One Computing and Clearview AI.
Clare Garvie, a lawyer at Georgetown University’s Center on Privacy and Technology, told the Times that this is an issue that police departments need to address:
“I strongly suspect this is not the first case to misidentify someone to arrest them for a crime they didn’t commit. This is just the first time we know about it.”
The Detroit Police chief admitted that the facial recognition software, supplied by DataWorks Plus, misidentifies suspects about 96% of the time, making it a tool of limited value, according to a story on vice.com. Several cities have banned police use of facial recognition. Detroit’s Board of Police Commissioners has since enacted new rules for the technology, including restricting its use to violent crimes.
A 2019 federal study of more than 100 facial recognition systems found bias involving “falsely identifying African-American and Asian faces 10 times to 100 times more than Caucasian faces,” the Times reported.
Vox.com recently reported on several groups’ efforts to create an algorithmic bill of rights that would include explanations of how AI impacts decision-making and guarantees of freedom from bias, especially when it comes to police use of AI. MIT researcher Joy Buolamwini pinpointed the problem of facial recognition bias in her thesis project and went on to found the Algorithmic Justice League. Others have followed her lead, according to Vox.com.
“As the founder and executive director of Data for Black Lives, (Yeshimabeit) Milner has drawn attention to problems with predictive policing (algorithmic systems for predicting where crime is likely to occur) and criminal risk assessments (algorithmic systems for predicting recidivism). Police officers and judges use both these systems to guide their decisions, despite evidence that they’re biased against black people.”
U.S. Senators Ed Markey (D-MA) and Jeff Merkley (D-OR) and Representatives Ayanna Pressley (D-MA) and Pramila Jayapal (D-WA) introduced a bill to ban facial recognition use by federal government officials and to withhold federal funding through the Byrne grant program for state and local governments that use the technology, according to a venturebeat.com story. However, a lack of Republican sponsors likely spells doom for the legislation, at least in 2020.