AI May Be Enforcing Racism, Rather than Fighting Against It
In America, the struggle of Black citizens to be treated fairly by law enforcement officers continues to be a widespread issue. Racial discrimination has existed since the founding of the country, when enslaved Black people were counted as three-fifths of a person. Fast forward more than 200 years, and the racial divide continues, even at a digital level.
Seeflection.com has reported many times on the racial bias built into many of the algorithms that drive our AI world. Now MIT's technologyreview.com has dug deep into the research and exposed even more problems. Writer Will Douglas Heaven suggests we could simply ditch many of these technologies altogether, citing predictive policing algorithms in particular.
Heaven introduces Yeshimabeit Milner, who followed the events and their aftermath as a teenager. The principal of a school in Miami with a majority Haitian and African-American population had put one of his students in a chokehold. The next day, several dozen kids staged a peaceful demonstration. That night, Miami's NBC 6 News at Six kicked off with a segment called "Chaos on Campus" (as seen on YouTube). "Tensions run high at Edison Senior High after a fight for rights ends in a battle with the law," the broadcast said. Cut to blurry phone footage of screaming teenagers: "The chaos you see is an all-out brawl inside the school's cafeteria."
Students told reporters that police hit them with batons, threw them on the floor, and pushed them up against walls. The police claimed they were the ones getting attacked—“with water bottles, soda pops, milk, and so on”—and called for emergency backup. Around 25 students were arrested, and many were charged with multiple crimes, including resisting arrest with violence. Milner remembers watching on TV and seeing kids she’d gone to elementary school with being taken into custody. “It was so crazy,” she says.
She is now the director of Data for Black Lives, a grassroots digital rights organization she cofounded in 2017. What she learned as a teenager pushed her into a life of fighting back against bias in the criminal justice system and dismantling what she calls the school-to-prison pipeline.
“There’s a long history of data being weaponized against Black communities,” she says.
Predictive Programming
Predictive policing tools are either location-based or person-based. Location-based algorithms draw on links between places, events, and historical crime rates to predict where and when crimes are more likely to happen, identifying hot spots for police to patrol. One of the most common, called PredPol and used by dozens of U.S. cities, breaks locations up into 500-by-500-foot blocks and updates its predictions throughout the day, a kind of crime weather forecast.
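PredPol's actual model is proprietary, so the sketch below is not its method; it is only a minimal Python illustration of the grid-and-rank idea described above, binning past incidents into 500-by-500-foot cells and flagging the cells with the most history. The function names and toy coordinates are invented for the example.

```python
from collections import Counter

CELL_FT = 500  # grid cell size in feet, mirroring the block size described above

def cell_for(x_ft, y_ft):
    """Map a point (in feet from an arbitrary city origin) to its grid cell."""
    return (int(x_ft // CELL_FT), int(y_ft // CELL_FT))

def hot_spots(past_incidents, top_k=3):
    """Count past incidents per grid cell and return the top_k busiest cells."""
    counts = Counter(cell_for(x, y) for x, y in past_incidents)
    return counts.most_common(top_k)

# Toy usage with made-up coordinates: three incidents cluster in one cell,
# two in another, one elsewhere.
incidents = [(120, 80), (140, 95), (160, 60), (1300, 2450), (1310, 2460), (4900, 900)]
print(hot_spots(incidents, top_k=2))
# -> [((0, 0), 3), ((2, 4), 2)]
```

Real systems layer statistical models on top of this kind of grid, but the basic output is the same: a ranked list of cells for officers to patrol.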
Other tools draw on data about people, such as their age, gender, marital status, history of substance abuse, and criminal record. They are used by police to intervene before a crime takes place, or by courts to determine during pretrial hearings or sentencing whether someone who has been arrested is likely to re-offend. For example, a tool called COMPAS, used in many jurisdictions to help make decisions about pretrial release and sentencing, issues a statistical score between 1 and 10 to quantify how likely a person is to be rearrested if released.
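COMPAS itself is proprietary and built on a long questionnaire, so the following is not its formula; it is only a hedged Python sketch of how a person-based tool can turn a few attributes into a 1-to-10 score by bucketing an estimated probability of rearrest into deciles. All weights and feature names here are invented.

```python
import math

def toy_risk_score(age, prior_arrests, employed):
    """Toy person-based risk score: a hand-set logistic model bucketed into 1-10.

    The weights are invented for illustration only; real tools fit their models
    to historical arrest data and use many more inputs.
    """
    # Invented weights: more prior arrests and younger age push the score up.
    z = -1.0 + 0.35 * prior_arrests - 0.04 * (age - 18) - 0.5 * (1 if employed else 0)
    p = 1 / (1 + math.exp(-z))                 # pseudo-probability of rearrest
    return min(10, max(1, math.ceil(p * 10)))  # map the probability to a 1-10 decile score

print(toy_risk_score(age=19, prior_arrests=4, employed=False))  # -> 6 with these invented weights
print(toy_risk_score(age=45, prior_arrests=0, employed=True))   # -> 1 with these invented weights
```

The point of the sketch is only the shape of the output: a single number that judges and prosecutors then read as "risk."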
Risk assessments have been part of the criminal justice system for decades, but police departments and courts have made more use of automated tools in the last few years, driven in part by budget cuts and the push for efficiency. "People are calling to defund the police, but they've already been defunded," says Milner. "Cities have been going broke for years, and they've been replacing cops with algorithms." Exact figures are hard to come by, but predictive tools are thought to be used by police forces or courts in most U.S. states.
Another problem with the algorithms is that many were trained on white populations outside the U.S., partly because criminal records are hard to obtain across different U.S. jurisdictions. Static-99, a tool designed to predict recidivism among sex offenders, was trained in Canada, where only around 3% of the population is Black, compared with 12% in the U.S. Several other tools used in the U.S. were developed in Europe, where 2% of the population is Black. Because of the differences in socioeconomic conditions between countries and populations, the tools are likely to be less accurate in places where they were not trained. Moreover, some pretrial algorithms trained many years ago still use predictors that are out of date. For example, some still predict that a defendant who doesn't have a landline phone is less likely to show up in court.
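No real tool's scoring rule is published in the article, so the Python sketch below is purely hypothetical; it illustrates the out-of-date-predictor problem with an invented "failure to appear" score that still rewards landline ownership. Once almost nobody in the population has a landline, the stale feature drags down nearly everyone's score regardless of their actual record.

```python
def appearance_score(has_landline, prior_failures_to_appear):
    """Toy score where higher means more likely to show up in court.

    The weights are invented. The point: a feature that was informative when
    the model was built can misfire once the underlying population changes.
    """
    score = 5.0
    score += 2.0 if has_landline else -2.0   # stale predictor dominates the score
    score -= 1.0 * prior_failures_to_appear  # the signal that actually matters
    return score

# In a modern population, few people have landlines, so the outdated feature
# penalizes most defendants even when they have spotless records.
population = [{"has_landline": False, "prior_failures_to_appear": 0} for _ in range(95)]
population += [{"has_landline": True, "prior_failures_to_appear": 0} for _ in range(5)]
avg = sum(appearance_score(**p) for p in population) / len(population)
print(f"average score: {avg:.2f}")  # -> 3.20, dragged down by the landline signal
```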
Heaven’s story, well worth reading, highlights the widespread failures of these technologies.
read more at technologyreview.com