Built-in racism once again shows up in Facebook’s image-recognition algorithms.

Another Apology From Facebook’s Team on Racist Algorithm Not Enough

It’s a shame when people claim to be ignorant of systemic racism in American society. What’s worse, however, is to be in the year 2021 and see racism front and center in social media platforms’ recommendation engines. Believe it or not, some of the algorithms you use online every day exhibit racism.

bbc.com has a story on how Facebook’s algorithm used the word “primates” in reference to a video featuring black men. Facebook users who watched a newspaper’s video featuring black men were asked by an AI recommendation system if they wanted to “keep seeing videos about primates.”

This should not be tolerated, even if it’s related to facial recognition errors. And we can’t help but ask, how can this still be happening?

Facebook told BBC News it “was clearly an unacceptable error,” disabled the system, and launched an investigation. That is not enough. An apology from a spokesman is not enough. And it’s not the first time major tech companies have enabled the systemic racism that is, in fact, a large part of life in the United States.

A New York Times story on the incident noted that racism is rampant in algorithms, from a police photo search engine that misidentified a black man as a criminal, to biased hiring algorithms, to social media algorithms that make racist errors. MotherJones.com posted a similar story.

In 2015, Google’s Photos app labeled pictures of black people as “gorillas.” The company said it was “appalled and genuinely sorry,” though its fix, Wired reported in 2018, was simply to censor photo searches and tags for the word “gorilla.”

In May, Twitter admitted racial biases in the way its “saliency algorithm” cropped previews of images. White faces were consistently chosen over black ones. How can an algorithm be racist? Because human racism is baked into the data it learns from.
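
To make that concrete, here is a rough sketch, in Python, of how a saliency-based cropper ends up “choosing” one face over another, and how a simple paired audit can expose the skew. None of this is Twitter’s actual code; saliency_score is a made-up stand-in for a learned model, with a bias injected to mimic what such a model can absorb from unbalanced training data.

```python
# Illustrative sketch only, not Twitter's code: a saliency-based cropper keeps
# whichever face region a learned model scores highest, so any skew in the
# model's scores becomes a skew in whose face appears in the preview.
import random

def saliency_score(face):
    # Hypothetical stand-in for a trained saliency model. The higher baseline
    # for lighter skin tones mimics bias inherited from unbalanced training data.
    base = 0.62 if face["tone"] == "light" else 0.58
    return random.gauss(base, 0.05)

def pick_crop(faces):
    # The cropper simply keeps the face the model rates most "salient".
    return max(faces, key=saliency_score)

def audit(trials=10_000):
    # Paired audit: each image contains one light- and one dark-skinned face.
    # An unbiased cropper should pick each roughly half the time.
    light_wins = 0
    for _ in range(trials):
        faces = [{"tone": "light"}, {"tone": "dark"}]
        if pick_crop(faces)["tone"] == "light":
            light_wins += 1
    return light_wins / trials

if __name__ == "__main__":
    print(f"Light face chosen in {audit():.0%} of paired images")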

In 2020, Facebook announced a new “inclusive product council,” along with a new equity team at Instagram, that would examine, among other things, whether its algorithms exhibited racial bias. Often, when a company investigates itself, it comes up with weak explanations and even weaker fixes. Or, like Google, it fires the prominent AI ethics researcher (Timnit Gebru) it hired to fix the problem, presumably when it becomes clear that the cost of fixing it will be astronomical.

Many of us watch, day after day, as obvious racial slurs or references to images meant to be hurtful appear on our screens. Some are references to facial features or hairstyles. Some are plainly misogynistic. If Facebook can find and punish a user for a sexual reference in a post from five years ago, it should be able to catch obviously racist behavior in its algorithms. Facebook, which delivers content to users based on their past browsing and viewing habits, sometimes asks people whether they would like to keep seeing posts under related categories. Messages like the “primates” one could be widespread.
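
For readers wondering how a single bad label becomes a message like that, here is a simplified sketch of the general pattern. It is an assumption about how such pipelines are typically wired, not Facebook’s code: a vision model tags a video with a topic, and the recommendation layer reuses that tag verbatim when asking whether to keep seeing similar content, so one misclassification turns directly into a prompt shown to real people.

```python
# A minimal sketch of the assumed pattern, not Facebook's actual system.
# classify_video() stands in for an image-recognition model; here a cooking
# clip is mislabeled as "sports" to show how the error propagates unchecked.

def classify_video(video_id: str) -> str:
    # Hypothetical classifier output; nothing downstream questions the label.
    labels = {"clip_001": "sports"}  # should have been "cooking"
    return labels.get(video_id, "unknown")

def follow_up_prompt(video_id: str) -> str:
    # The user-facing prompt is built directly from the model's label.
    return f"Keep seeing videos about {classify_video(video_id)}?"

print(follow_up_prompt("clip_001"))  # -> "Keep seeing videos about sports?"
```

The same path, with a far uglier label, is how a prompt about “primates” could reach users who had just watched a video of black men.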

Some companies are trying to right the wrongs their programs have caused, IBM among them.

In his letter to Congress, IBM chief executive Arvind Krishna said the “fight against racism is as urgent as ever,” setting out three areas where the firm wanted to work: police reform, responsible use of technology, and broadening skills and educational opportunities.

“IBM firmly opposes and will not condone the uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms,” he wrote.

It is clear that these racist gaffes cannot be tolerated. Nor can racist facial recognition.

Another thing that is clear: perhaps Section 230, the rule that protects tech companies from civil damages, should be looked at more closely. Facebook, Google, and other companies need to be held accountable for correcting these problems, and until that happens, such failures will continue to occur.

read more at bbc.com