Algorithms that touch the lives of millions are rife with biases that hurt women and minorities.

5 Major Tech Companies Paying Bounties to Hunters of Algorithmic Biases

It looks like AI has created a whole new genre of employment opportunities without even knowing it. There is now a position called “Bias Bounty Hunter.”

A just-released North American Predictions 2022 report from Forrester says five major tech companies are already funding bounty hunters who find problems or leaks in algorithmic systems, whether those systems belong to other companies or to their own.

In late July, Twitter launched the first major bias bounty and awarded $3,500 to a student who proved that its image-cropping algorithm favors lighter, slimmer and younger faces. Chalk one up for the good guys. We found news about the Forrester report in an article written by Tom Ryan. Regarding that first bias bounty paid out, the article quotes an executive from Twitter:

“Finding bias in machine learning (ML) models is difficult, and sometimes, companies find out about unintended ethical harms once they’ve already reached the public,” wrote Rumman Chowdhury, director of Twitter META, in a blog entry. “We want to change that.”

Coders have been unearthing biases in AI-driven algorithms on social media since 2015, when a programmer called out a search feature of the Google Photos app that mistakenly tagged photos of Black people as gorillas.

Forrester’s report says it expects Microsoft and Google to be the next big tech companies to implement bias bounties. And it’s just as clear that many other companies in brick-and-mortar industries, such as banking and healthcare, will have to follow.

Human Bias Creates Biased Algorithms

These bias bugs have a big impact on what some people see on the screens in front of them. A bias can affect whether you are hired or given credit. The damage to individuals may be inflicted digitally, but it can transform their lives in very tangible ways.

Forrester wrote in its predictions report:

“AI professionals should consider using bias bounties as a canary in the coal mine for when incomplete data or existing inequity may lead to discriminatory outcomes from AI systems. With trust high on the agenda of stakeholders, organizations will have to drive decision-making based on levers of trust such as accountability and integrity, making bias elimination ever more critical.”
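The "discriminatory outcomes" Forrester describes can often be surfaced with a simple check: compare a model's rate of favorable decisions across demographic groups. The sketch below is a minimal, hypothetical illustration of that idea (a demographic-parity check); the data, group names and threshold are all made up for the example and are not from any real system.

```python
# Minimal sketch of the kind of check a bias bounty hunter might run:
# compare a model's positive-outcome rate across demographic groups.
# All decisions, group labels and the 0.2 threshold are hypothetical.

def positive_rate(decisions):
    """Fraction of favorable (1) outcomes in a list of 0/1 decisions."""
    return sum(decisions) / len(decisions)

def parity_gap(decisions_by_group):
    """Largest difference in favorable-outcome rate between groups."""
    rates = {g: positive_rate(d) for g, d in decisions_by_group.items()}
    return max(rates.values()) - min(rates.values()), rates

# Hypothetical loan-approval decisions from some model, split by group.
decisions = {
    "group_a": [1, 1, 1, 0, 1, 1, 0, 1],  # 6 of 8 approved
    "group_b": [1, 0, 0, 0, 1, 0, 0, 1],  # 3 of 8 approved
}

gap, rates = parity_gap(decisions)
print(rates)      # per-group approval rates
if gap > 0.2:     # arbitrary illustrative threshold
    print(f"Potential bias: approval-rate gap of {gap:.0%}")
```

A real audit would, of course, use far larger samples, statistical significance tests, and more than one fairness metric, since different metrics (parity, equalized odds, calibration) can disagree.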

The position of bias bounty hunter is long overdue and could draw enough interest from hackers to have an immediate impact. Right now, some people are offered one kind of life and denied another by biased algorithms that do the coder's dirty work while enriching companies like Facebook.

Ryan’s article included several interesting links to stories that help fill in the details about bias in the algorithms that affect us, what the article calls the real-time “decision-makers.”

We have included two of them for your consideration, or “decision-making.”

The entire Ryan article is also linked.