The Washington, D.C. headquarters building of the World Bank is pictured here. The bank uses an algorithm to distribute financial aid to the poor in Middle Eastern and African countries. (Source: Wikimedia Commons)

Aid System Used Faulty Algorithm to Choose Impoverished Recipients of Financial Assistance

An algorithm failed at its job when its faulty logic excluded the families in Jordan most in need of financial assistance, according to a story on MIT’s technologyreview.com.

In administering an aid program for the poorest, the World Bank relied on an algorithm that, Human Rights Watch discovered, not only excluded many deserving families but also rewarded others with greater means. The World Bank focuses on poverty reduction and improving living standards worldwide by providing low-interest loans, interest-free credit, and grants to developing countries for education, health, infrastructure, and communications, among other things.

The program, known as Takaful, uses an algorithmic system that cost over $1 billion to rank families applying for aid from least poor to poorest, based on a secret calculus that assigns weights to 57 socioeconomic indicators. The World Bank is using it in Jordan and eight other countries in the Middle East and Africa.
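The exact formula is not public, but the description suggests a proxy-means-test style weighted sum over household indicators. The sketch below is a minimal illustration of that general idea only; the indicator names, weights, and scoring direction are all invented for illustration and are not drawn from the Takaful system.

```python
# Hypothetical sketch of a weighted-indicator ranking, similar in spirit to a
# proxy-means test. The real Takaful indicators and weights are secret; every
# name and number below is invented for illustration only.

from typing import Dict, List

# Invented weights for a few of the (reportedly 57) socioeconomic indicators.
# Higher scores are treated here as "less poor".
WEIGHTS: Dict[str, float] = {
    "monthly_electricity_kwh": 0.8,
    "monthly_water_m3": 0.5,
    "owns_car": 4.0,
    "household_size": -1.5,  # larger households score as poorer
}

def wealth_score(indicators: Dict[str, float]) -> float:
    """Weighted sum of indicator values; a higher score means ranked as less poor."""
    return sum(WEIGHTS[name] * value
               for name, value in indicators.items() if name in WEIGHTS)

def rank_applicants(applicants: Dict[str, Dict[str, float]]) -> List[str]:
    """Return applicant IDs ordered from least poor to poorest, mirroring the
    ranking direction described in the article."""
    return sorted(applicants, key=lambda a: wealth_score(applicants[a]), reverse=True)

if __name__ == "__main__":
    applicants = {
        "family_a": {"monthly_electricity_kwh": 300, "monthly_water_m3": 20,
                     "owns_car": 0, "household_size": 3},
        "family_b": {"monthly_electricity_kwh": 120, "monthly_water_m3": 8,
                     "owns_car": 1, "household_size": 7},
    }
    # family_b's old car still raises its score, showing how a crude proxy
    # can make a struggling household look less poor than it is.
    print(rank_applicants(applicants))
```

The point of the sketch is simply that a fixed weighted sum cannot see context: the same "owns a car" flag counts against a family whether the car is a luxury or the only way to get to work.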

“Applicants are asked how much water and electricity they consume, for example, as two of the indicators that feed into the ranking system. The report’s authors conclude that these are not necessarily reliable indicators of poverty. Some families interviewed believed the fact that they owned a car affected their ranking, even if the car was old and necessary for transportation to work.” The report adds:

“This veneer of statistical objectivity masks a more complicated reality: the economic pressures that people endure and the ways they struggle to get by are frequently invisible to the algorithm.”

A human reviewer might have caught the problems inherent in judging poverty by details that may not be significant; the algorithm did not.

Not surprisingly, the algorithm is also sexist. It “reinforces existing gender-based discrimination by relying on sexist legal codes.” Cash assistance is provided to Jordanian citizens only, and although Jordanian men who marry a noncitizen can pass citizenship on to their spouse, Jordanian women cannot. Jordanian women who marry non-Jordanians are therefore less likely to receive aid, because they end up reporting a smaller household size.

Human Rights Watch interviewed 70 people to identify the flaws in the system. Amos Toh, an AI and human rights researcher for Human Rights Watch and an author of the report, says government programs that use algorithmic decision-making should be transparent and vetted by disinterested parties.

Meredith Broussard, professor at NYU and author of “More Than a Glitch: Confronting Race, Gender, and Ability Bias in Tech,” commented: “It seems like this is yet another example of a bad design that ends up restricting access to funds for people who need them the most.”

Read more at technologyreview.com