Researcher Advises Tweaking of Data, Gender Input Reviews

Algorithms can’t always be counted on to do the right thing. Society’s problems are often reflected when AI parses data already skewed by human discrimination. The latest case in point is the Apple Card, a credit card underwritten by Goldman Sachs, which quickly drew criticism for its treatment of women applicants.

According to a story in The New York Times, New York state regulators opened an investigation into the Apple Card after users complained that women were given lower credit limits despite having better credit ratings than their husbands. Linda Lacewell, superintendent of the New York Department of Financial Services, confirmed that her agency would investigate.

Software developer David Heinemeier Hansson said he received 20 times the credit limit of his wife, who had a better credit score and other factors in her favor. Steve Wozniak, who co-founded Apple with Steve Jobs, said he got 10 times the limit of his wife, even though they share all of their financial accounts.

The Apple Card launched in August as a partnership between the tech giant’s Apple Pay program and Goldman Sachs’s new consumer-focused retail effort. The companies said the card would be available to consumers who might otherwise struggle to access credit, including those with no credit history or below-average credit scores, which makes the apparent treatment of women as less creditworthy than men all the more damaging.

A story in the Observer outlines how Goldman Sachs is attempting “damage control” for the card’s reputation after the outcry, including from women who complained to the NYDFS.

“What Goldman Sachs failed to take into account is that machine learning algorithms excel at finding latent features in data,” explained Lux Research analyst Cole McCollum. “These are the features that aren’t directly used in training a machine learning model, but are inferred from other features that are.”

McCollum said that even if gender wasn’t explicitly included as a deciding factor, the algorithm, drawing on patterns in the data, made its own assumptions about women’s credit limits. The only solution is to test for and program against the potential biases in the data.
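To illustrate the latent-feature problem McCollum describes, here is a minimal, hedged sketch of a proxy audit: if a protected attribute such as gender can be reconstructed from the “neutral” inputs a credit model uses, those inputs can carry gender-correlated bias into its decisions. The data, feature names, and model below are synthetic assumptions for illustration only; they are not Apple Card or Goldman Sachs code.

```python
# Synthetic proxy audit (illustrative assumptions only): gender is never fed to
# the credit model, but we check whether it can be predicted from the remaining
# features. If it can, those features act as latent proxies for gender.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 5_000

# Recorded only for the audit, never used by the credit model itself.
gender = rng.integers(0, 2, size=n)

# Features a credit model might legitimately use, but which historically
# correlate with gender (e.g., income shaped by pay gaps, spending mix).
income = rng.normal(70_000, 15_000, size=n) - 8_000 * gender
retail_spend_share = np.clip(rng.normal(0.30, 0.10, size=n) + 0.15 * gender, 0, 1)
credit_history_years = rng.normal(12, 4, size=n)

X = np.column_stack([income, retail_spend_share, credit_history_years])

# Audit: can the "neutral" features reconstruct gender?
X_train, X_test, g_train, g_test = train_test_split(X, gender, random_state=0)
probe = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1_000))
probe.fit(X_train, g_train)
accuracy = probe.score(X_test, g_test)

print(f"Gender recovered from 'neutral' features with {accuracy:.0%} accuracy")
# Accuracy well above 50% means latent proxies exist: a model trained on these
# features can reproduce gender-correlated patterns without ever seeing gender.
```

If the probe does much better than chance, dropping the gender column is not enough; the correlated features themselves need to be examined and the model’s outputs tested for disparate impact.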

“This incident should serve as a warning for companies to invest more in algorithm interpretability and testing in addition to executive education around the subtle ways that bias can creep into AI and machine learning projects,” McCollum concluded.
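One simple form of the testing McCollum recommends is an outcome audit before launch: compare the limits a model assigns to otherwise comparable applicants across groups. The sketch below is a hypothetical illustration with made-up numbers and an assumed 10 percent tolerance, not a description of any bank’s actual review process.

```python
# Hypothetical pre-launch outcome audit (illustrative assumptions only):
# compare model-assigned credit limits for matched applicants who differ
# only in a protected attribute.
import numpy as np

rng = np.random.default_rng(1)

# Stand-ins for limits the model assigned to matched applicant pairs
# (same income, credit score, and debt load) from two groups.
limits_group_a = rng.normal(12_000, 2_000, size=1_000)
limits_group_b = rng.normal(9_500, 2_000, size=1_000)

ratio = limits_group_a.mean() / limits_group_b.mean()
print(f"Mean limit ratio between matched groups: {ratio:.2f}x")

# Assumed tolerance: flag anything more than 10% apart for human review.
TOLERANCE = 1.10
if ratio > TOLERANCE or ratio < 1 / TOLERANCE:
    print("Disparity exceeds tolerance; review features, training data, and model.")
```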

Democratic presidential primary candidate Elizabeth Warren said Goldman Sachs’ response to the discrimination complaints was inadequate.

“Let’s just tell every woman in America, ‘You might have been discriminated against, on an unknown algorithm, it’s on you to telephone Goldman Sachs and tell them to straighten it out,’” Warren said in an interview with Bloomberg. “Sorry guys, that’s not how it works.”

A Goldman Sachs spokesperson responded to Warren’s comments in a story on CNBC.com:

“We have not and never will make decisions based on factors like gender,” Carey Halio, Goldman’s retail bank CEO, said in a statement. “In fact, we do not know your gender or marital status during the Apple Card application process.”

The bank has pledged to address the problem and is asking customers to contact it if they’re unhappy with their credit limits.

According to a story on Forbes.com, other companies are already addressing algorithms’ built-in biases against women: Aire and Credit Kudos are designing more transparent credit-scoring systems, and the “neobank” Monese is working to make financial products easier to access for people without a credit history.