Apple Card row: why it doesn't pay to have AI with a gender bias

Artificial intelligence can only improve our lives if its algorithms are free of prejudices based on gender and race

Artificial intelligence is often promoted as a tool to improve the way we go about our lives. It is sold on the notion that the technical wizardry behind it is free of the prejudices that plague human beings. However, we are quickly learning that in some cases, AI can reinforce our biases and embed discrimination.

This week, for example, an investigation was launched into claims that Apple Card discriminates against women, after tech entrepreneur David Heinemeier Hansson tweeted that he had received a credit limit 20 times higher than his wife’s.

In a long thread, he argued that the algorithm used by the credit card application process was flawed, showing clear bias against female applicants. He even checked their respective credit scores via a financial service, as Apple Card recommended, and said his wife scored better than he did. Apple co-founder Steve Wozniak then chimed in to say he and his wife had experienced the same disparity.

The Apple Card: the New York Department of Financial Services is investigating Goldman Sachs for possible sex discrimination in the way it sets credit limits. Tony Avelar / AP

We are well into the 21st century, but this alleged discrimination against women feels like a throwback to the 1950s and 1960s, when credit cards first became popular. During that era, banks in the US and elsewhere could refuse to issue a credit card to an unmarried woman and, if she was married, could require her husband to co-sign. To add insult to injury, up to half of her income could be discounted when calculating her credit limit.

Restricting women’s ability to secure finance – whether by denying them financial products or equal pay – is a powerful means of controlling their economic freedom.

From a legal perspective, equality of access to credit was addressed in the US in 1974, when Congress passed the Equal Credit Opportunity Act, which made it illegal to discriminate against a credit applicant on the basis of gender, race, religion or national origin.

Despite this law, this week’s Apple Card incident highlights how little we have learned when it comes to enabling women’s access to finance. According to a 2012 study by the Financial Industry Regulatory Authority in the US, women pay, on average, half a percentage point more interest than men.

It is not just women who apparently suffer from bias embedded in algorithms. Last month, New York's insurance regulator launched an investigation into UnitedHealth Group amid claims that one of its computerised tools favoured healthier white patients over sicker black patients.

Defenders of such systems say algorithms are neutral – but they are created by human beings who have biases and use data sets that reflect historic prejudices. The conclusions drawn by algorithms therefore risk being informed by bias.
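
To make that mechanism concrete, here is a minimal sketch in Python with entirely invented numbers – a hypothetical illustration, not Apple’s or Goldman Sachs’s actual model. A simple model is trained on past credit limits set by biased underwriters, with the gender column deliberately excluded; because a remaining feature correlates with gender, the supposedly neutral model still assigns women lower limits.

```python
# A hypothetical illustration of bias inherited from training data.
# All numbers are invented; this is not any real lender's model.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 10_000

# Two groups with identical income distributions.
is_woman = rng.integers(0, 2, n)
income = rng.normal(60_000, 15_000, n)

# A proxy feature correlated with gender, e.g. years of credit history
# (shorter, on average, for women who once held cards in a spouse's name).
credit_history = rng.normal(10, 3, n) - 4 * is_woman

# Past limits set by biased human underwriters: women's income was
# discounted by half, echoing 1960s practice.
past_limit = 0.2 * income * (1 - 0.5 * is_woman) + 500 * credit_history

# Train on those past decisions WITHOUT the gender column.
X = np.column_stack([income, credit_history])
model = LinearRegression().fit(X, past_limit)

# The "neutral" model still assigns women lower limits on average,
# because the proxy feature carries the historical prejudice forward.
pred = model.predict(X)
print(f"mean predicted limit, men:   {pred[is_woman == 0].mean():,.0f}")
print(f"mean predicted limit, women: {pred[is_woman == 1].mean():,.0f}")
```

The point of the sketch is that removing the protected attribute is not enough: the prejudice survives through proxy features, which is why auditing a model’s outputs across groups matters more than inspecting its inputs.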

Take the Pew Research Center study of gender representation in typical Google image searches. Researchers compared the results with official US government figures and found that Google significantly under-represented women as managers and chief executives. Women make up 28 per cent of chief executives, yet they appeared in only 10 per cent of the top 100 search results. A search for “general manager” returned women in just 15 per cent of results, despite their accounting for 34 per cent of that workforce.

The clear lesson from this is that artificial intelligence and the algorithms and technology upon which it is based cannot be assumed to be bias-free. Significant and rigorous steps must be taken to mitigate such biases.
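
What might such steps look like? One basic check, sketched below with made-up data rather than any regulator’s actual methodology, is to audit a model’s decisions for disparate outcomes across groups – here via a simple demographic-parity gap, one of several standard fairness metrics.

```python
# A minimal bias audit: compare favourable-outcome rates across groups.
# The data here are invented for illustration.
import numpy as np

def parity_gap(decisions: np.ndarray, group: np.ndarray) -> float:
    """Difference in favourable-outcome rates between two groups
    (0 means parity; a larger absolute value means a larger disparity)."""
    return decisions[group == 0].mean() - decisions[group == 1].mean()

# Hypothetical decisions: 1 = credit approved, 0 = declined.
approved = np.array([1, 1, 0, 1, 1, 0, 0, 1, 0, 0])
is_woman = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1])

gap = parity_gap(approved, is_woman)
print(f"approval-rate gap (men minus women): {gap:.0%}")  # 60% in this toy data
```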

Goldman Sachs, the bank that manages Apple Card, has denied discrimination and said the algorithm was working as intended, based on criteria such as “an individual’s income and an individual’s creditworthiness, which includes factors like personal credit scores”. But it is worth remembering that the way creditworthiness is often calculated is the modern-day equivalent of 1960s credit scoring, which discounted women’s wages, independence and reliability.

"Success in creating AI would be the biggest event in human history,” the late Stephen Hawking once said. “Unfortunately, it might also be the last, unless we learn how to avoid the risks." One of the biggest risks is locking discrimination of the past into our futures. That must be avoided at all costs.

Shelina Janmohamed is the author of Love in a Headscarf and Generation M: Young Muslims Changing the World