Abu Dhabi, UAE, Sunday 15 December 2019

How women ended up with a much smaller bite of Apple's credit card

'Gender-blind' machine algorithms actually create biases and penalise women more acutely, says Nima Abu Wardeh

Illustration by Gary Clement

Bias is life’s algorithm. The billions spent on addressing this, in an attempt to change people’s mindsets and behaviours, will not do away with bias. Because we’re human, the things we do, create and touch are inherently biased too.

That includes credit lines, or even being allowed to open a bank account in some countries. Apple fell foul of this in spectacular fashion this month. Instead of making news for the UK release of its first credit card, initially launched in the US in August, it drew attention for the huge difference between the credit lines awarded to men and those awarded to women.

The controversy began with a series of tweets from David Heinemeier Hansson (DHH), a high-profile tech entrepreneur, that stated the card was “sexist” because it gave him 20 times more credit than his wife, even though they file joint tax returns and her credit score is higher. That led to a barrage of comments and complaints on social media and Apple co-founder Steve Wozniak’s public pondering of the machinations of his company’s credit card.

Wozniak's tweet reply that he gets ten times the credit limit of his wife on the Apple card — even though the pair have no separate cards, accounts or assets — was the icing on the bias-cake.

In a subsequent interview, Wozniak said "algos obviously have flaws" and called on government to get involved with regulation. "These sort of unfairnesses bother me and go against the principle of truth," he said.

Goldman Sachs, the issuing bank for the Apple Card, said in a statement: "In all cases, we have not and will not make decisions based on factors like gender."

This is the heart of the issue: the Apple card doesn’t 'see' gender — and thereby creates a problem, penalising women. The way its algorithm determines credit lines makes the risk of bias more acute.

The irony is that an act passed in the US in 1974 to protect women's rights is the actual reason for this: the Equal Credit Opportunity Act. Before it, banks required single, widowed or divorced women to bring a man along to co-sign any credit application, regardless of their income. The value of a woman's wages, when considering how much credit to grant, was discounted by as much as 50 per cent.

The act made it unlawful for a creditor to discriminate against an applicant, with respect to any aspect of a credit transaction, on the basis of race, colour, religion, national origin, sex, marital status, age or receipt of public assistance. Great news, right? It turns out this isn't the case when it comes to machine-learning algorithms.

Goldman followed the rules of the act. When questioned over the Apple credit card ‘behaviour’, Goldman stated that the algorithm doesn’t use gender as an input. How could the bank discriminate if it doesn't know which customers are women and which are men?

Here's how: a gender-blind algorithm can still end up biased against women as long as it draws on any input that happens to correlate with gender. Knowing what products a person buys, where they shop or how they live can lead to bias, because these inputs indicate someone's gender. The data that betrays gender then betrays the person, as their information is used against them: the algorithm does its work and decides not to offer them the same credit line as their husbands, as we saw in the case of Apple.
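A hypothetical sketch makes the mechanism concrete. The scoring rule below never sees the gender column; it looks only at a single invented proxy feature (a shopping-category flag) that happens to track gender most of the time. Every name and number here is made up for illustration, not drawn from Goldman's actual model.

```python
import random

random.seed(0)

# Simulate 1,000 applicants. Each has a gender (which the scoring
# rule will NEVER see) and a hypothetical shopping-category proxy
# that correlates with gender 90 per cent of the time.
applicants = []
for _ in range(1000):
    gender = random.choice(["F", "M"])
    if random.random() < 0.9:
        proxy = gender                      # proxy matches gender
    else:
        proxy = "M" if gender == "F" else "F"
    applicants.append({"gender": gender, "proxy": proxy})

def credit_multiplier(applicant):
    """A 'gender-blind' rule: it reads only the proxy feature."""
    return 1.0 if applicant["proxy"] == "M" else 0.5

def average_multiplier(gender):
    """Average credit multiplier actually received by each gender."""
    group = [a for a in applicants if a["gender"] == gender]
    return sum(credit_multiplier(a) for a in group) / len(group)

print(f"average multiplier, men:   {average_multiplier('M'):.2f}")
print(f"average multiplier, women: {average_multiplier('F'):.2f}")
```

Even though gender is dropped from the inputs, men end up with roughly twice the average multiplier of women, purely because the proxy carries the gender signal into the decision.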

There's a saying in business that what isn't measured can't be managed. The fact that the Equal Credit Opportunity Act prohibits financial businesses from using information such as gender or race in algorithmic decisions may make the bias problem worse, because businesses don't collect this information in the first place and so cannot test their algorithms against it. That leaves businesses reacting to the algorithm's outputs after the damage is done.

For example, Amazon had to pull an algorithm used in hiring due to gender bias. IBM and Microsoft were embarrassed by facial recognition algorithms that were better at recognising men than women, and white people than those of other races.

If calling out biased products and people, plus training and workshops, isn't going to change things any time soon, what can we do about it? I believe the key is giving people on the receiving end of bias the tools to speak up, stand up and get their case across. They will drive the tweaking and building of systems that address inequality. In the meantime, if you believe you're not being treated fairly with regard to your finances, speak up.

The outcry has led to New York regulators opening a discrimination investigation into Goldman’s credit card practices. Good luck to them and to the women of the world who have no choice but to show up and deal with each new bias coming their way. If only liability were biased too — but it isn’t. Funny that.

Nima Abu Wardeh is a broadcast journalist, columnist, blogger and founder of S.H.E. Strategy. Share her journey on finding-nima.com

Updated: November 21, 2019 11:16 AM
