
Even Steve Wozniak Is Calling Out Apple's Sexist Credit Card Algorithm

You know it's bad when Steve gets involved.

Here we are again, writing about a piece of technology that has wound up with a sexist algorithm. This time it’s a credit card designed by Apple giving men and women very different credit limits.

The Apple Card was designed by Apple, but it’s run by Goldman Sachs, a massive American bank based in New York. New York’s Department of Financial Services has been in touch with the bank to work out wtf happened, but so far there hasn’t been much progress.

Goldman Sachs has denied the allegations. And this isn’t the first time the financial giant has made headlines for the wrong reasons: it’s been caught up in what’s been called one of the biggest scandals in financial history.

It all kicked off when software developer David Heinemeier Hansson tweeted that his wife had been offered a credit limit 20 times lower than his, despite the couple filing taxes together, her paying off her card in full, and them eventually paying for a service that confirmed her credit score is actually higher than his. Basically, it was pure sexism.

The problem was quickly fixed for David’s wife, but others started coming forward with their own similar stories – including Steve Wozniak, the guy who co-founded Apple in the first place. Apparently the card had given Steve a credit limit that was 10 times higher than the one his wife was given.

On top of that, Wozniak says he’d already tried to get the problem sorted, but that it’s “hard to get to a human for a correction though. It’s big tech in 2019.”

The Apple Card isn’t available in Australia.

But when will we learn that algorithms need some serious monitoring? We’ve had the AI that tags you with racist classifications, the sexist recruitment AI, and the chatbot that went off the deep end after less than a day on Twitter. The Apple Card’s sexist credit limits are just the latest in a long line of bad algorithms, and almost certainly won’t be the last.
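If you’re wondering what “serious monitoring” could actually look like, here’s a tiny, hypothetical sketch in Python. The records, field names and numbers below are all made up for illustration, and real fairness audits are a lot more involved than comparing medians, but the basic idea is the same: look at what the algorithm actually hands out to different groups.

```python
# Hypothetical audit: compare the median credit limit the algorithm
# approves for each gender. Every record below is invented.
from statistics import median

decisions = [
    {"gender": "F", "score": 810, "limit": 3_000},
    {"gender": "M", "score": 790, "limit": 30_000},
    {"gender": "F", "score": 760, "limit": 4_500},
    {"gender": "M", "score": 755, "limit": 28_000},
    {"gender": "F", "score": 802, "limit": 5_000},
    {"gender": "M", "score": 800, "limit": 27_000},
]

def median_limit_by_group(rows, key="gender"):
    """Median approved limit for each group in the decision log."""
    groups = {}
    for row in rows:
        groups.setdefault(row[key], []).append(row["limit"])
    return {group: median(limits) for group, limits in groups.items()}

by_group = median_limit_by_group(decisions)
gap = max(by_group.values()) / min(by_group.values())
print(by_group)            # {'F': 4500, 'M': 28000}
print(f"gap: {gap:.1f}x")  # a 6x gap between similar credit scores is a red flag
```

A check like this wouldn’t tell you *why* the gap exists, but it would at least flag it before thousands of customers (and Steve Wozniak) do it for you on Twitter.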

This is what happens when the teams that build and monitor tech like this aren’t diverse enough to spot the problems. See you all in a couple of months when another algorithm threatens to throw someone off a roof or something equally horrific.