The almighty algorithm is seemingly to blame for discriminating against women, giving them lower credit limits when they apply for the Apple Card.
The matter was brought to the general public’s attention by Ruby on Rails creator David Heinemeier Hansson. The developer tweeted that the card was sexist because, despite having a worse credit score than his wife, he was granted a credit limit 20 times higher than hers.
Upon further investigation, Hansson’s wife was told by an Apple representative that Apple wasn’t discriminating and that the lower credit limit had been decided by an algorithm.
The Twitter thread quickly went viral and caught the attention of other Apple Card users who had experienced the same issue, including Apple co-founder Steve Wozniak.
The same thing happened to us. I got 10x the credit limit. We have no separate bank or credit card accounts or any separate assets. Hard to get to a human for a correction though. It's big tech in 2019.
— Steve Wozniak (@stevewoz) November 10, 2019
Unfortunately for Apple, Linda Lacewell, superintendent of the New York State Department of Financial Services, also saw the Twitter thread.
“On Saturday morning, I read a Twitter thread from an Apple Card user – tech entrepreneur David Heinemeier Hansson – detailing how his card’s credit limit was considerably higher – twenty times – than that of his wife, despite his wife having a higher credit score. I responded, announcing that the New York State Department of Financial Services (DFS) would examine whether the algorithm used to make these credit limit decisions violates state laws that prohibit discrimination on the basis of sex,” Lacewell wrote on Medium.
The superintendent goes on to say that the situation is made worse by the fact that consumers have very little insight into how their Apple Card credit limits are decided.
Of course, Apple is not the firm making decisions on creditworthiness; the bank backing the card is, and that bank is Goldman Sachs.
The bank addressed the matter in a tweet, stating that it does not make decisions based on gender or other such factors.
“As with any other individual credit card, your application is evaluated independently. We look at an individual’s income and an individual’s creditworthiness, which includes factors like personal credit scores, how much debt you have, and how that debt has been managed. Based on these factors, it is possible for two family members to receive significantly different credit decisions,” said Goldman Sachs spokesperson, Andrew Williams.
Of course, the other explanation for wildly different credit limits is a biased algorithm.
That’s not to say that the developers of the algorithm intended for it to be discriminatory, but rather that certain factors were not accounted for. An algorithm can produce discriminatory outcomes even when gender is never an explicit input, by leaning on other variables that correlate with it.
What stuns us is the Apple representative’s insistence on blaming a faceless algorithm right off the bat. We understand a human isn’t responsible for the decision, but maybe that’s the problem.
While algorithms do make work easier and can reach decisions faster, if those systems aren’t tuned correctly, they will spit out results that cause more headaches than a human ever would.
[Via – The Verge]