Algorithms were meant to save the world — or at least that’s what companies wanted us to believe. With the ability to make decisions in real-time, it seemed every industry — from banking, insurance, and real estate to health care, IT, and energy — was primed for algorithmic innovation.
But recent claims of discrimination surrounding the Apple Card show we are still far from that utopian combination of high profits and happy customers. According to reports from Bloomberg, Apple and Goldman Sachs are under fire for claims they offer significantly higher credit limits to men than women.
In the viral tweet thread last Thursday that started it all, tech entrepreneur David Heinemeier Hansson called the Apple Card a “f—ing sexist program,” saying he received a credit limit 20 times higher than his wife, Jamie Heinemeier Hansson, despite their joint tax filings and the fact that his wife has a higher credit score.
According to the tweets, the Hanssons called Apple Card customer service twice to resolve the issue, but the representatives told them the proprietary technology that determined credit limits was not discriminatory and refused to increase Jamie Heinemeier Hansson’s available credit.
Apple co-founder Steve Wozniak also weighed in, tweeting that although he and his long-time wife share all financial assets and accounts, he received a credit limit 10 times higher than hers.
Goldman Sachs, Apple’s credit issuer, denies that gender was a determining factor. “As with any other individual credit card, your application is evaluated independently,” a Goldman Sachs spokesperson wrote in a statement. “We look at an individual’s income and an individual’s creditworthiness, which includes factors like personal credit scores, how much personal debt you have, and how that debt has been managed. Based on these individual factors, it is possible for two family members to receive significantly different credit decisions.”
In response to Hansson’s tweets, the New York Department of Financial Services (NYDFS) opened a probe to examine whether or not Goldman Sachs’ algorithm discriminates against women when doling out credit limits. Under state law, gender, along with age, creed, race, color, sexual orientation, and national origin, is a protected class.
Reviews.com reached another Goldman Sachs spokesperson on Monday. We were unable to speak with them before publication, but they confirmed that Goldman will cooperate with the probe. We will update this story if Goldman provides additional comment. Apple did not respond to multiple requests for comment.
Linda A. Lacewell, the superintendent of NYDFS, writes that the black-box problem — in which the mechanics of artificial intelligence systems are shielded from the public — “has led to consumers [having] little visibility into how a decision is made or why they have been rejected.”
“Algorithms are not only nonpublic, they are actually treated as proprietary trade secrets by many companies,” Rohit Chopra, a commissioner from the Federal Trade Commission, said in October. “To make matters worse, machine learning means that algorithms can evolve in real time with no paper trail on the data, inputs, or equations used to develop a prediction.”
Such opacity breeds distrust of tech companies among consumers and calls into question the systems that govern people’s livelihoods. Just what goes into an algorithm, anyway?
It wasn’t so long ago that credit was a tool reserved for the privileged. Well into the 20th century, for example, many women could not get a credit card unless a husband, or another man, co-signed the application. Only in 1974, with the passage of the Equal Credit Opportunity Act, was credit discrimination on the basis of gender and other protected characteristics outlawed.
Sexism persists in today’s financial world, but the smoking gun is not always easy to find.
“Algorithmic bias isn’t as blatant as IF WOMAN THEN CREDIT * 0.1. It’s biased in its assumptions and its premise,” Hansson tweeted. “When both of those factors are shrouded in secret, it’s almost impossible to challenge.”
This is a central tension: companies are often unable or unwilling to explain the decision-making behind these complex models, yet they stand behind the results and deny culpability, Hansson said.
He cites his wife, currently a stay-at-home mother, as an example. “What percentage of ‘homemakers’ do you think are women? If they’re unable to rely on shared wealth, household income, or historical earning power to demonstrate ability to repay credit, they’re cut off.”
Similar situations have played out in the lending sector: A 2018 UC Berkeley study found that mortgage lenders charged higher interest rates on average to black and Latino borrowers. And in March, the U.S. Department of Housing and Urban Development charged Facebook with housing discrimination, alleging the platform allowed housing, job, and credit advertisers to exclude certain races from seeing ads, both explicitly within the Facebook interface and unintentionally through machine learning.
Even if institutions did not intend to discriminate against women, the outcome is the same. Algorithms are sets of rules based on assumptions — and when the assumptions are flawed, they can reflect and uphold structural inequalities baked into institutions.
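To see how a flawed assumption can produce a discriminatory outcome without any explicit rule about gender, consider a deliberately simplified, hypothetical credit-limit formula. Nothing here reflects Goldman Sachs’ actual model; the function, inputs, and weights are invented for illustration. The only assumption baked in is the one from Goldman’s own statement: that creditworthiness rests on individual income and individual credit history.

```python
# Hypothetical toy model for illustration only. It considers only
# *individual* income and credit score -- no gender field exists anywhere.
# Yet within a single-earner household that shares all wealth, the two
# spouses receive sharply different limits.

def credit_limit(individual_income: float, credit_score: int) -> float:
    """Toy rule: limit scales with personal income, adjusted by credit score."""
    score_factor = credit_score / 700  # 700 treated as a neutral baseline
    return individual_income * 0.2 * score_factor

# A couple who share all assets and file taxes jointly,
# but where one spouse is the sole earner.
earner = credit_limit(individual_income=250_000, credit_score=740)
homemaker = credit_limit(individual_income=0, credit_score=780)

print(earner)     # a substantial limit
print(homemaker)  # zero, despite the higher credit score
```

Because the model never looks at household income or shared wealth, the stay-at-home spouse is assigned no credit at all, even with the better score. A rule that is “neutral” on its face reproduces the inequality in who earns the paycheck.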
The finance and tech sectors have notorious diversity problems in the hiring, treatment, and retention of women and minorities, especially in higher-level positions. Is it any wonder that a self-learning computer program might deem a woman less creditworthy as a result?
However, just as discrimination can be perpetuated — through malice, ignorance, or apathy — it can be undone through concerted effort.
On Friday, Hansson tweeted that Apple Card customer service ended up increasing Jamie Heinemeier Hansson’s credit limit to match her husband’s, without requesting additional documentation or offering an explanation.
However, Jamie Heinemeier Hansson believes her individual resolution is not enough when weighed against the other stories of alleged credit discrimination that have come to light over the last several days. She wrote in a blog post, “This is not merely a story about sexism and credit algorithm blackboxes, but about how rich people nearly always get their way.”
Over the past several years, as companies like Google and Facebook have become further entrenched in everyday life, tech workers and academics have taken matters into their own hands. They’ve formed advocacy groups like the Algorithmic Justice League and research divisions like New York University’s AI Now Institute, and have written extensively on the deleterious effects of algorithms that reinforce institutional bias against marginalized groups, such as women, racial and ethnic minorities, and members of the LGBTQ+ community.
Cathy O’Neil, author of “Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy,” suggests companies adopt a multi-pronged solution that includes transparency into the inputs these algorithms consider, an appeals process for customers who believe they have been treated unfairly, third-party auditing, and publishing their credit standards so other companies can follow suit.
There’s also the matter of governance. In an interview with Bloomberg, Wozniak called for more Wall Street regulation to curb issues of discrimination. “Consumers can only be represented by the government because the big corporations only represent themselves,” he said.
In April, Rep. Yvette Clarke (D-NY), Sen. Cory Booker (D-NJ), and Sen. Ron Wyden (D-OR) introduced the Algorithmic Accountability Act of 2019 to the House and Senate, calling for transparency and oversight over “automated decision systems” and the factors that “may result in or contribute to inaccurate, unfair, biased, or discriminatory decisions impacting consumers.”
Additionally, two of the three top Democratic candidates for president, Sens. Bernie Sanders and Elizabeth Warren, have made strict regulation of the finance and tech sectors integral to their campaigns.
How these specific calls for change will play out remains to be seen, but the virality of Hansson’s tweets indicates at least some consumers are fed up with the status quo.
If you think you have experienced anything similar with a credit card, the New York State Department of Financial Services wants to hear from you. Per a statement, people are invited to send an email to firstname.lastname@example.org. Additionally, experts and members of the tech community are invited to reach out at email@example.com, in an effort to help create standards around new technologies in financial services.
Here are other options you have:
- Consult with your state’s attorney general or a consumer attorney
- Report your issue to the Better Business Bureau or the relevant federal agency: the Consumer Financial Protection Bureau, Federal Trade Commission, Federal Reserve Board, or Office of the Comptroller of the Currency