The New York Department of Financial Services is investigating allegations of gender discrimination against users of the Apple Card, which is administered by Goldman Sachs.
The allegations surfaced Saturday after tech entrepreneur David Heinemeier Hansson wrote that Apple Card offered him twenty times the credit limit it offered his wife, although they have shared assets and she has a higher credit score. Many other users voiced similar experiences, including Apple co-founder Steve Wozniak. Hansson wrote that after reaching out to Apple in an attempt to rectify the situation, he was told credit limits are determined by an algorithm.
The situation throws a shadow over the Apple Card, which launched as a partnership between the tech giant's Apple Pay program and a new retail consumer-focused effort at Goldman Sachs. The companies had boasted that the card would be accessible to consumers who might otherwise struggle to access credit, including those with no credit history or below-average credit scores. But these allegations highlight the risk of letting an algorithm, which has been shown in a number of contexts to be biased, make decisions like how much credit to extend to a user.
Linda Lacewell, superintendent of the Department of Financial Services, said Saturday the department would "take a look" into the allegations.
In a response to a request for comment on this story, an Apple spokesperson directed CNN Business to Goldman Sachs. Goldman Sachs did not immediately return a request for comment.
Hansson, in his Twitter thread, said the program's decision to offer his wife such a low credit limit was so striking that they feared her identity had been stolen and they paid to check her credit score, finding it was higher than his. Hansson is the founder and chief technology officer at web development firm Basecamp.
Wozniak, who co-founded Apple with Steve Jobs and continues to work for the company, said on Twitter he and his wife had a similar experience with the Apple Card. Though they share all of their assets and accounts, he was offered ten times the credit limit of his wife.
"Some say the blame is on Goldman Sachs but the way Apple is attached, they should share responsibility," Wozniak said.
AI-powered algorithms have demonstrated bias in the past, researchers have found. Facial recognition software has trouble identifying women of color, and software used to sentence criminals was found to be biased against black Americans.
That could be an issue as AI technology underpins everything from the speech recognition behind voice assistants like Siri to the technology that allows autonomous vehicles to drive themselves, and, in the case of the Apple Card, consumer credit assessments.
"Financial services companies are responsible for ensuring the algorithms they use do not unintentionally discriminate against protected groups," Lacewell said on Twitter.
And it's not the first time a tech company has been accused of facilitating discrimination in access to financial services. Facebook is facing a lawsuit that alleges that ads for financial services such as loans and insurance coverage on the platform were targeted away from women and older people over the past three years.