Algos & Ethics

Credit where credit is due

Danielle Smalls-Perkins
Dec 4, 2019

Alright, so

šŸµ The Tea

If you’ve heard the hype around Apple’s newest product offering, the Apple Card, then you’ve likely heard the mess about it too. David Heinemeier Hansson, CTO of Basecamp, recently made a very public complaint about the different credit limits offered to him and his partner after they applied for the card.

The long and short of that thread is here,

DHH (@dhh), Nov 7, 2019:

The @AppleCard is such a fucking sexist program. My wife and I filed joint tax returns, live in a community-property state, and have been married for a long time. Yet Apple’s black box algorithm thinks I deserve 20x the credit limit she does. No appeals work.

šŸ““ A quick definition — a ā€œblack box algorithmā€ here means a process that takes inputs and produces outputs/decisions, with no description of how the process actually works.
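
If it helps to make that concrete, here is a minimal toy sketch. Everything in it (the feature names, the weights, the scaling) is hypothetical and has nothing to do with Apple’s actual model; the point is only that, from the applicant’s side, all you ever see is input in, decision out:

```python
def black_box_credit_limit(applicant: dict) -> int:
    """Return a credit limit with no explanation of how it was derived."""
    # Inside the box: some opaque combination of inputs that the applicant
    # never gets to see. (The weights here are made up for illustration.)
    score = (
        0.5 * applicant["credit_score"]
        + 0.3 * applicant["income"] / 1000
        - 200 * applicant["utilization"]
    )
    return max(0, int(score)) * 10

# From the outside, all anyone observes is input -> decision:
limit = black_box_credit_limit(
    {"credit_score": 780, "income": 90_000, "utilization": 0.10}
)
print(limit)  # just a number, with no accompanying rationale or appeal path
```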

Apple representatives respond with encouragement to trust the black box here:

DHH (@dhh), Nov 8, 2019:

She spoke to two Apple reps. Both very nice, courteous people representing an utterly broken and reprehensible system. The first person was like ā€œI don’t know why, but I swear we’re not discriminating, IT’S JUST THE ALGORITHMā€. I shit you not. ā€œIT’S JUST THE ALGORITHM!ā€.

and then DHH reveals that her credit score was actually higher than his, here:

DHH (@dhh), Nov 8, 2019:

So obviously we both furiously signup for the fucking $25/month credit-check bullshit shakedown that is TransUnion. Maybe someone stole my wife’s identity? Even though we’ve verified there was nothing wrong previously. Guess what: HER CREDIT SCORE WAS HIGHER THAN MINE!!!

So, the idea that we must now fight the ā€œgod-boxā€ indisputable algorithms in addition to wading through the bureaucracy that is large-company decision troubleshooting is likely enough to push anyone over the edge. Anddddd, while the issue was ā€œresolvedā€ later, Apple’s support team didn’t initially do much to calm his fury.

One of the best summaries of the issue comes from Jordan Howard of everydAI. In the video, Jordan details the Apple Card issue and also highlights the long history of credit prejudice against women in America.

So, here are some items I’m happy to discuss at a later date:

  • whether interpretable models should be required in this type of decision-making

  • the need for diverse teams to vet the creation and usage of these algorithms before they are released

But all of the above is solely background context for what this issue of the newsletter is focused on.

šŸ‘‰ The Point

Shortly after DHH posted his thread, his wife, Jamie Heinemeier Hansson, took the opportunity to share her thoughts on the matter.

In just a few paragraphs, JHH expresses that while she is ā€œan extremely private person who does not post on social media,ā€ the urgency of speaking out on issues of fairness, equality, and justice brought her to the social table.

Personally, I appreciated her emphasis on the fact that others in a similar situation would not have the privilege of receiving the same response that she did.

JHH states,

This is not merely a story about sexism and credit algorithm blackboxes, but about how rich people nearly always get their way. Justice for another rich white woman is not justice at all.

She goes on to say,

Finally, I hear the frustration of women and minorities who have already been beating this drum loudly and publicly for years without this level of attention. I didn’t wish to be the subject matter that sparked these fires, but I’m glad they’re blazing.

and lastly,

On this topic David and I are thoroughly united, and I’m glad his large platform and my AppleCard issue have sparked a national conversation around institutional biases, blackbox algorithms, and the broken system that is our credit industry. This is not a story about me. Brilliant women are all over social media, using their voices to strive for a better way forward. Listen to them.

ā€œWho are these women??ā€, you whisper.

šŸ“š The Scholars

One of the great fortunes of writing this newsletter is the opportunity to highlight the efforts of those who have done this work for YEARS. JHH did not explicitly name these scholars or their efforts, so I’d like to take some time to do that now and in upcoming newsletter issues.

Dr. Ruha Benjamin is one such scholar and activist.

I just downloaded her most recent book, ā€œRace After Technologyā€, to my e-reader and haven’t been able to put it down since. Dr. Benjamin sets the stage by highlighting the cultural coding and discrimination embedded in technology that was promised to be, at the very least, unbiased and fair. She defines this failed expectation as ā€œthe New Jim Codeā€, described as,

The employment of new technologies that reflect and reproduce existing inequalities but are promoted and perceived as more objective or progressive than the discriminatory systems of a previous era.

Dr. Benjamin uses a classic study to illustrate the New Jim Code. The study found that, with all other qualifications being equal, certain names on resumes affected the number of call-backs from employers: job applicants with White-sounding names received 50% more call-backs than applicants with Black-sounding names. The researchers estimated the gap in call-backs as equivalent to 8 years of work experience.

8 years of work experience. Just for having a name that wasn’t as common as an employer thought it ought to be. And before you ask, yes, this included employers with equal opportunity clauses in their job descriptions.
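
To put that headline number in plain arithmetic (the baseline callback rate below is a hypothetical I picked for illustration; only the ~50% relative gap comes from the study):

```python
# Illustrative arithmetic only: the baseline callback rate is hypothetical;
# the ~50% relative gap is the study's headline finding.
black_name_rate = 0.06                    # say, 6 call-backs per 100 resumes
white_name_rate = black_name_rate * 1.5   # 50% more call-backs, same resume

relative_gap = white_name_rate / black_name_rate - 1
print(f"{black_name_rate:.0%} vs {white_name_rate:.0%}: "
      f"{relative_gap:.0%} more call-backs from a name alone")
```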

Would anyone like to continue to learn about these stories with me? Let me know if you are interested and I’ll start a book club discussion format so we can be outraged/educated together.

Until we talk again, here is a brief list of more works committed to exposing algorithmic bias in the technology and society of our everyday lives.

šŸ—£ļøWeapons of Math Destruction by Cathy O’Neill

šŸ—£ļøGender Shades by Joy Buolamwini

šŸ—£ļøAlgorithms of Oppression by Safiya Umoja Noble

šŸ—£ļøProgrammed Inequality by Mar Hicks

I will definitely touch on these in more detail, so if you want to order them to get a head start, that sounds great.

Phew — that was some mess, right? Just wait, there’s more fun ahead!

Talk soon!
