<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[Algos & Ethics]]></title><description><![CDATA[A monthly newsletter committed to sharing the latest at the intersection of Algorithms and Ethics.]]></description><link>https://www.algosandethics.com</link><image><url>https://substackcdn.com/image/fetch/$s_!aAhY!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fbucketeer-e05bbc84-baa3-437e-9518-adb32be77984.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3066842-aa5a-4dac-9a27-eb67d6b966f2_256x256.png</url><title>Algos &amp; Ethics</title><link>https://www.algosandethics.com</link></image><generator>Substack</generator><lastBuildDate>Sat, 25 Apr 2026 11:25:58 GMT</lastBuildDate><atom:link href="https://www.algosandethics.com/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Danielle Smalls-Perkins]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[algosandethics@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[algosandethics@substack.com]]></itunes:email><itunes:name><![CDATA[Danielle]]></itunes:name></itunes:owner><itunes:author><![CDATA[Danielle]]></itunes:author><googleplay:owner><![CDATA[algosandethics@substack.com]]></googleplay:owner><googleplay:email><![CDATA[algosandethics@substack.com]]></googleplay:email><googleplay:author><![CDATA[Danielle]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[Credit where credit is due]]></title><description><![CDATA[Alright, so]]></description><link>https://www.algosandethics.com/p/credit-where-credit-is-due</link><guid isPermaLink="false">https://www.algosandethics.com/p/credit-where-credit-is-due</guid><dc:creator><![CDATA[Danielle 
Smalls-Perkins]]></dc:creator><pubDate>Wed, 04 Dec 2019 05:04:19 GMT</pubDate><enclosure url="https://substackcdn.com/image/youtube/w_728,c_limit/Vz5w19BLs88" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Alright, so </p><h2>&#127861; The Tea </h2><p>If you&#8217;ve heard the hype around Apple&#8217;s newest product offering, then you&#8217;ve likely heard the mess about it too. <a href="https://twitter.com/dhh">David Heinemeier Hansson</a>, CTO of <a href="https://basecamp.com">Basecamp</a>, recently made a very public complaint about the drastically different credit limits offered to him and his wife after they each applied for the card. </p><p>The long and short of that thread is here,</p><div class="twitter-embed" data-attrs="{&quot;url&quot;:&quot;https://twitter.com/dhh/status/1192540900393705474?&quot;,&quot;full_text&quot;:&quot;The <span class=\&quot;tweet-fake-link\&quot;>@AppleCard</span> is such a fucking sexist program. My wife and I filed joint tax returns, live in a community-property state, and have been married for a long time. Yet Apple&#8217;s black box algorithm thinks I deserve 20x the credit limit she does. 
No appeals work.&quot;,&quot;username&quot;:&quot;dhh&quot;,&quot;name&quot;:&quot;DHH&quot;,&quot;profile_image_url&quot;:&quot;&quot;,&quot;date&quot;:&quot;Thu Nov 07 20:34:41 +0000 2019&quot;,&quot;photos&quot;:[],&quot;quoted_tweet&quot;:{},&quot;reply_count&quot;:0,&quot;retweet_count&quot;:9684,&quot;like_count&quot;:29353,&quot;impression_count&quot;:0,&quot;expanded_url&quot;:{},&quot;video_url&quot;:null,&quot;belowTheFold&quot;:false}" data-component-name="Twitter2ToDOM"></div><p><em>&#128211;A quick definition &#8212; a &#8220;black box algorithm&#8221; here means a process that takes inputs and produces outputs or decisions, but offers no explanation of how it arrives at them.</em></p><p>Apple representatives respond by encouraging the couple to trust the black box, here:</p><div class="twitter-embed" data-attrs="{&quot;url&quot;:&quot;https://twitter.com/dhh/status/1192945019230945280&quot;,&quot;full_text&quot;:&quot;She spoke to two Apple reps. Both very nice, courteous people representing an utterly broken and reprehensible system. The first person was like &#8220;I don&#8217;t know why, but I swear we&#8217;re not discriminating, IT&#8217;S JUST THE ALGORITHM&#8221;. I shit you not. 
&#8220;IT&#8217;S JUST THE ALGORITHM!&#8221;.&quot;,&quot;username&quot;:&quot;dhh&quot;,&quot;name&quot;:&quot;DHH&quot;,&quot;profile_image_url&quot;:&quot;&quot;,&quot;date&quot;:&quot;Fri Nov 08 23:20:31 +0000 2019&quot;,&quot;photos&quot;:[],&quot;quoted_tweet&quot;:{},&quot;reply_count&quot;:0,&quot;retweet_count&quot;:592,&quot;like_count&quot;:4876,&quot;impression_count&quot;:0,&quot;expanded_url&quot;:{},&quot;video_url&quot;:null,&quot;belowTheFold&quot;:false}" data-component-name="Twitter2ToDOM"></div><p>and then reveal that her credit score was higher than his, here:</p><div class="twitter-embed" data-attrs="{&quot;url&quot;:&quot;https://twitter.com/dhh/status/1192945760909676544&quot;,&quot;full_text&quot;:&quot;So obviously we both furiously signup for the fucking $25/month credit-check bullshit shakedown that is TransUnion. Maybe someone stole my wife&#8217;s identity? Even though we&#8217;ve verified there was nothing wrong previously. Guess what: HER CREDIT SCORE WAS HIGHER THAN MINE!!!&quot;,&quot;username&quot;:&quot;dhh&quot;,&quot;name&quot;:&quot;DHH&quot;,&quot;profile_image_url&quot;:&quot;&quot;,&quot;date&quot;:&quot;Fri Nov 08 23:23:27 +0000 2019&quot;,&quot;photos&quot;:[],&quot;quoted_tweet&quot;:{},&quot;reply_count&quot;:0,&quot;retweet_count&quot;:458,&quot;like_count&quot;:6039,&quot;impression_count&quot;:0,&quot;expanded_url&quot;:{},&quot;video_url&quot;:null,&quot;belowTheFold&quot;:true}" data-component-name="Twitter2ToDOM"></div><p>So, the idea that we must now fight indisputable &#8220;god-box&#8221; algorithms on top of wading through the bureaucracy of large-company decision troubleshooting is likely enough to push anyone over the edge. Anddddd, while the issue was &#8220;<a href="https://twitter.com/dhh/status/1192955093840252931">resolved</a>&#8221; later, Apple&#8217;s support team didn&#8217;t initially do much to calm his fury. 
</p><p>One of the best summaries of the issue was by Jordan Howard, of <a href="https://www.youtube.com/channel/UC1H1NWNTG2Xi3pt85ykVSHA">everydAI</a>. In the video, Jordan details the Apple Card issue and highlights the long history of credit prejudice against women in America. </p><div id="youtube2-Vz5w19BLs88" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;Vz5w19BLs88&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/Vz5w19BLs88?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>So, here are a few items I&#8217;m happy to discuss at a later date:</p><ul><li><p>whether or not interpretable models should be required in this type of decision-making</p></li><li><p>the fact that diverse teams should vet the creation and usage of these algorithms before they are released</p></li></ul><p>But all of the above is solely background context for what this newsletter issue is focused on. </p><h2>&#128073; The Point</h2><p>Shortly after DHH posted his thread, his wife, Jamie Heinemeier Hansson, took the opportunity to share her thoughts on the matter.</p><p>In just a few paragraphs, JHH expresses that while she is &#8220;an extremely private person who does not post on social media,&#8221; her urgency to speak out on issues of fairness, equality, and justice brought her to the social table.</p><p>Personally, I appreciated her emphasis on the fact that others in a similar situation would not have had the privilege of receiving the same response that she did.</p><p>JHH states, </p><blockquote><p>This is not merely a story about sexism and credit algorithm blackboxes, but about how rich people nearly always get their way. 
Justice for another rich white woman is not justice at all.</p></blockquote><p>She goes on to say,</p><blockquote><p>Finally, I hear the frustration of women and minorities who have already been beating this drum loudly and publicly for years without this level of attention. I didn&#8217;t wish to be the subject matter that sparked these fires, but I&#8217;m glad they&#8217;re blazing.&nbsp;</p></blockquote><p>and lastly,</p><blockquote><p>On this topic David and I are thoroughly united, and I&#8217;m glad his large platform and my AppleCard issue have sparked a national conversation around institutional biases, blackbox algorithms, and the broken system that is our credit industry. This is not a story about me. Brilliant women are all over social media, using their voices to strive for a better way forward. Listen to them.</p></blockquote><p><em>&#8220;Who are these women??&#8221;, </em>you whisper.</p><h2>&#128218; The Scholars</h2><p>One of the sheer good fortunes of this newsletter is the opportunity to highlight the efforts of those who have done this work for YEARS. JHH did not explicitly name these scholars or their efforts, so I&#8217;d like to take some time to do that now and in upcoming newsletter issues.</p><p>Dr. <a href="https://twitter.com/ruha9">Ruha Benjamin</a> is one such scholar and activist. </p><p>I just downloaded her most recent book, &#8220;<a href="https://www.ruhabenjamin.com/race-after-technology">Race After Technology</a>&#8221;, to my e-reader and haven&#8217;t been able to put it down since. Dr. Benjamin sets the stage by highlighting cultural coding and discrimination in technology that was promised to be, at the very least, unbiased and fair. She calls this failed promise &#8220;the New Jim Code&#8221;. 
She describes the New Jim Code as,</p><blockquote><p> The employment of new technologies that reflect and reproduce existing inequalities but are promoted and perceived as more objective or progressive than the discriminatory systems of a previous era.</p></blockquote><p>Dr. Benjamin uses a classic study to illustrate the New Jim Code. <a href="https://www.nber.org/papers/w9873">The original study</a> found that, with all other qualifications being equal, certain names on resumes affected the number of call-backs from employers. The researchers found that job applicants with White-sounding names received 50% more call-backs than job applicants with Black-sounding names. The gap in call-backs was estimated as equivalent to 8 years of work experience. </p><p><em><strong>8 years of work experience</strong></em>. Just for having a name that didn&#8217;t sound the way an employer thought it ought to. And before you ask, yes, this included employers with equal opportunity clauses in their job descriptions.</p><p>Would anyone like to continue to learn about these stories with me? 
Let me know if you&#8217;re interested, and I&#8217;ll start a book club discussion format so we can be outraged/educated together.</p><p>Until we talk again, here is a brief list of more efforts committed to exposing algorithmic bias in the technology and society of our everyday lives.</p><p>&#128483;&#65039;<a href="https://weaponsofmathdestructionbook.com/">Weapons of Math Destruction</a> by <a href="https://twitter.com/mathbabedotorg?ref_src=twsrc%5Egoogle%7Ctwcamp%5Eserp%7Ctwgr%5Eauthor">Cathy O&#8217;Neil</a></p><p>&#128483;&#65039;<a href="http://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf">Gender Shades</a> by <a href="https://twitter.com/jovialjoy">Joy Buolamwini</a></p><p>&#128483;&#65039;<a href="https://nyupress.org/9781479837243/algorithms-of-oppression/">Algorithms of Oppression</a> by <a href="https://twitter.com/safiyanoble">Safiya Umoja Noble</a></p><p>&#128483;&#65039;<a href="https://mitpress.mit.edu/books/programmed-inequality">Programmed Inequality</a> by <a href="https://twitter.com/histoftech">Mar Hicks</a></p><p>I will definitely touch on these in more detail, so if you want to order them to get a head start, that sounds great.</p><p>Phew &#8212; that was some mess, right? Just wait, there&#8217;s more fun ahead!</p><p>Talk soon!</p>]]></content:encoded></item><item><title><![CDATA[Welcome to Algos and Ethics.]]></title><description><![CDATA[Algorithmic accountability is the game. 
Join the fun.]]></description><link>https://www.algosandethics.com/p/coming-soon</link><guid isPermaLink="false">https://www.algosandethics.com/p/coming-soon</guid><dc:creator><![CDATA[Danielle Smalls-Perkins]]></dc:creator><pubDate>Tue, 12 Nov 2019 03:32:23 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!aAhY!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fbucketeer-e05bbc84-baa3-437e-9518-adb32be77984.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3066842-aa5a-4dac-9a27-eb67d6b966f2_256x256.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Hey Mom and Dad (and anyone else who is interested), </p><p>Y&#8217;all are always asking what I&#8217;m interested in. And for that, I&#8217;m grateful.</p><p>Algos &amp; Ethics is my monthly commitment to sharing information about algorithmic bias in technology and society.</p><p>We'll talk about bias in algorithms that you use every day and some that you might not be aware of, but should know about.</p><p>I&#8217;ll do my best to define things along the way, but please let me know if I miss an opportunity to make a concept plain and simple. </p><p>For example,</p><p><em>&#128211;A quick definition &#8212; &#8220;an algorithm&#8221; is a series of steps you follow, like a recipe, that creates a certain outcome. </em></p><p>This is a discussion, so I expect you to share thoughts &#8212; you always do :) </p><p>I&#8217;m hoping the newsletter issues will help illustrate some of the biases around us. 
Let&#8217;s dive in together.</p><p>First issue, coming up.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.algosandethics.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.algosandethics.com/subscribe?"><span>Subscribe now</span></a></p><p>In the meantime, <a href="https://www.algosandethics.com/p/coming-soon?utm_source=substack&utm_medium=email&utm_content=share&action=share">tell your friends</a>!</p>]]></content:encoded></item></channel></rss>