Ch 5: Retooling solidarity, reimagining justice
Do you agree with Benjamin's argument that we can utilize technologies for solidarity and justice? Why or why not? She was writing prior to the murder of George Floyd and the pandemic. Do those issues, in addition to the resistance to internet regulation, give you pause? If there is cause for hope, where do you find it, and how does this hope relate to the questions of ethics as we have discussed them?
Or, if you'd rather learn about Latinx digital studies, you can choose to watch Melissa Villa-Nicholas present Data Borders: How We Are All Intimately Intertwined with Detention and Deportation, and write something that is of interest to you from this presentation.
CONTENTS

Cover
Front Matter
Preface
Introduction
    Everyday Coding
    Move Slower …
    Tailoring: Targeting
    Why Now?
    The Anti-Black Box
    Race as Technology
    Beyond Techno-Determinism
    Beyond Biased Bots
    Notes
1 Engineered Inequity
    I Tinker, Therefore I Am
    Raising Robots
    Automating Anti-Blackness
    Engineered Inequity
    Notes
2 Default Discrimination
    Default Discrimination
    Predicting Glitches
    Systemic Racism Reloaded
    Architecture and Algorithms
    Notes
3 Coded Exposure
    Multiply Exposed
    Exposing Whiteness
    Exposing Difference
    Exposing Science
    Exposing Privacy
    Exposing Citizenship
    Notes
4 Technological Benevolence
    Technological Benevolence
    Fixing Diversity
    Racial Fixes
    Fixing Health
    Detecting Fixes
    Notes
5 Retooling Solidarity, Reimagining Justice
    Selling Empathy
    Rethinking Design Thinking
    Beyond Code-Switching
    Audits and Other Abolitionist Tools
    Reimagining Technology
    Notes
Acknowledgments
Appendix
References
Index
End User License Agreement
Figures

Introduction
Figure 0.1 N-Tech Lab, Ethnicity Recognition
Chapter 1
Figure 1.1 Beauty AI
Figure 1.2 Robot Slaves
Figure 1.3 Overserved
Chapter 2
Figure 2.1 Malcolm Ten
Figure 2.2 Patented PredPol Algorithm
Chapter 3
Figure 3.1 Shirley Card
Figure 3.2 Diverse Shirley
Figure 3.3 Strip Test 7
Chapter 5
Figure 5.1 Appolition
Figure 5.2 White-Collar Crime Risk Zones
RACE AFTER TECHNOLOGY
Abolitionist Tools for the New Jim Code
Ruha Benjamin
polity
Copyright © Ruha Benjamin 2019
The right of Ruha Benjamin to be identified as Author of this Work has been asserted in accordance with the UK Copyright, Designs and Patents Act 1988.
First published in 2019 by Polity Press
Polity Press 65 Bridge Street Cambridge CB2 1UR, UK
Polity Press 101 Station Landing Suite 300 Medford, MA 02155, USA
All rights reserved. Except for the quotation of short passages for the purpose of criticism and review, no part of this publication may be reproduced, stored in a retrieval system or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, without the prior permission of the publisher.
ISBN-13: 978-1-5095-2643-7
A catalogue record for this book is available from the British Library.
Library of Congress Cataloging-in-Publication Data Names: Benjamin, Ruha, author. Title: Race after technology : abolitionist tools for the new Jim code / Ruha Benjamin. Description: Medford, MA : Polity, 2019. | Includes bibliographical references and index. Identifiers: LCCN 2018059981 (print) | LCCN 2019015243 (ebook) | ISBN 9781509526437 (Epub) | ISBN 9781509526390 (hardback) | ISBN 9781509526406 (paperback) Subjects: LCSH: Digital divide–United States–21st century. | Information technology–Social aspects–United States–21st century. | African Americans–Social conditions–21st century. | Whites–United States–Social conditions–21st century. | United States–Race relations–21st century. | BISAC: SOCIAL SCIENCE / Demography. Classification: LCC HN90.I56 (ebook) | LCC HN90.I56 B46 2019 (print) | DDC 303.48/330973–dc23 LC record available at https://lccn.loc.gov/2018059981
The publisher has used its best endeavours to ensure that the URLs for external websites referred to in this book are correct and active at the time of going to press. However, the publisher has no responsibility for the websites and can make no guarantee that a site will remain live or that the content is or will remain appropriate.
Every effort has been made to trace all copyright holders, but if any have been overlooked the publisher will be pleased to include any necessary credits in any subsequent reprint or edition.
For further information on Polity, visit our website: politybooks.com
Dedication

All my life I’ve prided myself on being a survivor.
But surviving is just another loop …
Maeve Millay, Westworld1
I should constantly remind myself that the real leap
consists in introducing invention into existence …
In the world through which I travel,
I am endlessly creating myself …
I, the [hu]man of color, want only this:
That the tool never possess the [hu]man.
Black Skin, White Masks, Frantz Fanon2
Notes

1. Toye 2016.
2. Fanon 2008, p. 179.
Preface

I spent part of my childhood living with my grandma just off Crenshaw Boulevard in Los Angeles. My school was on the same street as our house, but I still spent many a day trying to coax kids on my block to “play school” with me on my grandma’s huge concrete porch covered with that faux-grass carpet. For the few who would come, I would hand out little slips of paper and write math problems on a small chalkboard until someone would insist that we go play tag or hide-and-seek instead. Needless to say, I didn’t have that many friends! But I still have fond memories of growing up off Crenshaw surrounded by people who took a genuine interest in one another’s well-being and who, to this day, I can feel cheering me on as I continue to play school.
Some of my most vivid memories of growing up also involve the police. Looking out of the backseat window of the car as we passed the playground fence, boys lined up for police pat-downs; or hearing the nonstop rumble of police helicopters overhead, so close that the roof would shake while we all tried to ignore it. Business as usual. Later, as a young mom, anytime I went back to visit I would recall the frustration of trying to keep the kids asleep with the sound and light from the helicopter piercing the window’s thin pane. Like everyone who lives in a heavily policed neighborhood, I grew up with a keen sense of being watched. Family, friends, and neighbors – all of us caught up in a carceral web, in which other people’s safety and freedom are predicated on our containment.
Now, in the age of big data, many of us continue to be monitored and measured, but without the audible rumble of helicopters to which we can point. This doesn’t mean we no longer feel what it’s like to be a problem. We do. This book is my attempt to shine light in the other direction, to decode this subtle but no less hostile form of systemic bias, the New Jim Code.
Introduction

The New Jim Code

Naming a child is serious business. And if you are not White in the United States, there is much more to it than personal preference. When my younger son was born I wanted to give him an Arabic name to reflect part of our family heritage. But it was not long after 9/11, so of course I hesitated. I already knew he would be profiled as a Black youth and adult, so, like most Black mothers, I had already started mentally sparring with those who would try to harm my child, even before he was born. Did I really want to add another round to the fight? Well, the fact is, I am also very stubborn. If you tell me I should not do something, I take that as a dare. So I gave the child an Arabic first and middle name and noted on his birth announcement: “This guarantees he will be flagged anytime he tries to fly.”
If you think I am being hyperbolic, keep in mind that names are racially coded. While they are one of the everyday tools we use to express individuality and connections, they are also markers interacting with numerous technologies, like airport screening systems and police risk assessments, as forms of data. Depending on one’s name, one is more likely to be detained by state actors in the name of “public safety.”
Just as in naming a child, there are many everyday contexts – such as applying for jobs, or shopping – that employ emerging technologies, often to the detriment of those who are racially marked. This book explores how such technologies, which often pose as objective, scientific, or progressive, too often reinforce racism and other forms of inequity. Together, we will work to decode the powerful assumptions and values embedded in the material and digital architecture of our world. And we will be stubborn in our pursuit of a more just and equitable approach to tech – ignoring the voice in our head that says, “No way!” “Impossible!” “Not realistic!” But as activist and educator Mariame Kaba contends, “hope is a discipline.”1 Reality is something we create together, except that so few people have a genuine say in the
world in which they are forced to live. Amid so much suffering and injustice, we cannot resign ourselves to this reality we have inherited. It is time to reimagine what is possible. So let’s get to work.
Everyday Coding

Each year I teach an undergraduate course on race and racism and I typically begin the class with an exercise designed to help me get to know the students while introducing the themes we will wrestle with during the semester. What’s in a name? Your family story, your religion, your nationality, your gender identity, your race and ethnicity? What assumptions do you think people make about you on the basis of your name? What about your nicknames – are they chosen or imposed? From intimate patterns in dating and romance to large-scale employment trends, our names can open and shut doors. Like a welcome sign inviting people in or a scary mask repelling and pushing them away, this thing that is most ours is also out of our hands.
The popular book and Netflix documentary Freakonomics describe the process of parents naming their kids as an exercise in branding, positioning children as more or less valuable in a competitive social marketplace. If we are the product, our names are the billboard – a symptom of a larger neoliberal rationale that subsumes all other sociopolitical priorities to “economic growth, competitive positioning, and capital enhancement.”2 My students invariably chuckle when the “baby-naming expert” comes on the screen to help parents “launch” their newest offspring. But the fact remains that naming is serious business. The stakes are high not only because parents’ decisions will follow their children for a lifetime, but also because names reflect much longer histories of conflict and assimilation and signal fierce political struggles – as when US immigrants from Eastern Europe anglicized their names, or African Americans at the height of the Black Power movement took Arabic or African names to oppose White supremacy.
I will admit, something that irks me about conversations regarding naming trends is how distinctly African American names are set apart as comically “made up” – a pattern continued in Freakonomics. This
tendency, as I point out to students, is a symptom of the chronic anti-Blackness that pervades even attempts to “celebrate difference.” Blackness is routinely conflated with cultural deficiency, poverty, and pathology … Oh, those poor Black mothers, look at how they misspell “Uneeq.” Not only does this reek of classism, but it also harbors a willful disregard for the fact that everyone’s names were at one point made up!3
Usually, many of my White students assume that the naming exercise is not about them. “I just have a normal name,” “I was named after my granddad,” “I don’t have an interesting story, prof.” But the presumed blandness of White American culture is a crucial part of our national narrative. Scholars describe the power of this plainness as the invisible “center” against which everything else is compared and as the “norm” against which everyone else is measured. Upon further reflection, what appears to be an absence in terms of being “cultureless” works more like a superpower. Invisibility, with regard to Whiteness, offers immunity. To be unmarked by race allows you to reap the benefits but escape responsibility for your role in an unjust system. Just check out the hashtag #CrimingWhileWhite to read the stories of people who are clearly aware that their Whiteness works for them like an armor and a force field when dealing with the police. A “normal” name is just one of many tools that reinforce racial invisibility.
As a class, then, we begin to understand that all those things dubbed “just ordinary” are also cultural, as they embody values, beliefs, and narratives, and normal names offer some of the most powerful stories of all. If names are social codes that we use to make everyday assessments about people, they are not neutral but racialized, gendered, and classed in predictable ways. Whether in the time of Moses, Malcolm X, or Missy Elliot, names have never grown on trees. They are concocted in cultural laboratories and encoded and infused with meaning and experience – particular histories, longings, and anxieties. And some people, by virtue of their social position, are given more license to experiment with unique names. Basically, status confers cultural value that engenders status, in an ongoing cycle of social reproduction.4
In a classic study of how names impact people’s experience on the job
market, researchers show that, all other things being equal, job seekers with White-sounding first names received 50 percent more callbacks from employers than job seekers with Black-sounding names.5 They calculated that the racial gap was equivalent to eight years of relevant work experience, which White applicants did not actually have; and the gap persisted across occupations, industry, employer size – even when employers included the “equal opportunity” clause in their ads.6 With emerging technologies we might assume that racial bias will be more scientifically rooted out. Yet, rather than challenging or overcoming the cycles of inequity, technical fixes too often reinforce and even deepen the status quo. For example, a study by a team of computer scientists at Princeton examined whether a popular algorithm, trained on human writing online, would exhibit the same biased tendencies that psychologists have documented among humans. They found that the algorithm associated White-sounding names with “pleasant” words and Black-sounding names with “unpleasant” ones.7
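The mechanism behind the Princeton team's finding can be sketched in a few lines. The published study used pretrained GloVe word vectors and a word-embedding association test; the sketch below is only a toy illustration of the same idea, with invented three-dimensional vectors and placeholder names (`name_a`, `name_b`) standing in for the real data. A model trained on human writing places words near the company they keep, so a name's learned "valence" can be read off as a difference in cosine similarity.

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two vectors (1.0 = same direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy 3-dimensional "embeddings" standing in for vectors a real model
# would learn from human-written text online. All numbers are invented
# for illustration only, not taken from the study.
vectors = {
    "pleasant":   np.array([0.9, 0.1, 0.0]),
    "unpleasant": np.array([0.1, 0.9, 0.0]),
    "name_a":     np.array([0.8, 0.2, 0.1]),  # hypothetical name
    "name_b":     np.array([0.2, 0.8, 0.1]),  # hypothetical name
}

def association(name):
    """How much closer a name sits to 'pleasant' than to 'unpleasant'."""
    return (cosine(vectors[name], vectors["pleasant"])
            - cosine(vectors[name], vectors["unpleasant"]))

# A positive score means the model has absorbed a "pleasant" association
# for that name; a negative score, an "unpleasant" one.
print(association("name_a"))  # positive
print(association("name_b"))  # negative
```

No one programmed the association directly: it falls out of the geometry the model learned from biased human text, which is exactly why such systems reproduce rather than escape the documented human tendencies.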
Such findings demonstrate what I call “the New Jim Code”: the employment of new technologies that reflect and reproduce existing inequities but that are promoted and perceived as more objective or progressive than the discriminatory systems of a previous era.8 Like other kinds of codes that we think of as neutral, “normal” names have power by virtue of their perceived neutrality. They trigger stories about what kind of person is behind the name – their personality and potential, where they come from but also where they should go.
Codes are both reflective and predictive. They have a past and a future. “Alice Tang” comes from a family that values education and is expected to do well in math and science. “Tyrone Jackson” hails from a neighborhood where survival trumps scholastics; and he is expected to excel in sports. More than stereotypes, codes act as narratives, telling us what to expect. As data scientist and Weapons of Math Destruction author Cathy O’Neil observes, “[r]acism is the most slovenly of predictive models. It is powered by haphazard data gathering and spurious correlations, reinforced by institutional inequities, and polluted by confirmation bias.”9
Racial codes are born from the goal of, and facilitate, social control. For instance, in a recent audit of California’s gang database, not only do Blacks and Latinxs constitute 87 percent of those listed, but many of the names turned out to be babies under the age of 1, some of whom were supposedly “self-described gang members.” So far, no one ventures to explain how this could have happened, except by saying that some combination of zip codes and racially coded names constitute a risk.10 Once someone is added to the database, whether they know they are listed or not, they undergo even more surveillance and lose a number of rights.11
Most important, then, is the fact that, once something or someone is coded, this can be hard to change. Think of all of the time and effort it takes for a person to change her name legally. Or, going back to California’s gang database: “Although federal regulations require that people be removed from the database after five years, some records were not scheduled to be removed for more than 100 years.”12 Yet rigidity can also give rise to ingenuity. Think of the proliferation of nicknames, an informal mechanism that allows us to work around legal systems that try to fix us in place. We do not have to embrace the status quo, even though we must still deal with the sometimes dangerous consequences of being illegible, as when a transgender person is “deadnamed” – called their birth name rather than chosen name. Codes, in short, operate within powerful systems of meaning that render some things visible, others invisible, and create a vast array of distortions and dangers.
I share this exercise of how my students and I wrestle with the cultural politics of naming because names are an expressive tool that helps us think about the social and political dimensions of all sorts of technologies explored in this book. From everyday apps to complex algorithms, Race after Technology aims to cut through industry hype to offer a field guide into the world of biased bots, altruistic algorithms, and their many coded cousins. Far from coming upon a sinister story of racist programmers scheming in the dark corners of the web, we will find that the desire for objectivity, efficiency, profitability, and progress fuels the pursuit of technical fixes across many different social arenas. Oh, if only there were a way to slay
centuries of racial demons with a social justice bot! But, as we will see, the road to inequity is paved with technical fixes.
Along the way, this book introduces conceptual tools to help us decode the promises of tech with historically and sociologically informed skepticism. I argue that tech fixes often hide, speed up, and even deepen discrimination, while appearing to be neutral or benevolent when compared to the racism of a previous era. This set of practices that I call the New Jim Code encompasses a range of discriminatory designs – some that explicitly work to amplify hierarchies, many that ignore and thus replicate social divisions, and a number that aim to fix racial bias but end up doing the opposite.
Importantly, the attempt to shroud racist systems under the cloak of objectivity has been made before. In The Condemnation of Blackness, historian Khalil Muhammad (2011) reveals how an earlier “racial data revolution” in the nineteenth century marshalled science and statistics to make a “disinterested” case for White superiority:
Racial knowledge that had been dominated by anecdotal, hereditarian, and pseudo-biological theories of race would gradually be transformed by new social scientific theories of race and society and new tools of analysis, namely racial statistics and social surveys. Out of the new methods and data sources, black criminality would emerge, alongside disease and intelligence, as a fundamental measure of black inferiority.13
You might be tempted to see the datafication of injustice in that era as having been much worse than in the present, but I suggest we hold off on easy distinctions because, as we shall see, the language of “progress” is too easily weaponized against those who suffer most under oppressive systems, however sanitized.
Readers are also likely to note how the term New Jim Code draws on The New Jim Crow, Michelle Alexander’s (2012) book that makes a case for how the US carceral system has produced a “new racial caste system” by locking people into a stigmatized group through a colorblind ideology, a way of labeling people as “criminals” that permits legalized discrimination against them. To talk of the new Jim Crow begs the question: What of the old? “Jim Crow” was first
introduced as the title character of an 1832 minstrel show that mocked and denigrated Black people. White people used it not only as a derogatory epithet but also as a way to mark space, “legal and social devices intended to separate, isolate, and subordinate Blacks.”14 And, while it started as a folk concept, it was taken up as an academic shorthand for legalized racial segregation, oppression, and injustice in the US South between the 1890s and the 1950s. It has proven to be an elastic term, used to describe an era, a geographic region, laws, institutions, customs, and a code of behavior that upholds White supremacy.15 Alexander compares the old with the new Jim Crow in a number of ways, but most relevant for this discussion is her emphasis on a shift from explicit racialization to a colorblind ideology that masks the destruction wrought by the carceral system, severely limiting the life chances of those labeled criminals who, by design, are overwhelmingly Black. “Criminal,” in this era, is code for Black, but also for poor, immigrant, second-class, disposable, unwanted, detritus.
What happens when this kind of cultural coding gets embedded into the technical coding of software programs? In a now classic study, computer scientist Latanya Sweeney examined how online search results associated Black names with arrest records at a much higher rate than White names, a phenomenon that she first noticed when Google-searching her own name; and results suggested she had a criminal record.16 The lesson? “Google’s algorithms were optimizing for the racially discriminating patterns of past users who had clicked on these ads, learning the racist preferences of some users and feeding them back to everyone else.”17 In a technical sense, the writer James Baldwin’s insight is prescient: “The great force of history comes from the fact that we carry it within us, are unconsciously controlled by it in many ways, and history is literally present in all that we do.”18 And when these technical codes move beyond the bounds of the carceral system, beyond labeling people as “high” and “low” risk criminals, when automated systems from employment, education, healthcare, and housing come to make decisions about people’s deservedness for all kinds of opportunities, then tech designers are erecting a digital caste system, structured by existing racial inequities that are not just colorblind, as Alexander warns. These tech advances are sold as
morally superior because they purport to rise above human bias, even though they could not exist without data produced through histories of exclusion and discrimination.
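The feedback loop Sweeney identified can be made concrete with a toy simulation. This is a sketch of click-through optimization in general, not Google's actual ad system; the ad names and click probabilities below are invented. The point is structural: an optimizer that simply chases observed clicks converts past users' biased behavior into what every future user is shown.

```python
import random

random.seed(42)

# Hypothetical click probabilities reflecting past users' behavior:
# in this invented history, users clicked the stigmatizing
# "arrest record?" ad slightly more often than the neutral ad.
true_click_prob = {"neutral_ad": 0.05, "arrest_ad": 0.08}

# Phase 1: gather historical data, 5,000 impressions per ad.
clicks = {ad: sum(random.random() < p for _ in range(5000))
          for ad, p in true_click_prob.items()}
observed_ctr = {ad: clicks[ad] / 5000 for ad in clicks}

# Phase 2: a click-optimizing ad server now favors the ad with the
# best observed click-through rate -- so every future searcher sees
# the stigmatizing ad, regardless of their own preferences.
winner = max(observed_ctr, key=observed_ctr.get)
print(winner)
```

A small difference in past clicking behavior is enough: the optimizer amplifies it into a near-total skew, "learning the racist preferences of some users and feeding them back to everyone else."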
In fact, as this book shows, colorblindness is no longer even a prerequisite for the New Jim Code. In some cases, technology “sees” racial difference, and this range of vision can involve seemingly positive affirmations or celebrations of presumed cultural differences. And yet we are told that how tech sees “difference” is a more objective reflection of reality than if a mere human produced the same results. Even with the plethora of visibly diverse imagery engendered and circulated through technical advances, particularly social media, bias enters through the backdoor of design optimization in which the humans who create the algorithms are hidden from view.
Move Slower …

Problem solving is at the heart of tech. An algorithm, after all, is a set of instructions, rules, and calculations designed to solve problems. Data for Black Lives co-founder Yeshimabeit Milner reminds us that “[t]he decision to make every Black life count as three-fifths of a person was embedded in the electoral college, an algorithm that continues to be the basis of our current democracy.”19 Thus, even just deciding what problem needs solving requires a host of judgments; and yet we are expected to pay no attention to the man behind the screen.20
As danah boyd and M. C. Elish of the Data & Society Research Institute posit, “[t]he datasets and models used in these systems are not objective representations of reality. They are the culmination of particular tools, people, and power structures that foreground one way of seeing or judging over another.”21 By pulling back the curtain and drawing attention to forms of coded inequity, not only do we become more aware of the social dimensions of technology but we can work together against the emergence of a digital caste system that relies on our naivety when it comes to the neutrality of technology. This problem extends beyond obvious forms of criminalization and
surveillance.22 It includes an elaborate social and technical apparatus that governs all areas of life.
The animating force of the New Jim Code is that tech designers encode judgments into technical systems but claim that the racist results of their designs are entirely exterior to the encoding process. Racism thus becomes doubled – magnified and buried under layers of digital denial. There are bad actors in this arena that are easier to spot than others. Facebook executives who denied and lied about their knowledge of Russia’s interference in the 2016 presidential election via social media are perpetrators of the most broadcast violation of public trust to date.23 But the line between bad and “neutral” players is a fuzzy one and there are many tech insiders hiding behind the language of free speech, allowing racist and sexist harassment to run rampant in the digital public square and looking the other way as avowedly bad actors deliberately crash into others with reckless abandon.
For this reason, we should consider how private industry choices are in fact public policy decisions. They are animated by political values influenced strongly by libertarianism, which extols individual autonomy and corporate freedom from government regulation. However, a recent survey of the political views of 600 tech entrepreneurs found that a majority of them favor higher taxes on the rich, social benefits for the poor, single-payer healthcare, environmental regulations, parental leave, immigration protections, and other issues that align with Democratic causes. Yet most of them also staunchly opposed labor unions and government regulation.24 As one observer put it, “Silicon Valley entrepreneurs don’t mind the government regulating other industries, but they prefer Washington to stay out of their own business.”25 For example, while many say they support single-payer healthcare in theory, they are also reluctant to contribute to tax revenue that would fund such an undertaking. So “political values” here is less about party affiliation or what people believe in the abstract and more to do with how the decisions of tech entrepreneurs impact questions of power, ethics, equity, and sociality. In that light, I think the dominant ethos in this arena is best expressed by Facebook’s original motto: “Move Fast and Break Things.” To which we should ask: What about the people and places broken in the
process? Residents of Silicon Valley displaced by the spike in housing costs, or Amazon warehouse workers compelled to skip bathroom breaks and pee in bottles.26 “Move Fast, Break People, and Call It Progress”?
“Data sharing,” for instance, sounds like a positive development, streamlining the bulky bureaucracies of government so the public can access goods and services faster. But access goes both ways. If someone is marked “risky” in one arena, that stigma follows him around much more efficiently, streamlining marginalization. A leading Europe-based advocate for workers’ data rights described how she was denied a bank loan despite having a high income and no debt, because the lender had access to her health file, which showed that she had a tumor.27 In the United States, data fusion centers are one of the most pernicious sites of the New Jim Code, coordinating “data-sharing among state and local police, intelligence agencies, and private companies”28 and deepening what Stop LAPD Spying Coalition calls the stalker state. Like other techy euphemisms, “fusion” recalls those trendy restaurants where food looks like art. But the clientele of such upscale eateries is rarely the target of data fusion centers that terrorize the residents of many cities.
If private companies are creating public policies by other means, then I think we should stop calling ourselves “users.” Users get used. We are more like unwitting constituents who, by clicking submit, have authorized tech giants to represent our interests. But there are promising signs that the tide is turning.
According to a recent survey, a growing segment of the public (55 percent, up from 45 percent