Towards the Anti-Racist Algorithm: Digital Alterations of Racial Realities

In January of this year, David Auerbach wrote a very interesting article in Slate entitled “The Code We Can’t Control.” The article was in part a review of Frank Pasquale’s The Black Box Society. In summary, both works urge us to reconsider our assumption that computers are incapable of perpetuating racism and bias. To some extent, both illustrate that while computers do not create racial prejudice (the implicit biases of programmers are responsible for that), they can perpetuate and contribute to racism by digitizing the logic of discrimination.


Auerbach provided several examples from a recent study by Latanya Sweeney on Google AdWords that illustrate his concerns as well as those of Pasquale. The fact that computers are complicit tools in maintaining a racist architecture urges us to consider alternatives to our current virtual and physical realities. If computers can in fact be programmed to perpetuate racist structures, they can also be programmed to perpetuate anti-racist approaches. The same Google ads that serve up arrest records for “black-identified” names could instead redirect racists to resources that challenge their misconceptions.

Of course, this would fundamentally challenge the idea of search. The purpose of search is to list what you are looking for, not necessarily what you need. Or is it? At least, we assume it is. Should Google or any other search engine be in the business of social engineering? In short, the answer to that question is yes. Big data search has long been involved in that enterprise: nudging you to buy movie tickets, re-ranking recommended sites, serving ads based on your unique history, and so on. The more appropriate question we should be asking is why we view a digital intervention to circumvent racial discrimination as unorthodox while we readily accept other forms of social engineering. The answer to that question is deeper than search engines or algorithms.


Reversing racism in the digital universe depends on the same strategies used to dismantle it in the physical world. We cannot accept the notion of a de facto state of racial neutrality. The solution to socioeconomic inequality has never been to maintain the status quo. To deal effectively with racism in the world and online, we must actively counter it with alternative visions and algorithms of the world we wish to inherit.

Is Socioeconomic Inequality a Practical Concern?


We have defined technology as many things, but perhaps most generally as the application of scientific knowledge for practical purposes. Is socioeconomic inequality a practical purpose? If so, why have generations of our brightest engineers failed so spectacularly to eliminate the scourge of social injustice and to vanquish racial prejudice?

How one answers this question depends on how one defines technology and its purpose, but also on how one understands the evolution of human history. If human history is perceived as linear and progressive, as technologists often assume, it is natural to see technology as contributing to the greater advancement of human ideals. However, if one sees change over time as erratic, irrational, and intractable, one would see technology much more as an expression of human angst than as a vehicle of humanity’s salvation.

Regardless of one’s view, technologists have often failed to account for how technology can equally be a force that generates disparate outcomes and reifies existing social hierarchies.  It appears that for every disruptive technology, there are three hegemonic innovations.  Humankind is much more a captive of convention than an engine of revolution.

These tendencies raise the question of whether technology is universally practical at all. The uses to which the powerful put innovation are often contrary to the most practical and rudimentary of human needs. Technological innovation defines “practical” by elevating the demands of the powerful over the longings of the weak. In education and application, technology often urges us to redefine not only what is possible but also what is meaningful.