Wednesday, June 10, 2020

On Safiya Noble - Algorithms of Oppression

"It is the persistent normalization of Black people as aberrant and undeserving of human rights and dignity under the banners of public safety, technological innovation, and the emerging creative economy that I am directly challenging by showing the egregious ways that dehumanization is rendered a legitimate free-market technology project." (Noble, 14)

When my now-22-year-old son was around eight, he asked me what an encyclopedia was. As I explained that it was a book where you could find information on almost anything, his face cleared and he proclaimed, "Oh, it was Google." I suspect that if I were to ask most people in my immediate, familial network for a definition of Google today, it would not differ greatly from the definition of an encyclopedia I gave Elijah.

Google has become a portal to all we want to experience, and for the vast majority of people that is the reason it exists: to provide answers. Noble points out that this is not, of course, the platform's raison d'être. Google is not first and foremost an information platform but an advertising platform, and as such it favors advertising algorithms, not information algorithms. Google Search is not a public resource but an advertising company whose priority and loyalty are to its major advertisers.

The algorithmic theory and practice behind platforms like Google may present as favoring the majority - if a result ranks number one, doesn't that mean everyone is looking for it? - but they are in fact favoring the advertising dollar. Is this necessarily problematic? Don't all media favor the advertising that ensures they are able to operate? They do, but not all media are as ubiquitous as Google, and a platform that is funded by and answerable to such a specific sector of society, while proclaiming to serve all, is inherently problematic.

In her study, Noble applies a Black feminist lens to this problem, noting that in doing so she is asking questions that are pertinent precisely because they are not defined by the group these algorithms serve. Women of color are rarely afforded the humanity that is offered to white women (Nyasha Junior, Ph.D. - https://www.bitchmedia.org/article/dont-we-hurt-like-you-black-women-mental-health-depression-representations), and this is exacerbated in the context of web searches. Noble uses the search term 'black girls' to illustrate her point - the top results display hypersexualized women intended for the male gaze, and predominantly the white male gaze. This can be extrapolated to other marginalized groups - women, queer and trans people, ethnic minorities - all are reduced to a result that serves an advertising algorithm created within a deeply divisive, racist, and sexist framework.

Is it surprising, then, that when people search a term like #blacklivesmatter they get results that lean heavily toward supporting a flawed system - since that flawed system is the one picking up the bill for the continuance of the platform being used? This is the normalizing of the aberrant that Noble speaks of. It is not simply that these algorithms function this way - it is that we, as users, are conditioned to see this as how things are. This is the way the majority thinks, and therefore it is normal.

Questions I ask or searches I run as a cis bi woman, or those raised by my friend Alex, a trans man of color, might have a different focus from Noble's, but they highlight the same problems: minority voices are oppressed as much by the algorithms driving digital media as they are by actual people - perhaps more so, given that all users are conditioned to take the results at face value. Platforms like Google are built by and operate within a privileged, cis-het, white, male framework. Search results are rendered within that framework to serve that framework, putting the labor of correcting these results - not to mention refuting them - back on the minorities whose voices are being oppressed.

 
