Wednesday, June 17, 2020

On: Meredith Broussard - Artificial Unintelligence

"There are no such euphemisms in mathematical language. In mathematical language, everything is highly precise."

Are machines smart? Sentient? Conscious and cognizant? Leaving aside the slew of literature and movies from 2001: A Space Odyssey to Short Circuit to WALL-E, questions of machine learning are pertinent in an age where we rely heavily on technology for almost every part of our day. The idea of a robotic vacuum cleaner smearing pet excrement around a house might be amusing (unless you're the owner of the pet and robotic vacuum cleaner in question) and an 'easily' repaired problem - how difficult is it to schedule the Roomba for a different time of day, or better yet, to check that the floor is clear of unpleasant surprises? - but how does that extrapolate out to technology in far more important parts of society: to technology being used in health care or education or the justice system, to platforms and algorithms 'deciding' what and who will or won't be seen?

Broussard reminds us in this chapter that all of these so-called instances of learning have a common 'flaw': they are taught/designed/programmed by human beings, with all their inherent biases, ideologies, and ... ability to make sudden emotive decisions, whether by necessity or whim. A computer may be able to predict who was likely to survive or die in the sinking of the Titanic, all things - and data - being equal, but it cannot take into account what might happen if humans behave outside of the preset parameters. After all, the language used in these systems is 'highly precise'. As benign an example as predicting survival rates of an accident for which we already know the outcome might be, the reality is that these data sets have real impacts on real lives every day. Classifiers without context are inherently problematic - simply because context can and does change, and those changes are usually driven by human behavior.
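The point about preset parameters can be made concrete. Below is a minimal sketch - my own hypothetical rule, not Broussard's actual model - of a Titanic-style survival classifier. It hard-codes the 'women and children first' norm, and so only 'works' while people behave inside that norm:

```python
# A hypothetical, hard-coded survival classifier in the spirit of
# Broussard's Titanic example. The rule below is an illustrative
# assumption, not the model from the book.
def predict_survival(passenger):
    """Return 1 (predicted survived) or 0 (predicted died)."""
    # Encodes the 'women and children first' norm as a fixed parameter.
    if passenger["sex"] == "female" or passenger["age"] < 15:
        return 1
    return 0

# The classifier is 'highly precise' - and entirely blind to context.
# If human behaviour changes (a different ship, a different crew, a
# different era), the rule still fires in exactly the same way.
print(predict_survival({"sex": "female", "age": 30}))  # 1
print(predict_survival({"sex": "male", "age": 40}))    # 0
```

The precision is real; so is the blindness. Nothing in the rule can register a context where the norm it encodes no longer holds.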

Another challenge, perhaps less obvious, is that of understanding the language required to 'teach' technology. In Broussard's chapter there are around ten pages of numerical data that by Broussard's own admission are a challenge to process. By distancing ourselves from the ways in which technology 'learns' because they are too hard, we allow the data sets being designed to remain, consciously or unconsciously, oppressive, exclusionary, and dismissive. Creating a more diverse, more equal, and more just digital, technological, and/or social landscape requires input not just from those who speak that precise mathematical language but also from those who speak the nuanced, emotive non-mathematical ones.

Wednesday, June 10, 2020

On Safiya Noble - Algorithms of Oppression

It is the persistent normalization of Black people as aberrant and undeserving of human rights and dignity under the banners of public safety, technological innovation, and the emerging creative economy that I am directly challenging by showing the egregious ways that dehumanization is rendered a legitimate free-market technology project  (Noble, 14)

When my now 22-year-old son was around eight years old he asked me what an encyclopedia was - as I was explaining that it was a book where you could find information on almost anything, his face cleared and he proclaimed "oh, it was Google". I suspect that if I were to ask most people in my immediate, familial network for a definition of Google today, it would not differ greatly from my definition of an encyclopedia for Elijah.

Google has become a portal to all we want to experience and, for the vast majority of people, that is the reason it exists: to provide answers. Noble points out that this is, of course, not the raison d'être of the platform - it is not an information platform but first and foremost an advertising platform, and as such it favors advertising algorithms and not information algorithms. Google Search isn't a public resource but an advertising company whose priority and loyalty is to its major advertisers.

The algorithmic theory and practice behind platforms like Google may present as favoring the majority - if they rank number one, doesn't that mean everyone is looking for them? - but they are in fact favoring the advertising dollar. Is this necessarily problematic? Don't all media favor the advertising that ensures they are able to operate? They do, but not all media are as ubiquitous as Google, and a platform that is funded by and answerable to such a specific sector of society while proclaiming to serve all is inherently problematic.

In her study Noble applies a Black feminist lens to this problem - noting that in doing so she is asking questions that are pertinent precisely because they are not defined by the group that is served by these algorithms. Women of color are rarely afforded the humanity that is offered to white women (Nyasha Junior, Ph.D - https://www.bitchmedia.org/article/dont-we-hurt-like-you-black-women-mental-health-depression-representations) and this is exacerbated in the context of web searches. Noble uses the search term 'black girls' to illustrate her point - top results display hypersexualised women intended for the male gaze, and a predominantly white male gaze at that. This can be extrapolated to other minoritised groups - women, queer and trans people, ethnic groups - all are reduced to a result that serves an advertising algorithm created within a deeply divisive, racist, and sexist framework. Is it surprising, then, that when people search a term like #blacklivesmatter they are going to get results that lean heavily toward supporting a flawed system - since that flawed system is the one picking up the bill for the continuance of the platform being used? This is the normalizing of the aberrant that Noble speaks of. It is not simply that these algorithms function this way - it is that we, as users, are conditioned to see this as how things are. This is the way the majority thinks and therefore it is normal.

Questions I ask or searches I run as a cis-bi woman, or those raised by my friend Alex, a trans man of color, might have a different focus to those of Noble, but they highlight the same problems: that minority voices are oppressed as much by the algorithms driving digital media as they are by actual people - perhaps more so, given that all users are conditioned to take the results at face value. Platforms like Google are built by and operate within a privileged, cis-het, white, male framework. Search results are rendered within that framework to serve that framework, putting the labor of correcting these results - not to mention refuting them - back on the minorities whose voices are being oppressed.


Wednesday, June 3, 2020

On Pugliese: Death by Metadata: The Bioinformationalisation of Life and the Transliteration of Algorithms to Flesh

"... if you gather enough metadata, it will supplant the need for 'content'; and that human targets, in the context of meta-driven kills, become so somatechnically instrumentalised as to be  entirely coextensive with the technology they use - in this case, their phones." (pp 4 -5)

While most people are mildly miffed to discover that Facebook, Google, and YouTube are using their data to decide which Wish advert or which politician to force into their social feeds, few are concerned with what their metadata is, let alone where it might situate them in the eyes of - for want of a better term - the powers that be.

Our cellphones are increasingly more akin to our computers than to a telephone - and our increasing reliance on them is the very thing, according to Pugliese, that puts us in the quite literal firing line. He cites the 2014 Reprieve report, noting 874 unknowns killed by US drone strikes in the hunt for 24 targeted individuals, with 96.5% of drone-strike casualties being civilians (p. 7). Metadata allows for very precise identification in terms of location but dispenses with the need to ensure that the hand holding the phone is in fact that of a terrorist. Collateral damage takes on a new meaning. "The Reprieve report documents the manner in which certain targeted individuals have been listed as having been killed up to six times, with the result that dozens of unknown civilians have actually been killed by the time the reporting process authenticates a targeted strike." (p. 7)

The ability to locate and 'identify' metadata in this way allows the individual(s) charged with location to create a template that then identifies the 'owner' of the metadata without actually truly identifying them. A cellphone's electronic identifiers become the user's identifiers - and, more worryingly, can be used to 'locate and identify' anyone in proximity. If they are there, they are that person - or, more accurately, that cellphone.

A drone strike on December 12, 2013, in Yemen killed twelve people in a wedding procession. According to the Pentagon, all who were killed or wounded in the strike were Al Qaeda militants, thus making the strike both lawful and necessary. Testimony from survivors is denounced - and with it their very humanity. By reducing the human targets to algorithms and metadata, the attackers relieve themselves of the need to consider such ideas (and ideals) as innocence, guilt, and humanity. They are simply targets - and, as a postscript, hopefully the correct targets.

When the victims of strikes are so decimated that it is impossible for survivors to tell child from adult, human from animal, they are stripped of their remaining human identifier - recognition by survivors. Furthermore, they become what Pugliese refers to as a "violent enmeshment of the flesh and blood of the body with the geopolitics of war and empire" (p. 13). This geobiomorphology gives the physical landscape flesh and tissue - that of those who minutes before passed through it and who now have become a part of it. The faceless, humanless attacks, designed and controlled through algorithms, are transformed into creations of actual flesh.

A drone strike might be a way to give precision to an attack, to render it facile, to allow for an objective observation. The resulting landscape of shredded flesh, spattered blood, and dismembered humanity serves to remind us that it is none of these things.

On Galloway: Protocol

The diagram: the distributed network (structural form without center that resembles a web or meshwork)

The technology: the digital computer (an abstract machine able to perform the work of any other machine) 

The management style: protocol (the principle of organisation native to computers in distributed networks)

         from Galloway, p. 3


In considering the periodizations of modern and postmodern society, Galloway recognises (citing Foucault) the sovereign societies of the classical era, during which power was centralized with the sovereign (and stepped down through the hierarchy) and underpinned with violence and coercion in order to command and control. Modern disciplinary societies were underpinned by bureaucratic command and control. This has now shifted to the decentralized societies of control rather than of discipline.


In the computerized postmodern age, command and control are found in computerized information management and networked computers, at the core of which is protocol (p. 6). These commands and controls are not just snippets that make hardware function so we can do our jobs or talk to Great Aunt Gertie back in England; they are measures of control that are insidiously present in day-to-day living by the mere fact of the extent to which our world is computerized and networked. Regardless of what we use our computers or our devices for, or when, or how, we are all constrained by the protocols by which these machines operate. As users we do not control these protocols - much of the time we are unaware of their presence, let alone their function. Our web pages, for example, comply with Hypertext Transfer Protocol, yet few of us know what that is or how it functions - and yet we submit to its control. These protocols permit us to perform - possibly even to function, in many cases - secure in the belief that we as computer operators are in control, unaware that we are being controlled. Galloway gives the extreme but pertinent example that the simple removal of a '.' from a piece of code can remove an entire nation from a screen, a network, and to all intents and purposes existence. Extreme? Not when we consider how many times a day we double-check a URL or an email address to be sure we have the correct spelling. Perfect spelling is not the end goal; compliance is.
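Galloway's point about compliance can be seen in miniature in HTTP itself. The sketch below - a toy parser written for illustration, not real server code - accepts a request line only when it matches the protocol's exact grammar (method, path, and version separated by single spaces):

```python
# A toy illustration of protocological rigidity: an HTTP/1.x request
# line must follow an exact grammar, or the exchange does not happen.
def parse_request_line(line: str):
    parts = line.rstrip("\r\n").split(" ")
    # Exactly three space-separated fields, the last beginning "HTTP/".
    if len(parts) != 3 or not parts[2].startswith("HTTP/"):
        raise ValueError("malformed request line: the protocol refuses it")
    method, path, version = parts
    return method, path, version

print(parse_request_line("GET /index.html HTTP/1.1"))
# ('GET', '/index.html', 'HTTP/1.1')

# One missing character - the '/' in the version - and we are refused:
# parse_request_line("GET /index.html HTTP1.1")  raises ValueError
```

There is no negotiation with the parser: the request is either fully compliant or it does not exist for the machine on the other end.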


Protocols are, according to Galloway, techniques for achieving voluntary regulation within a contingent environment (p. 7). Because of their universality - HTML and CSS comply with the same protocols regardless of where the user lives, what language they speak, or what they use the technology for - they allow local devices to communicate with foreign ones. Protocols enforce compliance - albeit in a non-violent manner. Information sent from one machine is fluid and non-hierarchical but is then processed by a machine that redefines it in a rigid hierarchical manner before it again becomes the fluid component ahead of the next step.

The result is a communicating, non-hierarchical, peer-to-peer relationship between machines and within networks, on which we depend for the ongoing functioning of postmodern society.



On Zeynep Tufekci: Twitter and Tear Gas: The Power & Fragility of Networked Protest (Platforms & Algorithms)

When social media began emerging in 2005 it altered and shaped how we - users, producers, consumers, audiences - behaved, not just when on these platforms - Facebook, Twitter, YouTube, et al. - but off them as well. Over the past fifteen years, social media has shaped, among other things, how we define ourselves, how we consume media, and how we respond to calls for protest and reaction. Online civic spaces have moved from individual blogs to the behemoths of social networks that dominate our screen time (p. 34). Controlled by algorithms set by the corporations that own the platforms, these privately owned spaces have become our public space to share, to learn, to consume, to market, and to protest. Spaces that on the surface seem to be 'public' are in fact controlled, manipulated, and adapted to meet corporate requirements. Concerns raised by scholars in the early years of social media as to whether these platforms would be restrictive, would enforce censorship, would sell user information have all been realized. Facebook, Twitter, and YouTube regularly censor content. Images of same-gender couples kissing are regularly banned for violating community guidelines, posts considered to spread 'fake news' are removed, and posts that might be considered 'hot under the collar' are at best 'screened' and at worst removed. At the same time, platforms permit images that cater to male-gaze-driven porn, allow hate speech from groups because they have bought advertising, and Zuckerberg recently went on record saying social media shouldn't fact-check politicians (while still removing 'ordinary' user posts the algorithm deems unacceptable - even if they're not hate speech, violent, or untrue).


Let us be clear - social media exists not to provide a platform for users but to create money for the parent corporation. Its success is dependent on attracting users in mass numbers, retaining them, and utilising them in such a way that they can be monetised. These are networked public spaces that are privately owned, with corporate owners making the rules. Facebook has used - and continues to use - its real-name policy to shut down groups and pages like "We Are All Khaled Said" without having to be accountable for censorship - violating a business rule is just cause for closure. This policy, while lucrative for the overarching corporation, is dangerous for individuals. People of colour, LGBTQI+ groups, and people of diverse faiths are required to put themselves in danger in order to satisfy the Ts & Cs of a platform that will then censor their activism should it become uncomfortable. Reliant on community policing - on one user reporting another - these policies are underpinned by US laws that require only that the platform remove content they are told violates the law (p. 143). This targets any user who is commenting on or advocating for anything socially or politically sensitive - I have had a photo removed for a community-standards violation that showed two men, in tuxedos, exchanging wedding vows. Activists especially are at risk of being reported, harassed (online and off), and physically harmed.


The real-name policy, 'think of the children' censorship justifications, spam policies, and verification practices that require sending legal documents through a system that is at best fragile all align a platform with commercial and legal models that prioritise the bottom line. These policies are rarely consistent or even... comprehensible. They disadvantage minority groups that are socially and politically vocal and hide behind protections rarely afforded to those using these platforms as a space to convene, dialogue, and protest.


Saturday, May 23, 2020

On Hans Magnus Enzensberger, Constituents of a Theory of the Media.


"A revolutionary plan should not require the manipulators to disappear; on the contrary, it must make everyone a manipulator." (Enzensburger , 20)

Enzensberger's Constituents of a Theory of the Media is not a complicated piece, but neither is it necessarily simple. His observations that new media, in forming new connections, are also forming a universal system; that electronic media have a mobilising power; that consumers are now also producers by virtue of platforms and hardware and technology; and that every use of media presupposes manipulation are as applicable today as they were when written in 1970.


With the rise of personal computers, smartphones, tablets, eReaders, and wearable technology ranging from watches to spectacles, media equipment is not just a means of consumption but also of surveillance, of control, of command, and of production. Combined with the rise of platforms - Facebook, Amazon, and Google, to name only the most obvious and ubiquitous - media is no longer confined to the most obvious formal formats, and with that shift has come the ability to consume, produce, and disrupt those formats.

All media, Enzensberger declares, are by definition manipulative. If we set aside the visceral reactions most of us have to the connotations of manipulation, we can see the truth of this. Even a piece of poetry is asking the reader to at least see, if not accept, through the lens of the writer. More so the work of fiction (be it written or filmed), more so the newspaper article (or news broadcast), the textbook, the lecture, the manifesto... Even if these were created by a machine, that machine has been programmed and carries with it the biases of the programmer.

An egalitarianism is inherent in this - as long as an individual has the equipment necessary, they are able to respond. To act. This ability to take action is one of the hallmarks of new media (21).

Response, manipulation, the ability to produce - these characteristics give new media (whether by Enzensberger's 1970 definition or by our 21st-century definition) the power to mobilise. The 'masses', for want of a better word, are no longer simply subjected to that which is chosen by those in spaces of control and command, but have the ability to respond and react through media. Whether it be through social media, social sharing, independent publishing, or independent creation - the slush pile no longer silences to the extent that it once did. Where once private or independent creation and production might have been, as Enzensberger called it in 1970, no more than a licensed cottage industry (p. 22), it is now characteristic of a shift from a power structure that permitted only a select few a voice to one that is at least accessible to more - making everyone a manipulator.
This surface equalisation provides for a somewhat messy, somewhat noisy space - and one that is far from apolitical. Despite pressures on producers - whether mainstream, independent, social, or otherwise - to remain within the frameworks of the socially and aesthetically irrelevant but acceptable (cat GIFs and Supernatural memes, anyone?), these spaces by nature present a dynamic relationship between consumers, producers, and platform providers, with the boundaries between each of these blurring more and more on a daily basis.

There may be, Enzensberger states early in the article, no Marxist theory of the media - but that does not leave media without philosophical boundary or framework, even if that framework is in a constant state of flux as the individual's role within their relationship with media continues to evolve.

Friday, May 22, 2020

On Srnicek & Williams: Post-Work Imaginaries

Taken from their book, Inventing the Future: Postcapitalism and a World Without Work, Post-Work Imaginaries outlines Srnicek and Williams's vision of what a post-work society could look like. These ideas are necessarily idealistic yet grounded in contemporary reality, and demand a shift in political equilibrium (Srnicek & Williams, 55).

They argue that society must accept if not embrace full automation, reduce the working week without reducing pay, provide a universal basic income, and diminish if not abolish the work ethic. Indeed, some of these ideas, utopian or not, seem to be in the spotlight as globally we operate in a society informed by Covid-19 responses and the economic impact of the pandemic and of quarantines.

Full automation, they claim, "...would aim to liberate humanity from the drudgery of work while simultaneously producing increasing amounts of wealth" (55). While most of us see automation in terms of decreased employment, decreased incomes, increased problems, and increased profit for the capitalist, Srnicek and Williams argue that resisting automation requires us to choose between freedom and abundance. Automation necessitates, they argue, higher wages (automation is not a viable option when labour is cheap), organisational change, and reskilling. While the ideal is full automation, it is unlikely to be achievable for several reasons: machines are notoriously bad at completing creative tasks; the cost of the machines required offers a lower profit margin, making it less interesting to the capitalist (even though this can be countered by full automation); and the moral status we give to certain roles, such as care work. Therefore, since labour cannot be fully or immediately eliminated, the demand for full automation simply aims to reduce necessary labour as much as possible (58).

Reducing the working week is the second demand, and one that has to some extent been realised in some places; recently Prime Minister Jacinda Ardern spoke of a four-day working week as a possible solution to economic constraints faced by some small businesses after the 2020 lockdown. The idea is not new: Srnicek and Williams mention both Lafargue and Keynes pointing to three-hour working days, and a shortened week was at the centre of Marx's post-capitalist vision (58). Reducing the working week - ideally with a three-day weekend - is not just good for worker health (both physical and mental) or environmentally advantageous, they argue; it is a political demand that shifts (some) power to the worker. Potentially it can bring recognition to unofficial, unpaid labour by bringing attention to it, and increase productivity.

Alongside a reduced working week, they call for a Universal Basic Income (UBI). Such an income must, by definition, provide a sufficient base income on which to live, must be paid to everyone without condition, and must be a supplement to the welfare state, not a replacement thereof. Again, by making work voluntary rather than coerced, power shifts back toward the worker, allowing flexibility of timetable and activity. Equally, a UBI would regulate to some extent the values attributed to work, shifting the focus from profit (61). Wages would compensate for the nature of a role and not its potential profitability - which may not be of immediate interest to the private capitalist but would certainly be of interest to the worker. A UBI would also recognise roles such as care work - often dominated by women - that have traditionally been ignored when defining and recognizing labour.

The first three 'demands', however, rest on the concept of diminishing the work ethic. The work ethic is considered a high-value trait from every perspective except that of remuneration. Instead it is exploited to ensure capitalist interests are expanded and profits grown. Only when the work ethic is reduced can full automation, a reduced working week, and a UBI become the foundation of a balanced and equitable society.



Srnicek, Nick, and Alex Williams. Inventing the Future: Postcapitalism and a World Without Work. Verso, 2015.