::fibreculture:: Paranoia is Real: Algorithmic Governance and the Shadow of Control

Ned Rossiter ned at nedrossiter.org
Fri Jun 9 13:37:35 CEST 2017


Fake News from the Art and Politics Bureau
Art and Design Black Box, National Institute for Experimental Art, UNSW
9 June 2017
http://niea.unsw.edu.au/events/fake-news-art-and-politics-bureau-one-day-event


Paranoia is Real: Algorithmic Governance and the Shadow of Control

Ned Rossiter

In his economic history of the present, Philip Mirowski writes the
following: ‘In the topsy-turvy world of neoliberalism you may think that
you are busily expressing your innate right to protest the cruel and
distorted state of the world; but in most cases, you are echoing scripts
and pursuing an identity that has already been mapped out and optimized
beforehand to permit the market to evaluate and process knowledge about
you, and convey it to users with deepest pockets’.[1]

Let’s unpack this a little more. What are the scripts that predetermine
our action in the world? Well, most immediately, they are socially
acquired behaviours that we learn and reproduce across a range of
institutional, cultural and political settings. We rehearse and perform
various identities throughout our life. But what of the algorithmic
dimension to such scripts? What are the rules and parameters by which
our gestures – political or otherwise – are signalled to people and
machines, animals and things? Can a distinction be made between real or
true gestures and their fake equivalents? All gestures and actions are
necessarily rehearsed and performed. Even spontaneity has its
precedents. There is never an original to which a reproduced gesture may
refer back. Rather, we inhabit what Baudrillard impressed upon
readers of so-called postmodern theory a couple of decades ago as *the
simulation of the real*. Baudrillard was never a believer in fakes.
Neither was Warhol. Or rather, fake for them was the new orbit of reality.

So why, now, have notions of post-truth politics and fake news gained a
renewed currency? Of course the immediate reference here is to Trump.
One can also point to the ways in which platform capitalism organizes
our experience of the world through parametric architectures predicated
on the logic of the filter. But it seems to me the post-truth, fake news
world is more symptomatic of the return of positivism and the pervasive
reach it holds across disciplines that should know better. Knowledge has
submitted to regimes of measure and calculability that are the
techno-ontological core of the digital.

An epistemic horizon of neo-positivism, in other words, conditions the
legitimacy of post-truth, fake worlds in which the analytical capacity
to decide and distinguish is subordinate to the power of affect coupled
with the vulnerability of subjectivity parsed with algorithmic machines.
To orchestrate a foundation for legitimacy, discourses, practices and
imaginaries are correlated with technologies of extraction and
calculation. Subjectivity is modulated in ways that gravitate toward
collective self-affirmation and the promise of security. The modern
history of fascist movements demonstrates this well, as does the popular
story by George Orwell, which is why Trump is so easily drawn into that
trajectory of control.

The call to this event invites us to explore fake tactics as a mode of
intervention. I would like to flip this around and consider strategies
of coping. I’m less interested in therapies of the self here than what I
would call *paranoia as method*. This is an idea and analytical proposal
I only gestured at in my book from last year on logistical media theory.
So I thought I would take the occasion of this event to develop a method
of paranoia as a diagnostic device that might assist our political and
subjective orientation in worlds of algorithmic governance and data
economies. Consider this a form of shadow-knowledge.

With the Snowden revelations of the NSA’s PRISM surveillance machine,
the scale and scope of paranoia are grafted to the modulation of affect,
intensity, and uncertainty to the extent that new techniques, methods,
and tactics are required if political movements, corporate secrets, and
government communiqués are to design cryptographic systems robust enough
to withstand the analytic reach of NSA surveillance programs and their kin.

The British filmmaker Adam Curtis is probably one of the most consistent
practitioners of paranoia as method. Helped along by repetitive strains
of eerie Brian Eno soundscapes that tie together Curtis’s tantalizing
editing of archival news and documentary footage, his series of films
exploits the
verisimilitude of the documentary genre in an analysis of geopolitical
power and the manufacturing of society gone to the dogs. Key titles
include: The Century of the Self (2002), The Power of Nightmares (2004),
The Trap: What Happened to Our Dream of Freedom (2007), All Watched Over
by Machines of Loving Grace (2011) and HyperNormalisation (2016). This
last film in particular homes in on the systemic production of fakeness.

Yet the question of fake news seems to me predicated on the logic of
representation. But if we are in general agreement that, following Félix
Guattari, our epoch is one that has moved from a logocentric world to a
machinic world, a world of ‘complex assemblages of individuals, bodies,
materials and social machines, semiotic, mathematical, and scientific
machines, etc., which are the true source of enunciation’,[2] then the
critical question for this meeting today becomes how to register
fakeness when meaning is no longer tied to representation but rather to the
algorithmic production of subjectivity and the politics of sense and
sensation (or what more frequently goes by the name of affect).[3]
Probing just one component of media-ecological regimes of governance and
control takes us to the operation of algorithms. Governance within the
general ecology of our media condition is orchestrated by algorithmic
calculations of anticipation and pre-emption.

For German media philosopher Erich Hörl, the ‘general ecology’ of the
technosphere analyses the contemporary condition of governance and
cybernetic control in a technical world. Hörl maintains we are in an
‘environmental culture of control that, thanks to the radical
environmental distribution of agency by environmental media
technologies, ranging from sensorial to algorithmic environments, from
bio- to nano- and geotechnologies, renders environmentality visible and
prioritizes it like never before’.[4] Yet environmentality understood as
a new idiom of control is only visible in as much as it manifests on a
scale of perceptible transformation.

If we adopt the paranoid precept that everything is open to inspection,
then our next move would be to ask what, then, is made visible and
knowable? And, who cares? The infrastructural and technical components
of environmental media are more often highly secluded and inaccessible
data facilities, or computational systems operating in the background of
routine transactions, processes and practices. The political question of
power goes beyond a philosophical politics of sense, theory and
concepts.[5] To attribute a politics to such struggles of thought we
would need to identify the institutional and geocultural terrains in
which conceptual dispute is materialized. And that’s when paranoia
begins to set in.

I agree with Hörl that a techno-environmentality paradigm succeeds and
displaces the primacy of human agency and the bind of reason. There’s an
embarrassing juvenility that attends the human pretence of control.
Though I would sideline the question of politics as a problem for
theory (‘decision design’) and instead ask how environmental media
relates to the organization and politics of movements. This is a
question I have been addressing with Geert Lovink in our writings on
organized networks (or orgnets) over the last decade or so. In terms of
a program for orgnets operating within these sorts of parameters, one
critical question concerns how to organize in ways that are responsive
to new infrastructures of distribution and new agents of power.

A techno-ecology of robots and automation receives a steady stream of
reporting in the mainstream press and tech-magazines. The eradication of
jobs is the common narrative across these reports. The displacement of
the human as the primary agent of change in the world is thus coincident
with the increasing extension of technical environments that manage
social and economic life. Why don’t we switch our attention instead to
architectures of inoperability? One tiny (unknown) disruption and the
robot falls silent – that’s the new certainty of our age, where ‘the
“assembly life” [has] replaced the assembly line’.[6]

With this idea of assembly life in mind, and in pursuit of paranoia as
method, I will now briefly look at security aspects of logistical media
and cloud software services, particularly enterprise resource planning
software (or ERP) used to organize human resources, staff productivity,
student activity and general organizational matters relating to the
management of universities and the optimization of performance.

The worry over backend access is a common one for adopters of ERP
software. SAP, one of the largest developers of enterprise software, are
also known for their backend access to organizational operations. Like
other players in this sector, they justify this on the basis of customer
support services, though it’s not hard to envisage instances where such
access is exploited for purposes of insider trading, jumping trades in
the stock market, etc. I mean, why not?

Microsoft Office 365 claims not to do this: ‘Microsoft builds no back
doors and provides no unfettered governmental access to your data’. But
a well-known feature of enterprise software, including Office 365, is
telemetry, which enables organizations to collect usage data about
documents and software. This is stored in a central database and
accessed via dashboards to provide ‘comprehensive analytical and
reporting capabilities’.[7]
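
To make the mechanics of telemetry a little more concrete, here is a
minimal sketch in Python of the kind of usage aggregation such a system
performs. It is an illustration only, not Microsoft’s actual Office
Telemetry pipeline; the event fields, users and documents are invented.

    from collections import Counter

    # Hypothetical usage events of the kind a telemetry agent might log.
    events = [
        {"user": "staff_017", "document": "budget_2017.xlsx", "action": "open"},
        {"user": "staff_017", "document": "restructure_plan.docx", "action": "edit"},
        {"user": "staff_023", "document": "budget_2017.xlsx", "action": "open"},
    ]

    # The 'central database' reduced to an in-memory tally; a dashboard
    # would surface figures like these as usage analytics.
    opens_per_document = Counter(e["document"] for e in events)
    activity_per_user = Counter(e["user"] for e in events)

    for document, count in opens_per_document.most_common():
        print(document, count)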

In one tech-vert spruiking the benefits of Office 365 and DLP (data loss
prevention technology), Sean Gallagher – ‘a former Navy officer, systems
administrator, and network systems integrator with 20 years of IT
journalism experience’ – tells us that:

‘Exchange 2013 and Office 365 (O365) include a new feature that can peek
into e-mail messages and enclosed documents and then flag them, forward
them, or block them entirely based on what it finds. This sort of data
loss prevention technology has become increasingly common in corporate
mail systems, but its inclusion as a feature in Office 365’s cloud
service makes it a lot more accessible to organizations that haven’t had
the budget or expertise to monitor the e-mail lives of their employees’.[8]
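
The mechanics Gallagher describes can be sketched in a few lines. What
follows is a generic illustration of rule-based content scanning of the
kind DLP systems perform, not the Exchange/Office 365 engine itself; the
patterns and actions are invented for the example.

    import re

    # Invented rules pairing a pattern with an action; real DLP policies
    # are far richer (card numbers, health records, keyword dictionaries)
    # and route matches into compliance workflows.
    RULES = [
        (re.compile(r"\b\d{16}\b"), "block"),         # looks like a card number
        (re.compile(r"confidential", re.I), "flag"),  # sensitive keyword
    ]

    def inspect(message_body):
        """Return the first action a rule triggers, else deliver."""
        for pattern, action in RULES:
            if pattern.search(message_body):
                return action
        return "deliver"

    print(inspect("Quarterly results attached - confidential draft."))  # flag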

But really, we already knew that our email was open to inspection, even
before the Snowden leaks. So what are some of the core problematics we
face as researchers, teachers, artists and activists when it comes to
the digital production of knowledge? And how does the question of
fakeness play into this? One key issue at stake here is epistemological,
the other is infrastructural. Both are political.

As Noortje Marres observes in her recently published book, Digital
Sociology:

‘… when social researchers take up online instruments of data
collection, analysis and visualization they enter into highly troubling
relations of dependency with the infrastructures and organizations that
make them available. As social researchers take up online tools, we too
sign up to the terms of use stipulated by digital industries, whether we
are aware of it or not’.[9]

So what’s to be afraid of here? Data extraction and financialization are
central to the economies spawned by digital infrastructures of
communication. Noortje’s focus is on the ethical implications that
attend the generation of data and knowledge from online tools that are
integrated with technologies of capture that seek to extract value from
populations under scrutiny. There is also the political question of how
to organize in ways not dependent on the digital infrastructures of
platform capitalism. But who’s got a plan? Over the past decade the
geopolitical shift to global markets and centres in East Asia has
had an enormous impact on the economic and social fabric enjoyed in
North America, Australia and Europe for a few decades following World
War 2.
With new technologies of automation now impacting employment prospects
across the world, what happens when 20, 40, 60% of the population is
written off, without a job, and sliding into a life of destitution below
the poverty line? Democracy as an orchestrated ensemble of the elites
falls apart. Even states with the seeming stability of authoritarian
capitalism, such as China, will rapidly struggle to govern populations
in conditions of mass crisis.

The creation of new institutions will only happen once the old ones have
gone. Foucault’s criticism of revolution was that inevitably the new
guard simply end up occupying the warmed-up seats of the old guard. ‘In
order to be able to fight a State which is more than just a government,
the revolutionary movement must possess equivalent politico-military
forces and hence must constitute itself as a party, organised internally
in the same way as a State apparatus with the same mechanisms of
hierarchies and organisation of powers. This consequence is heavy with
significance’.[10] While an element of structural determinism lurks
within Foucault’s response to his Marxist interlocutors, his statement
nonetheless invites the question: what is the difference between
revolution (as a reproduction of the same) and taking control of the
infrastructures of those in power? Neither results in the invention of new
institutional forms. When movements organize as a party the possibility
of alternatives is extinguished. This is the brilliance of Foucault’s
analysis, and a position that Jodi Dean reproduces in her valorization
of the party as the primary vehicle for political articulation.[11] In
both cases, however, there is nowhere left for radical politics within
organizational apparatuses of equivalence.

The issue I raised earlier around the correlation between
neo-positivism, data analytics and the epistemological status of
knowledge as either fake or true also requires a bit more fleshing out.
We have invested so much epistemological weight in the power of numbers
and the calculability of things that fake power is now super-hegemonic:
it is the norm, and it was so years before Trump came on the scene. Much
of what counts in assessments of research impact rests on the ability to
persuasively mobilize statements supported by statistics. Preferably a
researcher is able to justify their claims with reference to rankings
and citation statistics produced by the major commercial entities who
confer legitimacy upon university declarations of quality and excellence.

What, then, are some techniques and tactics we might deploy to combat
the regime of fakes that commands and insists on its authority over the
world, in our jobs and over our lives? How do we tell fake power to fuck
off? As
bitcoin critic Brett Scott recently tweeted:

Suitpossum
The world is not data. The world is soil, sun, water, bodies,
communities, sweat & oil. Data is an echo of these. It is not ‘the new oil’.
22 May 2017, 10:29 pm

Of course Scott is right to remind us that the spectrum of life cannot
be subsumed by technologies of metricization. There is indeed life
beyond data economies and parametric architectures. But, regrettably,
data is the new oil. So the trick is to learn how to live with it. One
strategy is to raise the stakes of the fake. This would be a
Baudrillardian gesture, I guess: amplify the fake and foreground the
limits of phoney regimes of governance and control by showing how
all-too-real they are. There is a long history in theatre and
performance that undertakes this work and we have a prime example of
that with us today in Simon Hunt’s anti-hero of Pauline Pantsdown. The
Yes Men would be another. Years earlier, renegade philosophers,
pranksters, artists and activists associated with Guy Debord and the
Situationists were among the many who belong to a tradition of
unsettling perceptions of the given.

In search of antecedents for paranoid methods, one exemplary cinematic
text is John Frankenheimer’s The Manchurian Candidate (1962). Featuring
a remarkable Frank Sinatra who embodied so well the disturbing intensity
of the paranoid subject, this ‘neo-noir Cold War suspense thriller’, as
Wikipedia tells us, ‘… concerns the brainwashing of the son of a prominent
right-wing political family, who becomes an unwitting assassin in an
international communist conspiracy’.[12] The film navigates the tension
between refusal and capture, between situation awareness and the clawing
intuition that things are not what they seem, but you’re not really sure
why. The latter most closely approximates paranoia as method.

We know, or at least are told often enough, that algorithms increasingly
govern our encounter with the world. But most of us have no idea how
they do that, nor the extent to which our tastes and predilections, our
desires and fantasies are shaped by machinic operations devised by some
sweaty-palmed nerd strapped to his console. At least that’s the general
imaginary we draw on to explain alienation in the age of algorithmic
control.

Paranoia need not be treated exclusively as a personality disorder. In
the assessment of social normativity, disorders of many kinds are
situated on the edges of bell curves that index the distribution of
personas. But rather than cage paranoia as a condition of psychotic
illness, self-grandeur, conspiratorial fears or, as Burroughs put it,
‘delusions that your enemy is organized’, my preference is to unleash
paranoia as a widespread sensation of impending disaster. How might we
‘program the sensory order’, as McLuhan put it in his review of
Burroughs’ Naked Lunch and ‘the new electric environment’?[13] Here’s
McLuhan’s elaboration:

‘The central theme of Naked Lunch is the strategy of by-passing the new
electric environment by becoming an environment oneself. The moment one
achieves this environmental state all things and people are submitted to
you to be processed. Whether a man takes the road of junk or the road of
art, the entire world must submit to his processing. The world becomes
his “content”. He programs the sensory order’.

The idea of ‘reprogramming’ the sensory order is not something new to do
with code and scripting, but is fundamentally about repetitive and
ritualistic exposure of self/others to the same variables over time. It
is a cybernetic operation that lies at the core of human society and the
technics of modelling the world in ways that produce sensory regimes
specific to the arrangement of technical devices, social systems and the
generative force of contingency. The exploration of sensory order is a
topic of investigation for many artists. They produce environments in
which the technics of perception and experience, sense and sensation are
tested in ways that signal the media-technological horizon of the
future-present.

Before moving to a conclusion I would like to briefly survey the work of
a few artists engaging the paranoid logic that underscores contemporary
modes of orchestrating experience. Some of these works take us back to
the fundamentals of vision. Light in James Turrell’s work, for instance,
is explored for its properties – not as that which illuminates things,
but for the thingness or spectral properties of light itself.[14] The
earlier work of Olafur Eliasson, which is about ‘seeing yourself
seeing’, explores a similar theme of over-exposure.[15] Examples such as
these prompt us to ask how the quality of light produces regimes of
vision that inflect knowledge within a particular spectrum.

Other works such as Sophie Calle’s The Detective (1980) have a kind of
Douglas Sirk quality to them, where an interior world of daily routine
is peeled open to expose the banal melodrama of suburban life.[16] Vito
Acconci’s Following Piece from 1969 explores a similar theme, as do
countless films of suspecting wives and cheating husbands (or cheating
wives and jealous husbands).[17] In the case of Calle’s work, she asks
her mother to hire a detective to report on Calle’s daily activities,
providing photographic evidence of her existence. The artwork consists
of a series of photographs taken of Calle in the street, in a park, at a
café, and so on. The photographs are accompanied by a ledger reporting
briefly on both the detective’s and Calle’s activities across the hours
of the day. We read that at 8pm ‘The subject returns home. The
surveillance ends’. Unbeknown to the detective, Calle has requested that
François M., a friend or acquaintance of Calle’s, wait outside the
Palais de la Découverte at 5pm and follow whoever appears to follow Calle.

The artwork ends with a series of pictures of what is presumably the
detective, camera in hand, and a short note reporting on what François
has observed. This recursive instance is designed to reassure the viewer
that the staging of Calle’s documentation by a detective really did
happen. But it also has the effect of reiterating that the entire work
may also be an exercise in the production of fake truth. What we read
and see on display might just as well be a demonstration of expectations
vis-à-vis the fidelity of convention with regard to the genre of
detection and surveillance. The work is also highly media specific.
Today the paradigm of control corresponds more closely to an
algorithmic imaginary of the NSA surveillance machine that penetrates
the depths of code to punish subjects who don’t conform.

The repetition of experience, action, documentation and deduction across
these various works has an algorithmic dimension in as much as
algorithms are also repeatable routines executed with consistency over
time. As Tarleton Gillespie reminds us, the term algorithm for software
engineers ‘refers specifically to the logical series of steps for
organizing and acting on a body of data to quickly achieve a desired
outcome’.[18] While there is often nothing particularly quick about
decision making within government institutions, the idea of governance
beyond the state would, I think, overlap considerably with this
computational definition of algorithms.
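
Gillespie’s definition is easy to render literally. The following few
lines of Python are a deliberately trivial illustration of a logical
series of steps for organizing and acting on a body of data; the data
and the ranking criterion are invented for the purpose.

    # A fixed series of steps that organizes a body of data (items with
    # scores) and acts on it to reach a desired outcome (the top-ranked
    # items). The records and scores here are illustrative only.
    records = [("item_a", 0.41), ("item_b", 0.87), ("item_c", 0.63)]

    def top_results(data, n=2):
        ranked = sorted(data, key=lambda pair: pair[1], reverse=True)  # organize
        return [name for name, _score in ranked[:n]]                   # act: select

    print(top_results(records))  # ['item_b', 'item_c']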

The fake news distributed through contemporary digital meme culture
holds a temporality of the instant. The aesthetic keys in the works of
Acconci and Calle register a mode of distribution with considerably
longer duration. What I am trying to extract from these various
accounts of cultural production is the manner in which *media determine
our situation*. The temporality of the signal/message/reception ratio is
stretched, even if the spatial distribution is far more contained within
the circuit of the art system and its economy. Yet the eleven-year
interval between the works of Acconci and Calle also extends into the
time and space of Hollywood’s dream machine and then again into other
world cinemas exploring noir themes of paranoia and self-inspection. We
could also carry this over to the cultural industry of pulp fiction.

In a way fake news has no regard for scale anyway: the so-called
intention to mislead through the cultivation of post-truth truths is
often enough an exercise in self-affirmation for individuals,
communities and populations. Whether this happens for one person or one
hundred million people is perhaps beside the point since both the effect
and affect are the same: the yearning for imaginaries of security in
a world underscored by chaos and destruction.



Notes
1 Philip Mirowski, Never Let a Serious Crisis Go To Waste: How
Neoliberalism Survived the Financial Meltdown (London and New York:
Verso, 2013), 331.

2 Lazzarato’s Signs and Machines quoting Guattari’s Machinic Unconscious.
Quoted by Erich Hörl, ‘Introduction to General Ecology’, in Erich Hörl
with James Burton (eds), General Ecology: A New Ecological Paradigm
(London: Bloomsbury Academic, 2017), 16.

3 Cf. Hörl, 15.

4 Ibid., 9.

5 Ibid., 14.

6 Sylvère Lotringer, ‘Better than Life’, Artforum International 41
(April, 2003): 194-197, 252-253.

7
http://searchenterprisedesktop.techtarget.com/tip/Office-Telemetry-reports-on-Office-2013-docs-and-apps


8 Sean Gallagher, ‘Trigger word: E-mail monitoring gets easy in Office
365, Exchange’, Ars Technica, 3 May 2013,
https://arstechnica.com/information-technology/2013/03/trigger-word-e-mail-monitoring-gets-easy-in-office-365-exchange/


9 Noortje Marres, Digital Sociology (Cambridge: Polity Press, 2017), 182.

10 Michel Foucault, ‘Body/Power’, in Power/Knowledge: Selected
Interviews and Other Writings 1972-1977, edited by Colin Gordon, trans.
Colin Gordon, Leo Marshall, John Mepham and Kate Soper (New York:
Pantheon Books, 1980), 55-62.

11 See Jodi Dean, Crowds and Party (New York: Verso, 2016).

12 https://en.wikipedia.org/wiki/The_Manchurian_Candidate_(1962_film)

13 Marshall McLuhan, ‘Notes on Burroughs’, The Nation, 28 December
1964, 517.

14
http://www.abc.net.au/news/2014-12-11/james-turrell-exhibition-opening-at-national-gallery/5961876

15 Your Blue After Image Exposed, 2000,
http://olafureliasson.net/archive/artwork/WEK101381/your-blue-afterimage-exposed
and Feelings are Facts, 2010,
http://olafureliasson.net/archive/exhibition/EXH101122/olafur-eliasson-and-ma-yansong-feelings-are-facts#slideshow


16
http://scottbankert.net/UrbanArtsParis/Readings/Additional%20Readings/Sophie%20Calle_The%20Detective.pdf

17
https://www.khanacademy.org/humanities/global-culture/conceptual-performance/a/vito-acconci-following-piece

18 Tarleton Gillespie, ‘Algorithm’, in Benjamin Peters (ed.), Digital
Keywords: A Vocabulary of Information Society and Culture (Princeton:
Princeton University Press, 2016), 19.






