LIVING AND THINKING IN THE POSTDIGITAL WORLD
Living and Thinking in the Postdigital World is the result
of a series of conferences organized at the Collegium
Artes Liberales, University of Warsaw, as a part of the
project “Technology and Socialization”. Its main aim is to
interrogate the different ways in which technology –
especially digital technology – shapes today’s social and
political landscape, both theoretically and practically.
The book is divided into three parts. The first one
concentrates on theoretical elaborations of our current
situation – testing if theories of technology that we have
inherited from earlier ages are suited to our current historical moment. The second part of the book is devoted
to describing novel experiences allowed by digital technologies and the intertwinement between our “online”
and “offline” lives. The chapters gathered in the final
part venture a look into the future, problematizing the
consequences of currently observable trends and trying
to understand the workings behind various visions of
what is to come.
LIVING
AND
THINKING
IN THE
POSTDIGITAL
WORLD
THEORIES
EXPERIENCES
EXPLORATIONS
Edited by
Szymon Wróbel
Krzysztof Skonieczny
Kraków
Publication co-financed by the Faculty of “Artes Liberales”,
University of Warsaw
© Copyright for this edition by Towarzystwo Autorów
i Wydawców Prac Naukowych UNIVERSITAS
& University of Warsaw – Faculty of “Artes Liberales”, Kraków 2021
ISBN 978-83-242-3690-9
e-ISBN 978-83-242-6537-4
TAiWPN UNIVERSITAS
Reviewers
Marta Bucholc, Assistant Professor, University of Warsaw
Aleksandra Derra, Professor of Nicolaus Copernicus University in Toruń
Proofreading
Anna Olechowski
Editing
Magdalena Kusak
Typesetting
Stanisław Tuchołka / panbook.pl
Cover design
Sepielak
Photography
Michael Gaida, Pixabay
www.universitas.com.pl
Contents

Szymon Wróbel, Krzysztof Skonieczny, Introduction . . . 7

Part One. Theorizing the Technological Present . . . 13
Szymon Wróbel, Dismantling the Concept of Technology . . . 15
Ivan Dimitrijević, Judgement upon Work . . . 35
Michael Stemerowicz, “The Work of Art in the Age of Its Technological Reproducibility” – A Rehearsal of Consumerism and the Aesthetic Consequences of the Dissolution of Tradition . . . 57
Mümtaz Murat Kök, Optimism as Attachment to Capitalism . . . 77
Adam Lipszyc, Affect Unchained: Violence, Voyeurism, and Affection in the Art of Quentin Tarantino . . . 91

Part Two. Experiencing the Technological Present . . . 109
Michaela Fišerová, Touching and Retouching. Question of Authenticity in Social Networking . . . 111
Agata Szepe, Israeli Tourism to Poland in Social Media. Perspectives of Social Science of the Internet and the Actor-Network Theory Approach . . . 127
James W. Besse, Designing Emotional Styles . . . 141
Joanna Łapińska, Posthuman and Post-Cinematic Affect in ASMR “Fixing You” Videos . . . 153
Julia Krzesicka, Emotional Capitalism, Sociability, and Orality. Speculative Imagination Exercise on the Future of Work for Voice Assistants . . . 169
Karin-Ulrike Nennstiel, A Social Robot in Your Living(room)? Recent Developments and Shifting Appraisals . . . 185

Part Three. Exploring Technological Futures . . . 203
Krzysztof Skonieczny, Deleuze’s Remarks on Control Societies. Consequences for Work and Education . . . 205
Denis Petrina, Affect Trapped: Algorithms, Control, Biopolitical Security . . . 219
Ewa Mazierska, Representation of Postdigital Encounters in Recent Science Fiction Films . . . 235
Adam Cichoń, Dr. Strange(love) or: From Affection-Images to Inter-Faces . . . 253
Mitchell Atkinson III, Substrate Independence, Migration, and the Naturalistic Attitude . . . 273

About the authors . . . 293
Index . . . 299
Denis Petrina
Affect Trapped:
Algorithms, Control, Biopolitical Security
ABSTRACT
In 1992, Gilles Deleuze established a frame for analyzing new taxonomies of
power, based less on disciplinary power than on “free-floating” control. Technologically, this novel society of control breaks from disciplinary societies, which relied mainly on energy-producing mechanisms, by employing much more complex machines: computers. This paper argues that the shift from machines
to computers is not merely a technologically induced reconfiguration of the
socioeconomic field but, more importantly, a biopolitical one, for it establishes
a new regime that targets, governs, and controls subjects, their minds and
bodies – that is, their affects. In the digital environment, these affects are captured
and transformed by algorithms – the operational principle of the machines
mentioned above. Thus, this paper aims to examine closely a biopolitical framing of
algorithm-driven media, to discuss how algorithms function as mechanisms of biopower, and, finally, to ask whether (and how) it might be possible to resist algorithmic biopower.
KEYWORDS:
algorithms, affect, control, apparatus of security, biopolitics
Introduction: From Algorithms to Power
Recently, “algorithm” has become a buzzword. From scientific disciplines (cybernetics, medicine, economics) to business environments,
algorithms are now utterly omnipresent as procedures
(for lack of a better term) integrated into systems for data extraction and processing, calculation and analysis, the identification of problems and their solutions, and a broad spectrum of other functions.
Bernard Chazelle, a prominent computer scientist, has even termed
algorithms “the idiom of modern science”, placing particular importance on understanding how algorithms work: the sciences change
and perhaps will keep changing, while, Chazelle believes, algorithms are here to
stay (Chazelle 2006). The innovative impetus of algorithms, as their
numerous applications showcase, lies, quite simply, in their independence from the human operator (which has been translated into
terms of [artificial] intelligence), and thus in their full automation,
high performance, and adaptability. More importantly, algorithms
do what human beings cannot do or do very poorly: they are programmed and are, therefore, fully capable of operating in the conditions of uncertainty, estimating likelihood and making “intelligent”
decisions based on the input data they receive.
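The operating principle described in this paragraph – receiving input data, estimating a likelihood, and producing an automated decision with no human operator – can be sketched as a minimal toy procedure. The event labels and the 0.5 threshold below are assumptions invented purely for illustration; they stand in for no real system.

```python
from collections import Counter

def estimate_likelihood(history, event):
    """Estimate the probability of an event from previously observed data."""
    counts = Counter(history)
    return counts[event] / len(history)

def decide(history, event, threshold=0.5):
    """An automated 'intelligent' decision: act when the event seems likely enough."""
    return "act" if estimate_likelihood(history, event) >= threshold else "wait"

# The procedure runs without a human operator: data in, decision out.
observed = ["click", "click", "ignore", "click"]
print(decide(observed, "click"))  # likelihood 0.75 -> "act"
```

Even this caricature shows the features the paragraph names: the procedure is fully automated, operates under uncertainty, and adapts its output to whatever input data it receives.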
If we attempt to theorize these features of algorithms and place
them in a social context, their automated (even autonomous?) nature signifies a conclusive break from the so-called
“Fordist socioeconomic model”. It symbolizes, if not embodies, the
post-Fordist logic of (economic, social, and cultural) production.
While the former is conventionally associated with standardization,
mass production, and consumption – in short, homogeneity – the
latter is typically described as far less standardized and more flexible,
characterized by economies of scope and, as a result, as inherently
heterogeneous and dynamic. We might even dare to say that algorithmization brought about this radical reorientation from
the rigid Fordist culture towards a less regulated and more diversified
post-Fordist environment (both material and digital), one that allows (or
at least promises) us to escape from the disciplinary apparatus of
work, from a restrictive society, and even from the dominant model of thinking. But does it, actually?
Epistemologically and practically, as discussed above, algorithms
have, indeed, been heralded as a breakthrough in the way we think
about things and do them. However, is this breakthrough as emancipatory (again, epistemologically, practically, and even politically)
as apologists of algorithms tend to see it? While refraining from outright technological pessimism, I view algorithms, as well
as their effect on contemporary culture, critically – and offer a more
nuanced and problematizing philosophical perspective on them.
If we were to (re-)read Gilles Deleuze’s (1992) Postscript on the Societies
of Control, we could not help but notice how, upon closer inspection, algorithms staggeringly resemble a new regime of power that
Deleuze grasps conceptually. Also, if we were to pay closer attention
to how Deleuze frames the discussion of the evolution of taxonomies
of power, we would be able to discern the crucial role of technologies
in the development of the socioeconomics of control.
Indeed, algorithms are not mechanical entities; they are not machines, and they are not the hardware representative of the Fordist
socioeconomy; instead, they represent the principle of operation – in
other words, they are software. This, again, makes us rethink not only
practical techne – how things are done – but also, most importantly, political techne – the tools that power utilizes. Here, two conclusions are
possible. The first one is that the shift from hardware – machines –
towards software – principles – is emblematic of what, following
Michel Foucault, could be called “vaporization of power”: power becoming less visible, less tangible, and, generally, less perturbing. This
leads to the second, undoubtedly naïve, conclusion that vaporization
of power is tantamount to the vanishing of power. Quite the contrary,
Foucault’s lesson is that the vaporization of power means nothing but
the fact that power in contemporary societies is everywhere – it is the
very air we breathe (Foucault 1995, 26). As Bruce Curtis insightfully remarks, the decentralization (qua the already discussed “vaporization”)
of sovereign power does not amount to the absolute disappearance of
power – in fact, it is an extension of what Foucault terms “governmentality”, a macabre combination of government and rationality
(Curtis 2002, 506).
This remark leads us to another of Foucault’s critically important
notions – “biopolitics”. For Foucault, the biopolitical regime (which he
links with neoliberal rationality) is not governmental in its traditional
sense, but rather environmental: it is not the disciplinary regime that
seeks to target individuals, drill and punish them, but a much more
complex ecology of powers (Foucault 2003, 246). This paper attempts to
argue that algorithms and, consequently, algorithmic rationality are,
indeed, an integral part of this new ecology of powers marking the
emergence of the societies of control. Thus, it will offer a closer examination of the latter and will argue that control is algorithm-driven
per se. It will also reveal the links between the biopolitical apparatus
of security and algorithmic governmentality/environmentality whose
target, as it will be explicated further, appears to be an “affect”, cognitive and physical bodily capacities. Ultimately, potential scenarios
of resistance (affect-invested and affect-driven) will be discussed.
From Discipline to Control
Following Patricia Clough, the break between the ancien régime and
nouveau régime could be conceptualized as a transition from discipline and representation towards control and information (Clough
2007, 14). Deleuze’s almost prophetic text Postscript on the Societies of Control, written in 1990, outlines the genealogical development
of control, tracing it back to disciplinary societies, which, in their
turn, developed from sovereign societies. In his essay, Deleuze develops a detailed analytical framework outlining the essential differences between the two sociopolitical regimes, focusing on such aspects
as (1) spatial environments, (2) socioeconomic models, (3) dynamics,
(4) subjectivity, (5) scientific implications, (6) machines, all of which
are discussed further.
As Postscript on the Societies of Control attempts to expand Foucault’s
critical historiography (genealogy) and at the same time digresses
from it, the essay is deeply informed by Foucauldian theory, borrows
vocabulary from it, interprets, and reinterprets it. For instance, when
discussing disciplinary societies, Deleuze associates them with “environments of enclosure”, such as a prison, hospital, factory, school,
and family (Deleuze 1992, 4). This topology of enclosure, as Deleuze
explains it and as Foucault elaborates in his writings, is tightly linked
with institutions, in which an individual performs a certain role
(a student in school, a worker in a factory, etc.), and could, therefore,
be characterized by breaks and interruptions – in short, stratification.
The latter, however, should be construed as a “repertoire” of identities:
again, the student, the worker, the prisoner, which are more or less
similar for all individuals placed in the same context. That is why
Deleuze argues that the socioeconomic model of Fordist disciplinary
societies is “a factory” (Deleuze 1992, 5), which, according to Foucault, presupposes a Panoptic logic of close supervision and tireless
control exercised upon agents, each of whom is an integral part of
a broader mechanism and performs their functions in accordance with
strict regulations and norms (Foucault 1995, 174-175).
Seen in this light, disciplinary societies rely heavily on subjectivity – yet, again, not on single individuals, but rather, as Deleuze explains, on “an individual within a mass” (Deleuze 1992, 5). Even though
individual behavior is supervised, tracked, corrected, and controlled,
it matters only as a means that makes extrapolation possible: that is, it
allows a norm to be established, one which becomes a correlative of productivity in the economy and of discipline in society. Since the nexus between
productivity and discipline is incorporated into almost every sphere of
life in disciplinary societies and is inseparable from them, their epistemological framework could be called “mechanics”, and
more precisely, “thermodynamics”. Put simply, the object of these
disciplines is the bodies and energy they produce; discipline is, indeed, concerned with the production of (Foucault would call them
“docile”) bodies as well as the reproduction of them, while productivity directly correlates with extraction, conversion, and distribution
of energy among different parts to make the whole productive. Thus,
these elements, all together, constitute “capitalism of production”
(Deleuze 1992, 6).
In societies of control, the Fordist model of “capitalism of production” is replaced by a “capitalism of marketing” (the utmost form
of commodity fetishism, in which commodities become more “alive”
than humans), which Deleuze himself calls “the soul of the corporation” (Deleuze 1992, 6). The corporation with a soul is the new socioeconomic model that diverges from the “heartless” factory: unlike
the latter, the former does not depend upon institutions (in fact, it
signifies the end of institutions), and instead of relying on various
environments of enclosure, creates and fosters an open environment,
where, as Deleuze notices, control is “free-floating” (Deleuze 1992, 6).
It is therefore characterized by perpetuality and continuousness,
which contribute to its reflexivity – as Scott Lash explains, this
metaphysical (rather than its predecessor, physical) form of capitalism is best represented by cybernetic systems that are reflexive
and self-organizing due to their openness to the environment (Lash
2007, 8). For this reason, physics, mechanics, and thermodynamics
are all insufficient epistemological frameworks for understanding
control societies, in which corporations have souls and capitalism
is metaphysical. Extracting and transmitting information as well as
developing and regulating networks are two practices that constitute
the backbone of control societies, which makes cybernetics their prototypical, epistemological paradigm.
Not only does the cybernetic nature of control societies weaken
institutional power, but it also discards the product of institutions –
individuals and masses as a form of subjectivity. The question of
subjectivity is highly problematic in the context of metaphysical/cybernetic/control-based capitalism: who is the subject if all
that we know about them is the scattered segments of their data?
Deleuze skillfully answers this question by proclaiming the notion
of the “individual” as obsolete (or, to be more precise, attributing it
solely to disciplinary societies) and insisting that we have become
“dividuals” instead – a collective concept that encompasses “masses,
samples, data, markets, or banks” (Deleuze 1992, 5) – in short, data.
To reiterate this, the data is the subject (and vice versa); this statement
radically reshapes the philosophical understanding of the subject,
traditionally understood as a coherent and unified entity, and proposes the alternative concept of the subject, which, following John
Cheney-Lippold, might be described as an “algorithmic identity,
[which is] an identity formation that works through mathematical
algorithms to infer categories of identity on otherwise anonymous
being” (Cheney-Lippold 2011, 165).
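Cheney-Lippold’s “algorithmic identity” – a category inferred on an otherwise anonymous being – can be caricatured in a few lines of code: the “dividual” exists only as a behavioral trace, and the category is whichever profile that trace overlaps with most. The profiles, keywords, and trace below are hypothetical, invented solely for this sketch.

```python
def infer_category(trace, profiles):
    """Assign an anonymous user to the profile whose keywords
    overlap most with their behavioral trace (the 'dividual' as data)."""
    def overlap(name):
        return len(set(trace) & profiles[name])
    return max(profiles, key=overlap)

# Hypothetical category definitions: the "identity" exists only as data.
profiles = {
    "gamer": {"twitch", "gpu", "esports"},
    "parent": {"stroller", "daycare", "minivan"},
}
trace = ["gpu", "twitch", "daycare"]
print(infer_category(trace, profiles))  # -> "gamer" (2 overlaps vs 1)
```

Note that nothing in the sketch ever consults the person: the subject is reconstructed entirely from scattered segments of data, exactly as the passage above describes.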
From Control to Computers (and Back)
As demonstrated, one of the core aspects of the Deleuzian philosophical methodology is what might be termed “diagramization” of
social relations and their underlying principles. The diagram, however, should not be conceived as a visual representation of a certain
aspect of the complexity of the social assemblage (for instance, “the
factory” is not the representation of the actual manufacturing facility), but rather, as Manuel De Landa helpfully explains, as a complex
system that accounts for virtual singularities (De Landa 2000, 33).
Alexander Galloway maintains:
just as the fold was Deleuze’s diagram for the modern subject of the
baroque period, the superfold is the new “active mechanism” for life
within computerized control society. (Galloway 2012, 524)
And even more so: to understand the control society we live in,
it is imperative to recognize the “computer as its central mitigating
factor” (Galloway 2012, 524). In other words, the computer – qua
the superfold – is the diagram of control societies.
Its genealogical predecessor is any type of machine that involves energy (again, the focus is placed on productivity and thermodynamics),
which Deleuze associates with disciplinary societies. The danger for
these machines takes the form of entropy, which potentially disrupts the order of disciplinary societies (Deleuze 1992, 6). Interestingly enough, in the machinic diagram of control societies, the
computer operates in the environment of entropy, cybernetic chaos,
and abundance of structureless information. Unlike energy-driven
machines, which are closed and in this regard rather simple in terms
of their structure, computers are complex: their complexity comes
from the order within which they operate. While capitalism of production (qua Fordist capitalism, qua disciplinary societies) relies on easily
identifiable entities – materials, property, products – computerized societies of control produce and reproduce “coded figures” that are inherently “deformable and transformable” (Deleuze 1992, 6). In short,
computers are the very emblem of the shift from the danger of entropy towards the systematic, productive, and alarming exploitation
of information disorder.
We might, however, ask a reasonable question: what is the relation
between computers and control? Perhaps to answer it, we need to
determine what Deleuze actually means by the elusive term “control”.
Galloway reminds us that English and French meanings of the word
“control” differ: while being rather definite in English, in French,
control presupposes certain “monitoring apparatuses, such as train
turnstiles, border crossings and check points” (Galloway 2012, 522),
which, in their turn, let us associate control with such characteristics as mobility, openness, and (controlled) liberation. In this sense,
control is not as much political as it is biopolitical – and, as Clough
argues, digital media and computational technologies are, indeed,
deeply embedded in biopolitics (Clough 2018, xviii).
From Computers to Apparatuses
of Security (Algorithms)
Foucault’s account of biopolitics mirrors many aspects of control societies, thoroughly examined by Deleuze. Perhaps it would be safe to
say that Foucauldian biopolitics is control-oriented as much as control
societies are biopolitical. The first reason to consider them together
is that both the biopolitical regime and societies of control emerge
as a response to a problem of entropy. Eugene Thacker notes that
biopolitics as a mode of governmentality appeared when governments
became preoccupied with the question of control – more precisely,
the management and regulation of multiplicities; “and the problem
is greater when the multiplicities in question are construed as living
multiplicities”, he writes (Thacker 2011, 152; the emphasis is his).
Thus, the object of power undergoes radical transformation – if disciplinary regimes attempted to govern individual bodies, biopolitics
targets whole populations (“living multiplicities”).
Understandably, these multiplicities could be encountered by
governmental apparatuses only through a means of representation:
in the case of populations, statistical representation – which, translated
into the vocabulary of societies of control, is nothing but raw data.
This data represents “not the actual totality of the subjects in every
single detail, but the population with its specific phenomena and
processes” (Foucault 2009, 93). Obviously, it is impossible to govern
or discipline “phenomena and processes”, yet it is possible to control
them: and here, control is synonymous with security, or, to be more
precise, the apparatus of security. Governmentality – the art of governing – is of utmost importance here: for security and control tend
to mask themselves, they appear as a modulation of laissez-faire or not
governing too much (Foucault 2008, 102) – and should thus modify
the very environment they operate in to be efficient. This makes security and control not only governmental but, even more importantly,
environmental, since, as Brian Massumi explains, it regulates causes
rather than effects (Massumi 2015, 22).
Therefore, the old principle of normation (exemplary of disciplinary societies) does not apply here; the proper term seems
to be “normalization” (Foucault 2008, 146). The difference lies in
the strategy by which techniques and technologies of power are applied:
in disciplinary societies, problems (delinquency, poverty, epidemics)
were solved by correcting problematic individuals/bodies; in biopolitical societies of control, however, eradication of problems is not an
end in itself: instead, a limit is set that makes the problem tolerable. “Optimality” is another word for the principle of laissez-faire:
its conceptual formula is “neither too much nor not enough” – just
the right amount. The logic of normalization is algorithmic per se,
as it is concerned with statistically-derived mathematical optimization (the selection of the most representative and relevant elements
from sets of data), risk assessment and mitigation (not elimination
though), and, ultimately, constant re-evaluation, semi-automated
learning, and modeling.
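The algorithmic logic of normalization described above can be sketched in toy form: instead of correcting each individual case against an external norm (normation), a tolerable band is derived statistically from the population itself, and intervention occurs only at its limits. The two-standard-deviation band and the sample values are assumptions invented for this illustration.

```python
import statistics

def tolerable_band(samples, k=2.0):
    """Normalization: derive a 'neither too much nor not enough' band
    from the population itself, rather than from an external norm."""
    mean = statistics.mean(samples)
    sd = statistics.pstdev(samples)
    return (mean - k * sd, mean + k * sd)

def intervene(value, band):
    """Intervene only when the phenomenon leaves the tolerable band;
    values inside the band are left alone (laissez-faire)."""
    low, high = band
    return not (low <= value <= high)

population = [10, 11, 9, 10, 12, 10, 11, 9]
band = tolerable_band(population)
print(intervene(10, band))  # within the band: no intervention (False)
print(intervene(30, band))  # intolerable outlier: intervention (True)
```

The band is re-derived whenever the data changes, which mirrors the constant re-evaluation and modeling the paragraph describes: the norm is an output of the population’s data, not an input to it.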
According to Maurizio Lazzarato, the contemporary regime is
comprised of three major trajectories of governing, each of which
targets a specific domain of social and personal life. This triad includes three “Ms”: management of population (biopower) through the
molding of the individual (discipline) and modulation of mind (control)
(Lazzarato 2006, 176-177). Algorithms are their point of intersection: as discussed above, biopolitical management of populations is
algorithmic in nature, as it heavily relies on representational data
and is constantly transformed by it; discipline allows the apparatus
of security to extract the necessary data from bodies and individuals
and in so doing “fuels” the apparatus; and, finally, control is technically enabled by computers where data is circulated. The digital data
becomes “the very ‘texture’ of capitalism” and results in the mutation
of capitalism, already discussed by Deleuze: capitalism of production
transforms into what Antoinette Rouvroy terms “algorithmic governmentality” (Rouvroy 2016, 30-31).
Such governmentality largely exploits what, after Clough, might
be called the “user unconscious”: a strange blend of subjective unconscious mechanisms commonly shared by users of a specific medium
which produce “technical substrates” (Clough 2018, x). Put simply,
users are unaware of the functional mechanism behind the technical
facilities they use, which in algorithm studies has been conceptualized as the problem of “the black box”: the problem of opaqueness,
which prevents us from knowing why an input X produces an output Y. Viewed through this problematic lens, the algorithmic principle of operation of biopower could also mean that algorithms are
potentially biopowerful. If this is the case, it begs the question: how
precisely are algorithms utilized as exploitative tools of (bio)power?
From Algorithms to Affects
Tarleton Gillespie, a principal researcher at Microsoft, defines algorithms as follows:
they are encoded procedures for transforming input data into a desired
output, based on specified calculations. The procedures name both
a problem and the steps by which it should be solved. (Gillespie 2014, 167)
Semantically, this definition omits a crucial component for a fuller
understanding of how algorithms operate: the agent. Who is the
agent encoding procedures? More importantly, what is a “desired
output” (desired by whom?) and who decides what the “problem”
that has to be solved is? Even though the clarity of the description
of procedures, achieved by the use of the passive voice, suggests that
algorithms are technologically neutral (that is, that this technology is not
“evil” per se), the same description implies that basically anyone (provided
they have the necessary knowledge and skills) can be the agent. And,
taking into account the fact that biopolitics is algorithmic, it becomes
apparent that algorithms are not politically neutral.
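The point about the hidden agent can be made concrete: in code, the “desired output” is literally a parameter that someone supplies. The ranking procedure below is a toy invented for this sketch (the posts and weights are hypothetical); it shows that the same input data yields different “desired outputs” depending on which agent defines the problem.

```python
def rank(items, score):
    """An 'encoded procedure': input data in, desired output out.
    Whoever supplies `score` decides what the 'problem' to be solved is."""
    return sorted(items, key=score, reverse=True)

posts = [
    {"id": 1, "likes": 5, "ad_value": 9},
    {"id": 2, "likes": 8, "ad_value": 1},
]

# Two agents, two 'desired outputs' from the same data:
by_engagement = rank(posts, lambda p: p["likes"])
by_revenue = rank(posts, lambda p: p["ad_value"])
print([p["id"] for p in by_engagement])  # [2, 1]
print([p["id"] for p in by_revenue])     # [1, 2]
```

The procedure itself is indifferent; the politics enters with the scoring function, which is precisely where Gillespie’s passive voice conceals the agent.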
The seme of “calculation” reveals that algorithms are perfectly
suitable tools for biopower, since the latter strives, as it has been
discussed, to set limits, establish optimal conditions, evaluate and
prevent losses, optimize costs, etc. What is more, both mathematical algorithms and statistical biopower target the same object: data.
However, what is data? When does a body, a living being, become
data? What mediates between technologically intelligible data – or, more
precisely, big data, the large data sets produced by numerous populations – and living multiplicities? The answer, however paradoxical it
may seem, is affects – intertwined cognitive and bodily states, whose
derivatives are feelings, moods, emotions on the one hand and capacities, actions, and interactions on the other. Clough explains that
“under datalogical conditions, measurement is always a singularity –
a productive, affective materialization of dynamics and relations of
recombinable forces” (Clough 2018, 108). And yet, even though an
affect – a particular action/reaction of a particular
individual – is treated as a singularity, it is then subsumed under
broader categories representative of the whole population, which,
indeed, makes it algorithmic.
To clarify, algorithms operate under the conditions of uncertainty. Affects are, indeed, characterized by uncertainty: there is
no way to figure out how a body would react to a certain trigger
and what the outcome would be. Or is there? If the modus operandi
of biopolitical control is the triad of risk evaluation, limitation, and
optimization, and if algorithms are procedures that are capable
of “transforming input data [no matter how ‘uncertain’ it is] into
a desired output” (Gillespie 2014, 167), this task does not seem
unsolvable whatsoever. If the body (mediated through affect and
then through data) does not react to a trigger in a desired manner,
it can be trained to do so. It is important to understand, however,
that algorithms are not magical “hypodermic needles”: they are not
“injected” directly into the bodies or brains to produce “the desired
outcome”; obviously, the process is far less linear and far more complex
than we might imagine.
Biopower 2.0 operates in the middle space, as Cheney-Lippold,
following Foucault, explains: this is a peculiar type of space, “where
the modulation of categorical identity changes according to a ‘culture,
imperceptibly deviating from the empirical orders prescribed for it by
its primary codes’ (Foucault 1973, xx)” (Cheney-Lippold 2011, 174).
This is the type of operation that Clough has revealed: a “black box”
of recombination, in which “affective circulation” (Swenson 2011, 20)
takes place: affects are captured, encoded as intelligible data, and
then this data is used to produce a desired output – or desired affects.
Ultimately, the very apparatus (algorithm) is changed: since, as
has been repeatedly mentioned, it is automated and adaptable,
not only does it know (qua is programmed) how to capture needed
affects and (en/de)code them, but it also learns from the coded data
and transforms itself to better function in the conditions of uncertainty. Gillespie, too, has emphasized that algorithms can be trained
to “identify qualities within the data that might help it to discern
different types of content” (Gillespie 2018, 216). Different types of
content are then categorized; Cheney-Lippold views such categorization as a creation of digital “ontologies (…) embedded within a set
of power relations” (Cheney-Lippold 2011, 174).
Further, he argues that this new form of biopower regulates how
these digitally constructed quasi-ontological categories define life
(Cheney-Lippold 2011, 175) as a continuum of affective actions and
interactions. Viewed broadly, affective investments (governed by
algorithms) become affective re-investments; for example, marketing-related data collected from users exposes hedonic aspects of human
life, such as interests, tastes, even desires, which are then shaped
and reshaped on the basis of the interaction with the algorithms.
Purchase data may be used to determine a user’s economic status,
as well as directly influence their decisions to buy (again, thanks to
the algorithmically constructed “category” for the population this
user potentially represents; if a person does not make a purchase,
algorithms will figure out the reason for such a decision and will
therefore adjust). Even more so, algorithms, in a very worrying
biopolitical manner, have direct access to individuals’ bodies: all
types of wearables, heart rate trackers, and similar types of smart
technology process myriads of bytes of information derived directly
from a human body – from the processes we ourselves may not be
fully aware of.
Thus, this new topology of control – the middle space Cheney-Lippold speaks of – emerges at the intersection of biology, politics, and technology. Indisputably, it calls for a reconsideration of the category of the body and its affects. The concept of biomedia, coined and developed by Thacker, is of particular help here:
according to Thacker, the informatic capacity to affect biological
materiality (the very blend of cybernetic and biological) results in
the recontextualization of a body that becomes “more than a body”,
since “the body you get back is not the body with which you began”
(Thacker 2003, 53). The space where transformations of the body occur – the “middle space” where its affects are extracted and then re/invested – is the opaque “black box”. This, on the one hand, allows us to view the biomediated body as experimentally transcendental (in the sense that the body exceeds the limits of its physio-anatomical functions), but, on the other, it appears to be extremely susceptible to biopower.
Conclusion: From Affects to Knowledge
The problem of the “black box” prompts another, more serious problem – that of epistemic injustice, which hinders our capacities “as a subject of knowledge” (Fricker 2007, 5). The problem can be put rather simply: algorithms know more about us than we know about them. Obviously, algorithmic rationality has drastically transformed the cultural, social, and political landscape we inhabit today, as well as the very way we think about and understand it. This
leads Paško Bilić to believe that particular emphasis should be placed
on algorithmic literacy, accompanied by transparency and oversight
(Bilić 2018, 327). This is, however, not a case of “know thy enemy”, but rather an invitation to acknowledge the change, interpret it in a variety of ways, find the means to comprehend it – and, most importantly, take an active part in it. After all, the logic of this change, as has been noted, is affective, and affects are not only the changes in our bodies that we experience but also the changes we create.
However anachronistic it may sound, in conclusion I would like to refer to the most prominent theorist of affects of the 17th century (if not of all time) – Baruch Spinoza. As epistemic injustice – the unequal distribution of knowledge among individuals – is directly linked to freedom, Spinoza’s remarks on the relation between knowledge and freedom seem particularly insightful. In the
note to the tenth proposition of the fifth part of his Ethics, he states
that knowledge is liberating as long as we learn how certain things
affect us – and, vice versa, how we affect others (Spinoza 2001, 115).
Such an active, praxis-oriented epistemology, whose ultimate goal is liberated knowledge, teaches us that we are an integral part of the world we live in; it thus fosters curiosity and openness toward the world, and encourages us to take an active part in building this world, responsible for ourselves and others. The ultimate lesson we can learn from Spinoza is that knowledge, freedom, responsibility, and joy are the four corners of the same square.
In a more contemporary context, recent attempts to escape the
claws of biopolitical capitalism – the advent of systems of decentralized control, the principle of net neutrality, socially responsible
hacktivism – all herald a productive reconciliation of contemporary
technology and emancipatory aims. All these and many more examples have been inspired by a social imaginary totally different from the one constructed, both physically and digitally, by the capitalist regime; all of them represent the urge to move beyond speculative imagination toward societies that, from the ruins of dismantled hierarchies, would emerge as horizontal social spaces – societies based on sharing instead of owning, and on mutual responsibility and commonality instead of isolation and dividuality.
The true takeaway question is whether algorithms (as they are now)
would potentially have a place in this idyllic infrastructure.
Denis Petrina
References
Bilić, Paško. 2018. “A Critique of the Political Economy of Algorithms:
A Brief History of Google’s Technological Rationality.” TripleC 16 (1):
315-331.
Chazelle, Bernard. 2006. “The Algorithm: Idiom of Modern Science.”
Accessed January 14, 2020. https://www.cs.princeton.edu/~chazelle/
pubs/algorithm-print.pdf.
Cheney-Lippold, John. 2011. “A New Algorithmic Identity: Soft Biopolitics and the Modulation of Control.” Theory, Culture and Society 28 (6):
164-181.
Clough, Patricia. 2007. “Introduction.” In The Affective Turn: Theorizing the
Social, edited by Patricia T. Clough, and Jean Halley, 1-33. Durham and
London: Duke University Press.
Clough, Patricia. 2018. The User Unconscious: On Affect, Media, and Measure.
Minneapolis and London: University of Minnesota Press.
Curtis, Bruce. 2002. “Foucault on Governmentality and Population: The
Impossible Discovery.” The Canadian Journal of Sociology 27 (4): 505-533.
De Landa, Manuel. 2000. “Deleuze, Diagrams, and the Genesis of Form.”
American Studies 45 (1): 33-41.
Deleuze, Gilles. 1992. “Postscript on the Societies of Control.” October 59:
3-7.
Foucault, Michel. 1973. The Order of Things: An Archaeology of the Human
Sciences. New York: Vintage Books.
Foucault, Michel. 1995. Discipline and Punish: The Birth of the Prison. Translated by Alan Sheridan. New York: Vintage Books.
Foucault, Michel. 2003. Society Must Be Defended: Lectures at the Collège
de France 1975-1976. Translated by David Macey. New York: Picador.
Foucault, Michel. 2008. The Birth of Biopolitics: Lectures at the Collège de
France 1978-1979. Translated by Graham Burchell. New York: Palgrave
Macmillan.
Foucault, Michel. 2009. Security, Territory, Population: Lectures at the Collège de France 1977-1978. Translated by Graham Burchell. New York:
Palgrave Macmillan.
Fricker, Miranda. 2007. Epistemic Injustice: Power and the Ethics of Knowing.
New York: Oxford University Press.
Galloway, Alexander. 2012. “Computers and the Superfold.” Deleuze and
Guattari Studies 6 (4): 513-528.
Gillespie, Tarleton. 2014. “The Relevance of Algorithms.” In Media Technologies: Essays on Communication, Materiality, and Society, edited by
Tarleton Gillespie, Pablo J. Boczkowski, and Kirsten A. Foot, 167-193.
Cambridge and London: The MIT Press.
Affect Trapped: Algorithms, Control, Biopolitical Security
Gillespie, Tarleton. 2018. Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions that Shape Social Media. New Haven
and London: Yale University Press.
Lash, Scott. 2007. “Capitalism and Metaphysics.” Theory, Culture and Society
24 (5): 1-26.
Lazzarato, Maurizio. 2006. “The Concepts of Life and the Living in
the Societies of Control.” In Deleuze and the Social, edited by Martin
Fugslang, and Bent Meier Sørensen, 171-190. Edinburgh: Edinburgh
University Press.
Massumi, Brian. 2015. Ontopower: War, Powers, and the State of Perception.
Durham and London: Duke University Press.
Rouvroy, Antoinette. 2016. “Algorithmic Governmentality: Radicalization
and Immune Strategy of Capitalism and Neoliberalism?” La Deleuziana – Online Journal of Philosophy 3: 30-36.
Spinoza, Baruch. 2001. The Ethics. Translated by Robert Harvey Monro
Elwes. [Blackmask Online.]
Swenson, Kristin. 2011. “Affective Labor and Governmental Policy: George
W. Bush’s New Freedom Commission on Mental Health.” Baltic Journal
of Law and Politics 4 (2): 1-23.
Thacker, Eugene. 2003. “What is Biomedia?” Configurations 11 (1): 47-79.
Thacker, Eugene. 2011. “Necrologies; or, the Death of the Body Politic.”
In Beyond Biopolitics: Essays on the Governance of Life and Death, edited
by Patricia T. Clough, and Jean Halley, 139-162. Durham and London:
Duke University Press.