Trevor Paglen - Invisible Images
https://thenewinquiry.com/invisible-images-your-pictures-are-looking-at-you/ 1/14
5/8/24, 8:38 AM Invisible Images (Your Pictures Are Looking at You) – The New Inquiry
I.

“Winona,” Eigenface (Colorized), Labelled Faces in the Wild Dataset, 2016

OUR eyes are fleshy things, and for most of human history, our visual culture
has also been made of fleshy things.
We’ve gotten pretty good at understanding the vagaries of human vision; the
serpentine ways in which images infiltrate and influence culture, their tenuous
relationships to everyday life and truth, the means by which they’re harnessed to
serve--and resist--power. The theoretical concepts we use to analyze classical
visual culture are robust: representation, meaning, spectacle, semiosis, mimesis,
and all the rest. For centuries these concepts have helped us to navigate the
workings of classical visual culture.
But over the last decade or so, something dramatic has happened. Visual culture
has changed form. It has become detached from human eyes and has largely
become invisible. Human visual culture has become a special case of vision, an
exception to the rule. The overwhelming majority of images are now made by
machines for other machines, with humans rarely in the loop. The advent of
machine-to-machine seeing has been barely noticed at large, and poorly
understood by those of us who’ve begun to notice the tectonic shift invisibly
taking place before our very eyes.
Cultural theorists have long suspected there was something different about
digital images than the visual media of yesteryear, but have had trouble putting
their finger on it. In the 1990s, for example, there was much ado about the fact
that digital images lack an “original.” More recently, the proliferation of images
on social media and its implications for inter-subjectivity has been a topic of
much discussion among cultural theorists and critics. But these concerns still fail
to articulate exactly what’s at stake.
One problem is that these concerns still assume that humans are looking at
images, and that the relationship between human viewers and images is the most
important moment to analyze--but it’s exactly this assumption of a human
subject that I want to question.
What’s truly revolutionary about the advent of digital images is the fact that they
are fundamentally machine-readable: they can only be seen by humans in
special circumstances and for short periods of time. A photograph shot on a
phone creates a machine-readable file that does not reflect light in such a way as
to be perceptible to a human eye. A secondary application, like a software-based
photo viewer paired with a liquid crystal display and backlight, may create
something that a human can look at, but the image only appears to human eyes
temporarily before reverting back to its immaterial machine form when the
phone is put away or the display is turned off. However, the image doesn’t need
to be turned into human-readable form in order for a machine to do something
with it. This is fundamentally different than a roll of undeveloped film. Although
film, too, must be coaxed by a chemical process into a form visible by human
eyes, the undeveloped film negative isn’t readable by a human or machine.
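The point can be made concrete with a short sketch. Below, a tiny grayscale image is assembled in memory in the plain-text PGM format (the format and the pixel values are illustrative choices, not anything from the essay), and a program extracts information from it without the image ever being rendered for a human eye:

```python
# A minimal sketch: an "image" here is just machine-readable text. We build
# a tiny grayscale picture in the plain PGM format (a stand-in for a phone's
# JPEG), then let a program extract information from it without the image
# ever being displayed to anyone.
width, height = 4, 2
pixels = [0, 64, 128, 255,
          255, 128, 64, 0]
pgm = f"P2\n{width} {height}\n255\n" + " ".join(map(str, pixels))

def machine_read(data: str) -> float:
    """Parse the PGM and return mean pixel brightness -- no rendering needed."""
    tokens = data.split()
    assert tokens[0] == "P2"                  # plain-PGM magic number
    w, h, _maxval = map(int, tokens[1:4])
    values = [int(t) for t in tokens[4:4 + w * h]]
    return sum(values) / len(values)

print(machine_read(pgm))  # 111.75 -- the machine "saw" the image; no eye did
```

The same logic scales up: a classifier or face matcher consumes the decoded pixel array directly, and the human-visible rendering is an optional side effect.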
Lake Tenaya, Maximally Stable Extremal Regions; Hough Transform, 2016
II.
This invisible visual culture isn’t just confined to industrial operations, law
enforcement, and “smart” cities, but extends far into what we’d otherwise--and
somewhat naively--think of as human-to-human visual culture. I’m referring
here to the trillions of images that humans share on digital platforms--ones that
at first glance seem to be made by humans for other humans.
On its surface, a platform like Facebook seems analogous to the musty glue-
bound photo albums of postwar America. We “share” pictures on the Internet
and see how many people “like” them and redistribute them. In the old days,
people carried around pictures of their children in wallets and purses, showed
them to friends and acquaintances, and set up slideshows of family vacations.
What could be more human than a desire to show off one’s children? Interfaces
designed for digital image-sharing largely parrot these forms, creating “albums”
for selfies, baby pictures, cats, and travel photos.
Facebook’s “DeepFace” algorithm, developed in 2014 and deployed in 2015,
produces three-dimensional abstractions of individuals’ faces and uses a neural
network that achieves over 97 percent accuracy at identifying individuals, a rate
comparable to what a human can achieve, ignoring for a second that no human
can recall the faces of billions of people.
“Goldfish,” Linear Classifier, ImageNet Dataset, 2016

“Fire Boat,” Synthetic High Activation, ImageNet Dataset, 2016
III.
than classically representational ones. But that isn’t to say that there isn’t a formal
underpinning to how computer vision systems work.
All computer vision systems produce mathematical abstractions from the images
they’re analyzing, and the qualities of those abstractions are guided by the kind
of metadata the algorithm is trying to read. Facial recognition, for instance,
typically involves any number of techniques, depending on the application, the
desired efficiency, and the available training sets. The Eigenface technique, to
take an older example, analyzes someone’s face and subtracts from that the
features it has in common with other faces, leaving a unique facial “fingerprint”
or facial “archetype.” To recognize a particular person, the algorithm looks for
the fingerprint of a given person’s face.
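As a rough illustration of the Eigenface idea described above, the sketch below runs the same mean-subtraction-and-projection recipe on synthetic data; the random vectors, dimensions, and counts are stand-ins for aligned face photographs, not any real training set.

```python
import numpy as np

# A minimal eigenface sketch on synthetic "faces": each face is a flattened
# grayscale image (here, random data standing in for aligned photographs).
rng = np.random.default_rng(0)
n_faces, n_pixels = 20, 64 * 64
faces = rng.normal(size=(n_faces, n_pixels))

# Subtract the "average face" -- the features all faces have in common.
mean_face = faces.mean(axis=0)
centered = faces - mean_face

# Principal components of the centered faces are the eigenfaces.
_, _, vt = np.linalg.svd(centered, full_matrices=False)
eigenfaces = vt[:10]                      # keep the top 10 components

def fingerprint(face):
    """Project a face onto the eigenfaces: its unique coefficient vector."""
    return eigenfaces @ (face - mean_face)

# Recognition: find the enrolled face whose fingerprint is closest.
gallery = np.array([fingerprint(f) for f in faces])
probe = fingerprint(faces[7] + rng.normal(scale=0.05, size=n_pixels))
match = int(np.argmin(np.linalg.norm(gallery - probe, axis=1)))
print(match)  # recovers identity 7 despite the added noise
```

The "fingerprint" is just the vector of projection coefficients; two photographs of the same person land near each other in that low-dimensional space, which is what makes the lookup work.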
The point here is that if we want to understand the invisible world of machine-
machine visual culture, we need to unlearn how to see like humans. We need to
learn how to see a parallel universe composed of activations, keypoints,
eigenfaces, feature transforms, classifiers, training sets, and the like. But it’s not
just as simple as learning a different vocabulary. Formal concepts contain
epistemological assumptions, which in turn have ethical consequences. The
theoretical concepts we use to analyze visual culture are profoundly misleading
when applied to the machinic landscape, producing distortions, vast blind spots,
and wild misinterpretations.
(Research Image) “Disgust,” Custom Hito Steyerl Emotion Training Set

IV.
Ideology’s ultimate trick has always been to present itself as objective truth, to
present historical conditions as eternal, and to present political formations as
natural. Because image operations function on an invisible plane and are not
dependent on a human seeing-subject (and are therefore not as obviously
ideological as giant paintings of Napoleon) they are harder to recognize for what
they are: immensely powerful levers of social regulation that serve specific race
and class interests while presenting themselves as objective.
Take the case of Vigilant Solutions. In January 2016, Vigilant Solutions, the
company that boasts of having a database of billions of vehicle locations
captured by ALPR systems, signed contracts with a handful of local Texas
governments. According to documents obtained by the Electronic Frontier
Foundation, the deal went like this: Vigilant Solutions provided police with a
suite of ALPR systems for their police cars and access to Vigilant’s larger
database. In return, the local government provided Vigilant with records of
outstanding arrest warrants and overdue court fees. A list of “flagged” license
plates associated with outstanding fines is fed into mobile ALPR systems. When
a mobile ALPR system on a police car spots a flagged license plate, the cop pulls
the driver over and gives them two options: they can pay the outstanding fine on
the spot with a credit card (plus a 25 percent “service fee” that goes directly to
Vigilant), or they can be arrested. In addition to their 25 percent surcharge,
Vigilant keeps a record of every license plate reading that the local police take,
adding information to their massive databases to be capitalized on in other
ways. The political operations here are clear. Municipalities are incentivized to
balance their budgets on the backs of their most vulnerable populations, to
transform their police into tax-collectors, and to effectively sell police
surveillance data to private companies. Despite the “objectivity” of the overall
system, it unambiguously serves powerful government and corporate interests at
the expense of vulnerable populations and civic life.
Smaller and smaller moments of human life are being transformed into capital,
whether it’s the ability to automatically scan thousands of cars for outstanding
court fees, or a moment of recklessness captured from a photograph uploaded to
the Internet. Your health insurance will be modulated by the baby pictures your
parents uploaded of you without your consent. The level of police scrutiny you
receive will be guided by your “pattern of life” signature.
V.

(Research Images) Magritte, Rosler, Opie; Dense Captioning, Age, Gender,
Adult Content Detection

Cultural producers have developed very good tactics and strategies for making
interventions into human-human visual culture in order to challenge inequality,
racism, and injustice. Counter-hegemonic visual strategies and tactics employed
by artists and cultural
producers in the human-human sphere often capitalize on the ambiguity of
human-human visual culture to produce forms of counter-culture--to make
claims, to assert rights, and to expand the field of represented peoples and
positions in visual culture. Martha Rosler’s influential artwork “Semiotics of the
Kitchen,” for example, transformed the patriarchal image of the kitchen as a
representation of masculinist order into a kind of prison; Emory Douglas’s
images of African American resistance and solidarity created a visual landscape
of self-empowerment; Catherine Opie’s images of queerness developed an
alternate vocabulary of gender and power. All of these strategies, and many
more, rely on the fact that the relationship between meaning and representation
is elastic. But this idea of ambiguity, a cornerstone of semiotic theory from
Saussure through Derrida, simply ceases to exist on the plane of quantified
machine-machine seeing. There’s no obvious way to intervene in machine-
machine systems using visual strategies developed from human-human culture.
Faced with this impasse, some artists and cultural workers are attempting to
challenge machine vision systems by creating forms of seeing that are legible to
humans but illegible to machines. Artist Adam Harvey, in particular, has
developed makeup schemes to thwart facial recognition algorithms, clothing to
suppress heat signatures, and pockets designed to prevent cellphones from
continually broadcasting their location to sensors in the surrounding landscape.
Julian Oliver often takes the opposite tack, developing hyper-predatory
machines intended to show the extent to which we are surrounded by sensing
machines, and the kinds of intimate information they’re collecting all the time.
These are noteworthy projects that help humans learn about the existence of
ubiquitous sensing. But these tactics cannot be generalized.
In the long run, developing visual strategies to defeat machine vision algorithms
is a losing strategy. Entire branches of computer vision research are dedicated to
creating “adversarial” images designed to thwart automated recognition systems.
These adversarial images simply get incorporated into training sets used to teach
algorithms how to overcome them. What’s more, in order to truly hide from
machine vision systems, the tactics deployed today must be able to resist not
only algorithms deployed at present, but algorithms that will be deployed in the
future. To hide one’s face from Facebook, one would not only have to develop a
tactic to thwart the “DeepFace” algorithm of today, but also a facial recognition
system from the future.
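The adversarial dynamic described above can be illustrated in a deliberately simplified form, with a linear classifier standing in for a real recognition system; the weights and the single gradient-sign step below are illustrative assumptions, not how DeepFace or any deployed system actually works.

```python
import numpy as np

# A hedged sketch of the adversarial-image idea: for a linear classifier
# (a stand-in for far more complex recognition systems), nudge every
# "pixel" slightly in the direction that most changes the score, and the
# predicted label flips while the image itself barely changes.
rng = np.random.default_rng(1)
w = rng.normal(size=256)          # classifier weights over 256 pixels
x = rng.normal(size=256)          # an input image, flattened

score = w @ x                     # sign of the score is the predicted label

# Gradient-sign (FGSM-style) step: each pixel moves by +/- epsilon along
# sign(w), with epsilon chosen just large enough to push the score past zero.
epsilon = 1.1 * abs(score) / np.abs(w).sum()
x_adv = x - np.sign(score) * epsilon * np.sign(w)

print(np.sign(score), np.sign(w @ x_adv))  # the label flips
```

The asymmetry the essay points to is visible even here: once a defender has examples like `x_adv`, adding them to the training set lets the classifier relearn the boundary, so the perturbation that works today stops working tomorrow.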