Next Frontier

An Art Exhibition That Questions What It Means to Be Human

“Uncanny Valley,” on view at the de Young Museum in San Francisco, examines how artificial intelligence is shaping contemporary life.

Trevor Paglen, “They Took the Faces from the Accused and the Dead... (SD18),” 2019; © Trevor Paglen, Courtesy of the Artist and Altman Siegel, San Francisco.

At its birth in the 1950s, artificial intelligence augured a brave new world of machine learning and neural networks, of computers that would be able to extrapolate from data and even demonstrate creativity. We now know, of course, that the technology would eventually be put to more prosaic—and darker—uses. Today, AI mines and exploits our data in ways so multifarious and streamlined that its workings are almost invisible; some of these processes are intertwined with activities we love, just elements of modern-day convenience. Bingeing on Netflix feeds data to the $170 billion company’s algorithms, which predict what new series you’ll like even before they’re produced. Meanwhile, AI has been used to create increasingly, and disturbingly, realistic “deepfake” videos capable of sowing political discord or inciting violence.

“Uncanny Valley: Being Human in the Age of AI,” an exhibition open until October 25 at the de Young Museum in San Francisco, attempts to expose the multilayered channels through which AI technology is shaping contemporary life. “A key question underlying this show is whether our understanding of humanness is still valid or whether it’s time to redefine it,” says Claudia Schmuckli, the museum’s curator-in-charge of contemporary art. Fifteen artists and collectives, including closely watched figures like Simon Denny, Hito Steyerl, Trevor Paglen, and Martine Syms, arrive at very different—sometimes damning, sometimes irreverent—stances on where AI is taking us.

“Uncanny Valley” takes its title from the term coined in 1970 by roboticist Masahiro Mori to describe the unease triggered by lifelike machines that almost, but not quite, mimic humanity. The exhibition, located near Silicon Valley, where many of AI’s advancements originated, goes beyond the dystopian sci-fi fantasies of yesteryear (Stanley Kubrick’s 1968 film 2001: A Space Odyssey, say, with its homicidal HAL computer) to reflect on how AI affects our new normal. The artists dissect AI’s micro- and macro-level impacts, from how it’s recalibrating our personal relationships to the ways it has enabled surveillance states, economic inequity, and, most catastrophically, planetary-scale ecological crises.

(Left to right) Simon Denny, “Amazon worker cage patent drawing as virtual King Island Brown Thornbill cage (US 9,280,157 B2: ‘System for transporting personnel within an active workspace,’ 2016)”; The Zairja Collective, “Neuropit #13 (visual cortex neuron),” from the “Neuropit” series, 2016–present; Brain imagery: The Allen Institute.

Denny, for instance, draws a link between the near-future extinction of one of Australia’s rarest bird species and human rights abuses by tech behemoths. “The more we learn about the disastrous ecological consequences of big data,” the artist says, “the more animals we witness giving their life to warn humans of toxicity.” A 2019 sculpture he contributes to the show is based on a patent diagram that Amazon filed for a tool to move workers around its warehouses. Denny’s 3D-printed cage is populated by an augmented reality (AR) thornbill generated by feeding all existing visual and sonic data of the ultra-rare bird into an AI system. The bird is visible on an AR smartphone app, and when enough visitors crowd around the cage, the excitable simulation flutters its wings and emits a call so rare that few have heard it in the wild. Juxtaposing the raw abjection of an architecture of containment with the beauty and precarity of a simulated animal, Denny’s sculpture is both sublime and horrifying, balancing on the knife edge of the uncanny valley. “The bird is the proverbial canary in the coal mine of our anthropocentric capitalocene,” Denny says.

Other artists in the show take a lo-fi approach to the complex subject. Scattered around the gallery like a toppled Ugo Rondinone rock sculpture is Agnieszka Kurant’s A.A.I. 10–15 (2017), a series of neon-colored termite mounds built by colonies of the insects out of brightly colored sand, gold, and crystals. The work’s title is an acronym for “Artificial Artificial Intelligence,” a term allegedly coined by Amazon to describe menial labor that humans still perform more efficiently than current AI systems. Kurant applies the idea to termites. Her sculptures, each of which takes the bugs some five months to make, are meditations on outsourced non-human labor; the Instagram-friendly mounds can also be read as a critique of the art world, which all too often subsists on unseen work that is unpaid or underpaid.

Pierre Huyghe, “Exomind (Deep water),” 2017; Courtesy of the artist/Marian Goodman Gallery, New York/Hauser & Wirth, London/Esther Schipper, Berlin/Chantal Crousel, Paris/Taro Nasu, Tokyo/The National Museum of Modern Art, Tokyo.

Notably, the show’s most established artists, Hito Steyerl and Trevor Paglen, are the most overt critics of AI. Their works examine, respectively, an AI surveillance system trained to recognize the sound of windows being smashed (a technology used in home-safety devices and alarm systems) and another that scans American mugshots to develop a database for facial-recognition software (a tool now used by many law enforcement agencies, despite warnings from watchdog groups that it exhibits racial biases). Together, Steyerl and Paglen paint a stark picture of how the technology dehumanizes us and perpetuates cultural and socioeconomic inequality.

The show’s younger artists, perhaps especially attuned to the double-edged nature of AI, hijack fiction and near-future scenarios with dizzying results. Lawrence Lek’s seductive feature-length film AIDOL (2019) tells a tragicomic tale of the Love Island ilk: a failing pop star enlists an entrepreneurial AI satellite to write her swan-song halftime show at the upcoming E-Sports Olympics. Addressing the racial and gender biases embedded in AI-enabled voice-assistant technology, Martine Syms refers to her text-based avatar Mythiccbeing (2018) as an “anti-Siri.” Texting a number provided in the exhibition delivers a stream of frazzled messages from the self-centered avatar, who’s keen to share gossip “dispatches” from her own life and is hell-bent on not answering the questions you ask.

Using the Unreal Engine, Forensic Architecture generated thousands of photorealistic “synthetic” images, situating the Triple-Chaser in approximations of real-world environments. Colored “masks” tell the classifier where in the image the Triple-Chaser grenade exists. Image: Forensic Architecture/Praxis Films, 2019.

“The word ‘human’ refers to more than a biological category for our species,” says Christopher Kulendran Thomas, whose video installation Being Human (2019), made in collaboration with Annika Kuhlmann, is the show’s most provocative work. “It’s an ideology based on a religious belief from the Western world. Perhaps it’s the necessary fiction needed to maintain an empire.” The video traces the impact of the nascent contemporary art market in Sri Lanka, picking apart the veneer of woke individualism that trails the global parade of art fairs and biennales.

“Uncanny Valley” makes clear that our relationship to AI is not as one-dimensional as automation stealing our jobs, as President Obama warned in his 2017 farewell address, or ushering in an era of full-time leisure. Seeping deep into the fabric of our work, life, relationships, and very identities, AI remaps what it means to be human. The show presents a cross-section of this reality, depicting how artificial intelligence is engineering forms that are perhaps even more us than we are. Naturally, the results are unsettling.

(All images Courtesy of the Fine Arts Museums of San Francisco)

This story appears in the March issue of Surface.
