The Role of the Artist in Society’s Latent Space.

Mike Heavers
5 min read · Apr 28, 2024


In the age of AI, artists are crucial to examining and subverting the overfitted aspects of our culture.

Latent Space Landscape Generation — by Author

Latent Space: A Place for Dis-Understanding

In the process of AI model training, there's a concept called the Latent Space: a way of essentially opening up the model's mind mid-training to see what it imagines as it attempts to make sense of our world. It's like watching a brain develop from a childlike understanding to mastery of its field, only in hyper-speed.

These mid-stream imaginings are fascinating in their creativity. In the Latent Space, faces are blurry and half-formed, contorted and swirled to resemble strange pseudo-species, and poses defy the body mechanics that limit us in the physical world. Latent worlds themselves are a place where gravity is ill-defined, and where nature bleeds into infrastructure. The benefit of all this is that it can invite us to question the very societal constructs we might have previously accepted as fact, as we create our art, tools, and innovations.

Latent Space Image Training on Masked Dataset — by Author

As opposed to the near-ubiquitous text-to-image generation modalities we use today (the image you get at the other end of your Midjourney /imagine prompt, for example), the latent space greatly benefits from human collaboration — to smooth out the grotesque uncanny valley, to add detail, to figure out what, in the midst of so much weirdness, resonates with popular culture, and to watch the evolution of a model's training midstream to learn how it is learning.
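If you want to peek at this for yourself, the sketch below is one rough way to do it at generation time rather than training time (training-time visualization takes more plumbing). It assumes a recent release of Hugging Face's diffusers library and a GPU; the checkpoint, prompt, and step interval are illustrative placeholders, not any artist's actual setup. Every few denoising steps it decodes the half-formed latents into an image so you can watch the picture cohere out of the blur.

```python
# Rough sketch: save snapshots of the latent space while a diffusion
# model generates an image. Assumes a recent Hugging Face diffusers
# release; the checkpoint and prompt are placeholders.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-1",  # any Stable Diffusion checkpoint works
    torch_dtype=torch.float16,
).to("cuda")

def peek(pipe, step, timestep, callback_kwargs):
    """Every 5 steps, decode the half-denoised latents and save them."""
    if step % 5 == 0:
        latents = callback_kwargs["latents"]
        with torch.no_grad():
            image = pipe.vae.decode(
                latents / pipe.vae.config.scaling_factor
            ).sample
        pil = pipe.image_processor.postprocess(image, output_type="pil")[0]
        pil.save(f"latent_step_{step:03d}.png")
    return callback_kwargs

pipe(
    "a landscape where nature bleeds into infrastructure",
    num_inference_steps=30,
    callback_on_step_end=peek,
    callback_on_step_end_tensor_inputs=["latents"],
)
```

The earliest frames are exactly the blurry, half-formed territory described above; the polish only arrives in the last few steps.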

These days, though, multimodal AI has gotten so good at understanding us as a homogeneous bundle that its aesthetics have become uber-polished and hyperreal — in a way that makes it difficult to generate any of the creative, mis-figured aesthetics of yore, yet trivial to conjure images via a text field on OpenAI's site or inside Discord, leaving us with a sort of bland, middle-of-the-mall normality across our renderings.

Meanwhile, the text-to-image interface of these models is so streamlined, and the models so widely versed, that we forget to tinker with the internals (or perhaps even with the end results). Our iteration comes almost exclusively in the form of prompt engineering, which is increasingly constrained by its creators' meta-prompts, constantly trying to anticipate misuse and erring, sometimes extremely, on the side of caution or over-representation.

“America’s Founding Fathers, Vikings, and the Pope according to Google AI”, post on X by EndWokeness. Google took down its Gemini image generation feature shortly thereafter.

Where does this leave the artist?

I wanted to hear from artists themselves how they are thinking about and using AI, so I asked my friend Peter Burr, a digital artist who has made a career of subverting dominant aesthetics and narratives.

Natural Contacts by Peter Burr and Mark Fingerhut. “A 24-hr durational piece of malware that slowly transforms your computer into a verdant jungle.”

I asked Peter how his work has changed with the rise of AI. He mentioned that Stable Diffusion was able to help him quickly generate concept sketches for pitch decks and grant applications, and that ChatGPT helped him ideate on the narratives that weave throughout his projects. In some sense, for Peter as well as many others, myself included, AI has gone from deep dreamer to productivity tool — taking care of our unwanted tasks.

The Supersensorium and the Trouble with Overfitting

This shift could perhaps be interpreted as a desire to build up tools and defenses against the inundation of tech and capitalism, which seek to extract every ounce of our attention and productivity.

Writer and scientist Erik Hoel refers to our present technological environment as the "Supersensorium," and his "Overfitted Brain Hypothesis" describes what happens to our minds underneath its weight.

When you train a machine learning model too heavily on a particular dataset, it gets great at solving problems within that data but bad at generalizing to new information. This is called overfitting. And it isn't just limited to AI.
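To make the machine learning sense of the term concrete, here is a toy sketch (the dataset and the choice of NumPy and scikit-learn are mine, and it has nothing to do with any model mentioned in this piece): it fits the same noisy points with a modest polynomial and an extravagant one.

```python
# Toy illustration of overfitting: a high-degree polynomial memorizes
# noisy training points, then stumbles on points it hasn't seen.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(30, 1))
y = np.sin(2 * np.pi * X).ravel() + rng.normal(scale=0.2, size=30)
X_train, y_train = X[:20], y[:20]   # data the model sees
X_test, y_test = X[20:], y[20:]     # "new information"

for degree in (3, 15):
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    model.fit(X_train, y_train)
    print(
        f"degree {degree:>2}: "
        f"train error {mean_squared_error(y_train, model.predict(X_train)):.3f}, "
        f"test error {mean_squared_error(y_test, model.predict(X_test)):.3f}"
    )
```

The high-degree fit typically reports near-zero training error and a much larger test error: great within its domain, bad at new information.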

Since the Mesozoic era, animals have dreamed as a way to introduce nuance to their routine lives. Without dreams, the hypothesis goes, our own minds are also constantly at risk of overfitting. While sleep may be a cerebrospinal scrubbing of unnecessary memories, dreams introduce nuance to the things we do remember — they let our minds extrapolate and prepare us for new information and unexpected circumstances: to stay alive, in evolutionary terms, and to innovate, in modern times. But should they become overfitted, all those dreams do nothing.

The Artist as Seer

In a way, the artist could be seen as serving a similar function for society. They take a look at the overfitted aspects of our culture and try to spot the gaps, the cracks, the untold stories. They’re Seers of sorts, reaching in to untie the knot in our brains, introducing nuance to prepare us for unforeseen futures. If our present moment is any indication, those unforeseen futures are coming with increasing force and pace.

No potential future looms more ominous to me than climate collapse.

I recently attended "This is the Future," an exhibit by the artist Hito Steyerl at the Portland Art Museum.

“This is the Future” Exhibit by Hito Steyerl

Steyerl's narrative centers on Heja, who is sentenced to prison by a neural network that calculates she will one day commit a crime. In prison she cultivates a garden, which she protects from the guards by hiding it in the future. There, the plants evolve through the predictive powers of the neural network. These "Power Plants" learn to remedy a range of society's ills: social media addiction, a culture of overwork. Rather than purely pointing out the dangers of AI, the piece signals a state of symbiosis between humans and AI — technology helping humans protect themselves from technology.

In the seemingly unstoppable, tsunami-sized wake of AI and technological progress, that is the sort of relationship I'd like to have with machines.
