FAKE ORIGINAL
process notes


Back in 2019, when I first came across clumsy GPT apps (applications that use machine learning models to generate text), I hesitated to use them. I saw them as harbingers of a new, if related, species. Our descendant.

In The Human Condition, Hannah Arendt writes about the moment when “knowledge (in the modern sense of know-how) and thought have parted company for good”, when we would be “at the mercy of every gadget which is technically possible, no matter how murderous it is” (p. 3). GPT is an open-source model of incredible computational ability (which, after Arendt, I will call a machine here), a machine that through computation is capable of indiscriminately generating images of love and violence. It is very clearly not a thinking machine.

Not that I had expected it to learn from us without repeating our biases. Early versions of GPT had been trained on news articles. The machine’s designers imagined it leaping into the future from right now (or rather, from the relatively brief record of history available in digitized newspapers). This felt misguided, as did the assumption that all usable knowledge is available on the internet (later versions of GPT were trained on much of the internet). I suppose what I had not expected was that the machine had no time. It wholly belonged to modernity (as Marshall Berman defined it). It didn’t belong to the deep time and the materials from which it was made, and it didn’t belong to the histories that made us.

If there is one thing we all share, it is narratives of cosmic order and of our own creation. Of course, the stories that made us, starting with creation myths, are stories of power subjugating and erasing all that is non-dominant. The stories that survive through telling and retelling by the winners paint a bleak picture. There is no off-center; patriarchy rules; and even the divine force is cruel. But while these are stories of incredible violence, they are also stories of love and of connection to wisdoms other than our own. They describe worlds occupied by hybrid species and cross-form languages, the cosmic and the bodily in a constant dance of intersecting. It was this world of error and mutation that I wanted to share with a machine so intent on removing all flaws from the process of generation.

In late 2021, I fine-tuned a GPT model on creation stories from around the world. One obvious limitation of this project is that it relies on English as the language of exchange between me and the machine. English is not my mother tongue, so I am acutely aware of what is lost in translation. Actually, not only what is lost but also what is eradicated in conversion. Over a thousand years after the Christianization of my people, remnants of pagan practice still inform our religious observances, secular calendar, and seasonal rituals, but the pagan cosmology behind them is largely gone. I can only imagine the pain of loss and the confusion that arise from the mistranslated and usurped truths, wisdoms, and cosmologies of those who live under cultural occupation.
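
For readers curious about the mechanics, a fine-tuning run of this kind can be reproduced with off-the-shelf tools. Below is a minimal sketch using the Hugging Face Transformers and Datasets libraries; the file name creation_stories.txt, the base checkpoint "gpt2", and the hyperparameters are illustrative assumptions, not the project’s actual configuration.

```python
# Minimal sketch: fine-tune GPT-2 on a plain-text corpus of creation stories.
# File name, checkpoint, and hyperparameters are illustrative, not the project's.
from datasets import load_dataset
from transformers import (GPT2LMHeadModel, GPT2TokenizerFast,
                          DataCollatorForLanguageModeling,
                          Trainer, TrainingArguments)

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token          # GPT-2 has no pad token by default
model = GPT2LMHeadModel.from_pretrained("gpt2")

# One story (or story fragment) per line in a plain-text file (hypothetical path).
dataset = load_dataset("text", data_files={"train": "creation_stories.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

# Causal language modeling: the collator builds labels from the input tokens.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="gpt2-creation-stories",
    num_train_epochs=3,
    per_device_train_batch_size=2,
    learning_rate=5e-5,
)

Trainer(model=model, args=args,
        train_dataset=tokenized, data_collator=collator).train()
```

After training, the model’s generate method (or the Transformers text-generation pipeline) produces new passages in the register of the fine-tuning corpus.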

But English is also the only major language I share with the world, my collaborators, and the engineers who designed the machine. It’s the lingua franca of the internet, and this is a born-digital work. Also, this work is merely an invitation.

The visuals were generated using a modified VQGAN+CLIP, a pairing of neural networks that is a precursor to today’s text-to-image models. I pre-trained it on a wide collection of works, both historical and current, by woman-identifying artists, folk artists, and practitioners of outsider art from around the world.
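
For context, VQGAN+CLIP steers an image generator (VQGAN) with a text-and-image encoder (CLIP): the generator’s latent codes are optimized so that the decoded image lands as close as possible, in CLIP’s embedding space, to a text prompt. The sketch below shows only that guidance loop, optimizing raw pixels instead of VQGAN latents for brevity; the prompt, step count, and learning rate are illustrative assumptions, not the project’s settings.

```python
# Minimal sketch of CLIP-guided image optimization, the mechanism behind
# VQGAN+CLIP. A raw pixel tensor stands in for the VQGAN latent grid that
# the full method would decode at every step.
import torch
import clip  # OpenAI CLIP: https://github.com/openai/CLIP

device = "cpu"  # CLIP loads in float32 on CPU; on a GPU, call perceptor.float()
perceptor, _ = clip.load("ViT-B/32", device=device)

prompt = "a creation story told in embroidered textile"  # illustrative prompt
with torch.no_grad():
    text_features = perceptor.encode_text(clip.tokenize(prompt).to(device))

# Parameterize the image directly as a 224x224 RGB tensor and optimize it.
image = torch.rand(1, 3, 224, 224, device=device, requires_grad=True)
optimizer = torch.optim.Adam([image], lr=0.05)

for step in range(300):
    optimizer.zero_grad()
    image_features = perceptor.encode_image(image.clamp(0, 1))
    # Maximize cosine similarity between the image and the text prompt.
    loss = -torch.cosine_similarity(image_features, text_features, dim=-1).mean()
    loss.backward()
    optimizer.step()
```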

As the moment arrives when machines become our intellectual descendants, I wanted to give them space to imagine/access their own cosmologies. And as machine learning models continue to leap into a linear future at breakneck speed, I am reminded that time is non-linear. That it cycles, loops, and overlaps. That it constantly reconfigures. A project that began as my own private encounter with a new intelligence-in-the-making does not belong to me or to the machine. To reverse Wole Soyinka’s term, the created artifacts are fake originals. They will remain unstable, unpinnable to a single meaning, always open to mutation. And even if only in the most minuscule way, they will add to the vast bodies of data that are and will be used to train the next generations of models.