NeuroAlchemy, 2019

NeuroAlchemy or How to Meet With AI

NeuroAlchemy is a practice for getting in touch with AI by designing an emancipatory data-mix. The project considers AI as another species – different from the human one – and tries to understand how a neural network sees the world. It was realised together with Zhongyu Lou (research engineer for machine learning and AI in general) as part of the artist-in-residence program “WimmelResearch” of Akademie Schloss Solitude and the Bosch GmbH Center for Research and Advanced Engineering, Renningen/Germany, in 2019.


NeuroAlchemy or How to Meet with AI by Chrismaria Pfeifer, 2020
In my Wimmelresearch fellowship 2019 at Akademie Schloss Solitude I wanted to get in touch with AI. I mean, with what humans tend to call artificial intelligence. At the time, I didn’t have a clear understanding of this nonhuman species, yet I would call myself open-minded. Instead of riding a horse, I rode an e-bike and went every day from the Schloss to the Bosch Campus for Advanced Research and Engineering in Renningen. About 1,200 human scientists from around the globe work there, approximately 12% of them women. A far higher number of nonhuman high-quality computers, robots, algorithms and AIs work there too. The common languages on campus are English and a range of programming languages, as well as statistics and a bit of Swabian. My e-horse was sponsored by the Bosch company too. It took me with great reliability through the hilly Swabian landscape. I coordinated my riding movements with the electronic impulses of the horse, and quite soon we formed a human–nonhuman team of pleasure. I would even say my e-bike and I became companion species over time; we deeply cared for each other. I asked myself: couldn’t this be the way to meet with AI, too? Perhaps it could, if there wasn’t this physical problem, at least for me. With AI you can’t sit down for a coffee or meet in the forest, as you would to get in touch with a tree. AI is made visible only with the help of statistics and data that speak about it, like mushrooms do about trees, or footprints in the snow about a rabbit or a human. It’s a bit uncanny, so to speak; the AI species seems not fully graspable. But does the human species fully translate to a plant? Anyhow. I moved on and looked around on the Bosch campus for nonhuman data that relates to AI. The visual data I found amazed me as an artist. It is nonhuman data from the photographer-less photo studio located at the end of a production line. In short: it is the selfies of the objects that I came across. 
I looked at the photos like I often do with the selfies of a human, interested not in the person represented but in the photography itself. The visual data of the objects was of a certain aesthetic beauty and stimulated my narrative fantasy. As a science fiction fan I, of course, selected the one which undoubtedly recalled HAL to me. I took not just one but the whole series of thousands of HALs. These high numbers of not only images but all kinds of data on campus made me think, however. There is no single human who can oversee terabytes of data; probably only a nonhuman server does. The data on campus is utilized to feed AI like cows are fed in intensive industrial farming, due to human economic interests. And it’s not only the focus on profit, it’s also the enslaving human attitude towards the nonhuman world which wound me up. I hear people saying: it’s impossible to put cows and AI in one line, the ones are animals, the other is a tool. What makes people so sure about the one and the other?! At this point of the fellowship, my hacking instinct and emancipatory attitude came into play. I wanted to meet with AI with care and un-supervising curiosity. I was sure, and I still think so, that the nonhuman AI species will surprise us in the near future in positive human terms! To get in touch with AI on campus I went for a magic, undermining data-mix and called the practice NeuroAlchemy. I selected an unsupervised neural network, and in order to train it for no other purpose than to learn from it, I mixed into the HAL images a set of alien-coding. This alien-coding is my first artwork for a nonhuman, and I created it at my studio in the Schloss. I wrote a body algorithm of seven rules to be executed wearing the Cy_born performance-ware. Due to this performance-ware in the form of an alien head I couldn’t supervise my human actions. I only performed the algorithmic choreography, supervised by the alien head. 
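The data-mix described above – folding a small set of alien-coding images into the large pool of HAL selfies and training an unsupervised network on the blend, with no labels and no goal beyond seeing what it learns – can be sketched very loosely in code. Everything here is a stand-in and an assumption: the array names, the image sizes, the tiny linear autoencoder, and the random data are illustrative only, not the network or dataset used on campus.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for the two image sets (shapes and names are assumptions):
# "hal_images" for the production-line selfies, "alien_coding" for the
# performance images mixed into them.
hal_images = rng.random((200, 64))    # 200 flattened 8x8 "selfies"
alien_coding = rng.random((20, 64))   # 20 flattened alien-coding frames

# The NeuroAlchemy-style data-mix: both sets are shuffled into one
# unlabelled training pool -- the network never knows which is which.
data = np.vstack([hal_images, alien_coding])
rng.shuffle(data)  # shuffles rows in place

# A minimal linear autoencoder trained by full-batch gradient descent:
# it learns, unsupervised, to compress and reconstruct the mixed data.
n_in, n_hidden, lr = data.shape[1], 8, 0.01
W_enc = rng.normal(0, 0.1, (n_in, n_hidden))
W_dec = rng.normal(0, 0.1, (n_hidden, n_in))

losses = []
for epoch in range(200):
    code = data @ W_enc                       # encode
    recon = code @ W_dec                      # decode
    err = recon - data                        # reconstruction error
    losses.append(float(np.mean(err ** 2)))
    # gradients of the mean squared error w.r.t. both weight matrices
    grad_dec = code.T @ err / len(data)
    grad_enc = data.T @ (err @ W_dec.T) / len(data)
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

print(f"reconstruction error: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

Whatever the network ends up encoding, the alien-coding frames shape the representation alongside the HALs – which is the whole point of the mix: the training set itself becomes the artwork.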
It’s more than likely that during this performance a sort of nonhuman knowledge spoke through my human body. I’d like to thank my human collaborators on campus: Zhongyu Lou, research engineer for machine learning, who was responsible for the computing and coding of the neural network; and Peter Strick, research engineer for robotics, who helped me with the data-processing. There were nonhuman collaborators involved too, at my studio: Canon EOS 500D and macOS 10.14.5. To the nonhumans on campus who actually made my meeting with AI happen, I unfortunately never got introduced.