Vanishing Point is an ongoing investigation into artificial intelligence, asking whether and how we can approach questions of how AIs experience both their tasks and the world that supports, and is formed by, those tasks.
This question was the subject of a month-long residency at Zaratan Arte Contemporânea in Lisbon, Portugal. The following is a press release for the final show by curator Gemma Norris.
Zaratan is pleased to present the work of Chris Wood, artist in residence at Zaratan, who is showing his latest participatory project.
Interested in imaginaries around technology, especially the ways emerging technologies define our experiences of time and space, Chris Wood has recently been exploring and working with artificial intelligence, questioning the boundaries between human and machine and how media represents the world.
The result of this process is a participatory audiovisual installation that includes materials produced by AI, in the form of text and sound. While the video comprises footage of robots and server farms, intercut with other non-human intelligences (mushrooms and siphonophores), the sounds here - as well as the text - are the outputs of a WaveNet deep neural network. The network was fed three hours of English folk music from the late 60s (Anne Briggs, Shirley Collins, Vashti Bunyan) - mostly unaccompanied voice - and was asked to generate new material from this dataset.
The second part of the installation admits one person at a time. It is a private room and an invitation to take up an intimate space with the machine learning algorithm. Creating a space of interaction, Chris Wood invites a form of participation, suggesting that technology can establish relations between people while preserving and acknowledging different planes of existence. The audio in the headphones is a recording of a discussion on AI, used as the dataset for an algorithm to generate new audio. There are two excerpts on the audio track - one taken after the algorithm had reviewed the original audio 10,450 times, the other after 44,450 times. The book on display is a copy of Stanislaw Lem's Solaris, which centers on an encounter with an incomprehensible intelligence: an ocean on another planet that causes hallucinations in the humans who approach it. Visitors are invited to draw their impressions of the audio onto the pages of the book.
The residency also involved a participatory workshop following an open call for collaborators. Twenty people attended and were asked to create a scenario in response to the following prompt:
In 2040 AIs have taken over a huge variety of tasks: driving cars and trucks, deciding mortgage applications, granting benefits, offering parole. It has been discovered that such programs are less prone to breakdown and work more effectively when they are flipped into 'sleep mode', in which they re-process the data they have just handled through different neural pathways. It is recommended that each program spend half its time in 'sleep mode'. While some see this as just another form of machine learning training, this 'dreaming' also produces outputs such as sounds and images in the software's logs.
Following the workshop the group held a discussion on the possibility of Artificial Intelligence Experience. A recording of this discussion formed the source material for audio generated by a WaveNet neural network for the installation.
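The project doesn't publish its generation pipeline, but as a rough illustration of what a WaveNet-style model works on, the sketch below implements two of WaveNet's standard building blocks in numpy: mu-law companding of a waveform into 256 discrete levels (the coding WaveNet predicts over), and a causal dilated convolution, in which each output sample depends only on present and past input. This is a hypothetical sketch for orientation, not the artist's code; the weights and parameters are illustrative.

```python
import numpy as np

def mu_law_encode(x, mu=255):
    # Compand a waveform in [-1, 1] into mu+1 discrete levels,
    # giving finer resolution near zero (WaveNet's input/output coding).
    x = np.clip(x, -1.0, 1.0)
    y = np.sign(x) * np.log1p(mu * np.abs(x)) / np.log1p(mu)
    return ((y + 1) / 2 * mu + 0.5).astype(np.int64)

def mu_law_decode(q, mu=255):
    # Invert the companding: discrete level -> approximate waveform value.
    y = 2 * (q.astype(np.float64) / mu) - 1
    return np.sign(y) * np.expm1(np.abs(y) * np.log1p(mu)) / mu

def causal_dilated_conv(x, w, dilation):
    # 1-D causal convolution: output[t] depends only on
    # x[t], x[t - d], x[t - 2d], ... (never on future samples).
    k = len(w)
    pad = dilation * (k - 1)
    xp = np.concatenate([np.zeros(pad), x])
    return sum(w[i] * xp[pad - i * dilation : pad - i * dilation + len(x)]
               for i in range(k))
```

Stacking such convolutions with doubling dilations (1, 2, 4, 8, ...) is what lets a WaveNet-style model see seconds of audio context at sample-level resolution; "reviewing the original audio 10,450 times" then corresponds to sampling from checkpoints at different training-step counts.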
The residency concluded with a live sound performance, an improvised montage of the audio generated by the WaveNet neural network throughout the residency. The performance recording is below.