Invites Artist Presentation: Jake Elwes & Cameron Thomas
Joined by collaborator and computational sound artist Cameron Thomas, artist Jake Elwes acts as VJ, creating a durational live audiovisual performance in parallel to his current exhibition, CUSP. Visitors are encouraged to drop in and stay for as long as they wish.
The live performance is an experimental collaboration using machine learning algorithms. The visuals are produced by a generative neural network trained on a photographic dataset of marsh birds; having learnt the birds’ visual properties, the network can create new forms that fluctuate between species. The performer guides the network through what it has learnt.
The audio model has been similarly trained on birdsong, as well as on works from the classical repertoire inspired by birds: Vaughan Williams’ ‘The Lark Ascending’ and Jonathan Harvey’s ‘Bird Concerto with Pianosong’. The audio samples are generated in real time by a machine learning algorithm (created by Cameron Thomas) with no prior knowledge of the mechanics of sound or musical structure.