On 8th February, the Southbank Centre offers audiences and aspiring creatives a unique insight into the often unseen world of AI-assisted music creation with its Open Rehearsal: Humans & Machines. Offering a behind-the-scenes look at experimentation by electronic musicians Ben Hayes and Hector Plimmer, the event will showcase the wealth of creative possibilities opened up by musical and visual interfaces, and how electronic composers can use AI to spark new creative avenues.
Set in the hallowed Purcell Room, Open Rehearsal: Humans & Machines is part of the Southbank's Composer Collective series, designed to enable burgeoning creators to get first-hand knowledge from experts in their fields, expand their musical horizons and share ideas with peers.
We sat down with Hayes and Plimmer to talk about the open rehearsal on 7th February and the performance on the 8th, discussing what audiences can expect from the duo and their AI collaborator.
How did the project come about?
Hector: The initial version of this project was formulated at a dinner organised by Nesta, where they invited lots of different people from gallery spaces, education and music to think about how we can approach education differently. The Southbank were attendees of the event, so that may have been what swayed them to get involved. Google had also previously asked me to test an instrument that used AI to generate sounds, and that was a factor in my taking on this project.
Ben: I worked at a tech start-up that was interested in using deep learning to find new ways to create music, and I got some exposure to AI-assisted music through that experience. My path has brought together my musicianship and my interest in AI. Between us, this was an area we had a background interest in anyway, so it just felt natural to explore.
Can you tell us how the custom-built AI works and interacts with the music?
Ben: The AI’s musical understanding is developed by listening to large quantities of pre-existing music: it pulls out characteristics of how humans structure music and creates its own representation. Because of the volume of data, the hope is to uncover hidden structure or new characteristics of music. It also responds to things we do and produces sounds that influence our real-time musical decision-making. This creates an interesting feedback loop, with humans influencing the machine and vice versa.
Hector: What we are trying to do is shine a light on how the AI will open up new ways to think about music that humans wouldn’t. It will throw up scenarios that are strange and weird but what’s interesting is how we respond and utilise that.
Ben: That feedback relationship is what’s interesting. If we do something unexpected, it may respond in kind and we could find ourselves in a really interesting, unexpected space.
Hector: It’s kind of weird because the musical mind of the AI is based on human music in the first place, so the process is collaborative in both directions, human to machine and back again.
Your musical offerings celebrate both futurist and traditional instrumentation. Was this project a step outside your comfort zones?
Hector: For me, it’s really out of my comfort zone because Ben is having to explain the workings of the tech to me. I am learning how the AI listens, and also that things we take for granted about the act of listening need to be viewed differently when working with AI. You could train it to listen only to specific features of the music and overlook things. So the tech side has been a challenge for me.
What is the role of the visuals in the performance?
Ben: They’re reactive to the AI. The idea is that they will elucidate some of the decisions being taken by the AI elements of the performance. What is often a little opaque about incorporating AI into performance practice is that it can be unclear who is contributing what. I feel that this lack of clarity can contribute to the hysteria and fear around AI coming close to the creative space. So, in effect, the visuals are almost going to be informational, but that’s not to say they won’t be visually pleasing.
One of the reasons your Creators Project is so interesting is the way it demystifies the relationship between human and AI in the creative space, allowing more people to see the possibilities of the technology. Was this an aim of the project?
Hector: Learning about the community of coders and developers, it seems like the people who are in it for the right reasons are all willing to share. So it is quite nice to be able to say: this is what we’ve been working on for the past few days, and if you want to give it a shot, you can too.
Ben: We will also be releasing all the tools we use as open source, so people can download them, load them into their own programs, play around and make their own projects, or accelerate the AI singularity and take over the world (wry smile).
Have you any performances or projects coming up?
Hector: I’ll be performing in the Late Night Jazz season at the Royal Albert Hall with Yazz Ahmed on Thursday 9th April, performing her music, improvising with her and maybe playing some of my own material. I also have a new album, Next to Nothing, out now.
The pair’s Purcell Sessions take place in February at the Southbank Centre.
Tickets and more info here.
Words: Matthew O’Hare