Interview: Karim Fanous, Abbey Road Red

‘The process of creating and listening to music is linked to our human selves and nature. It’s relational, social and linked to the stories of our daily lives. Technology doesn’t replace that. But it can add to it.’

  • By Maya Radcliffe
  • 6 Aug 2020

Building on a legacy of more than 80 years of groundbreaking technological advances at the Studios, Abbey Road Red is Abbey Road Studios' open innovation department, designed to support the endeavours of the brightest music tech entrepreneurs, researchers and developers. It runs a unique incubation programme for the most promising music tech startups, and collaborates with the brightest minds in academic research.

One of the alumni of the incubator programme is LifeScore, co-founded by Tom Gruber, who also co-founded Siri, the voice assistant acquired by Apple.

LifeScore is an adaptive music platform that creates unique, real-time soundtracks. World-class musicians and composers create the musical building blocks, and this raw material is then processed by the company's proprietary AI platform to generate soundtracks that adapt to a listener's environment.

M recently spoke to Karim Fanous, innovation manager at Abbey Road, about the future of adaptive music, the industry’s approach to AI and Abbey Road’s role in the evolution of music.

'We’ve been trying to play our part here and have been exploring AI in music at Abbey Road Red for the last five years. We’ve witnessed first-hand the amazing potential that it has as an enabling force in music.'

You’re quoted as saying that ‘AI will just fold into our daily lives’, but many artists, musicians and industry workers fear that very thing happening. How do you address these concerns?

The first thing I would say to help address any fear around this is that technology will not replace our interaction with music.

The process of creating and listening to music is linked to our human selves and nature. It’s relational, social and linked to the stories of our daily lives. Technology doesn’t replace that. But it can add to it.

The same goes for our listening moments. When we experience music actively or passively there is always context around us and the experience. It doesn’t feel like this is directly interchangeable with an AI or piece of music tech. But you can use it in the process.

Take the synthesiser as an example. When Moog took it mainstream in the '60s, I can imagine it being fascinating to some people while terrifying others! But look what happened. Here we are five decades later with myriad new genres that have sprung up on top of it, new creative possibilities in our soundscapes, and improvements in our workflows and automation on the production side.

So with AI, imagine a creative world in which you and your collaborators are provided with a tool that could birth new genres, increase your musical expression and improve your workflow in the same way that the synthesiser did. Thinking about the new genres and sounds in itself is mind-blowing. I try to imagine it and I can’t, and that’s the point.

Just this week, as a judge on a hackathon for Patch XR, I've seen some awesome sounds and visuals created using audio and visual synthesis in its VR environments, from hybrid instruments played live to live coding and generative sounds.

To the point of AI folding into our daily lives, I would say that it’s just the way things are now. When you use voice commands, Uber, Google, follow a Netflix recommendation or look at an online advert you are engaging directly with AI or something it has played a part in determining.

When you use a music service, algorithms, machine learning and artificial intelligence, or a combination of these, have been used to power personalised or contextual playlist creation, artist and track categorisation and more, even what your homepage looks like.

'We’re taking our time to understand the technology, whether it is machine learning, AI, blockchain or immersive audio tech so that we can understand how it can enhance experience for artists and fans and help develop it.'

We’ve been trying to play our part here and have been exploring AI in music at Abbey Road Red for the last five years. We’ve witnessed first-hand the amazing potential that it has as an enabling force in music. That’s why we’re excited about it and see it as a positive force if used in the right way.

To take some examples from our alumni: generative technologies can help you make music even if you don't know how to play an instrument (Humtap). Machine learning can help an intelligent microphone system learn which sounds your voice is making and translate them into soft synth sounds in real time (Vochlea), which in turn helps you create demos quickly without getting bogged down in setting up templates or sounds, and opens up new possibilities for performing or recording live. And AI will help adaptive music platforms make brilliant decisions about how to evolve music for you in real time, in a personalised way (LifeScore).

Do you think that the music industry needs to be cautious in its approach to AI?

Yes! Who doesn’t? And that counts for all tech. The music industry is engaging with all kinds of technology that will impact music going forward. We’re taking our time to understand the technology, whether it is machine learning, AI, blockchain or immersive audio tech so that we can understand how it can enhance experience for artists and fans and help develop it.

'We work with the industry and artists and we provide an interface for them to learn about, understand and access this technology. We like to think we’re trying to help change the industry for the better as a result.'

Do you believe that technology is changing the industry for the better?

Another resounding yes! From a consumer perspective, we have moved from single CD purchases to having the whole world's music library, ultra-personalised to our tastes, on demand in our pocket. Artists have more visibility and connection with their fans. From a studio perspective, tech helps artists do so much more before and after the studio session to make the most of the recording and production process.

How important is Abbey Road Red to the evolution of the music industry as we know it?

Our mission at Red is to build on the incredible 89-year legacy of innovation at Abbey Road Studios and link that past to our industry’s future. We want to try and help introduce new universally adopted technologies into the music industry in the same way our predecessors did, from stereo to ADT (Artificial Double Tracking) and the blueprint of the modern recording console, but this time by helping the brilliant founders we discover and meet.

We work with the industry and artists and we provide an interface for them to learn about, understand and access this technology. We like to think we’re trying to help change the industry for the better as a result. We’re trying our best at the very least.

'Our mission at Red is to build on the incredible 89-year legacy of innovation at Abbey Road Studios and link that past to our industry's future.'

What does the future of adaptive music look like to you?

It looks like what you want it to sound like, literally! Artists can have evolver tools which evolve pieces of music that they design and play; fans can have soundtracks that adjust to suit their activities and context; wellness professionals can design personalised music for their clients; medical professionals could prescribe music that adapts to counteract swings in mood or reduce anxiety.

With this technology the possibilities are really endless, and the brilliant team at our alumnus LifeScore has proved that you can take this technological possibility and marry it with human performance and sounds.

Find out more about Abbey Road Red and LifeScore