Conversations: a guest post by composer and pianist Garreth Brooke

AI—with all its possibilities and possible risks—is upending every industry, including music. With AI able to compose pieces in the style of any composer, living or dead, many fear the technology will eventually edge out living musicians. Some musicians shun it. Others, such as composer and pianist Garreth Brooke, choose to engage with it.

In his latest EP, Conversations, Brooke, in his own words, “sought out some unusual sources of inspiration by duetting with three uncommon musical partners: a non-musician…a music-generating AI…and a houseplant…” The results were both musical and intriguing enough for me to contact Brooke and ask him if he’d be willing to write an article about how and why he created this music.

Garreth Brooke is a creative and sensitive musician who has appeared on No Dead Guys in an interview as well as being featured in an article on contemplative piano music. He is equally gifted as a writer and it’s an honor to feature him as a guest author on my blog.


A guest post by Garreth Brooke

I’m very grateful to Rhonda for inviting me to contribute again to her blog, of which I’m a regular reader and big fan. I work as a composer and teacher. My latest release is a short one called Conversations, and it caught Rhonda’s eye and ear. Before I explain a bit about the release, I’d like to briefly explore one of the key questions that led to it: where does music actually come from?

It’s a question that all composers ask at some point. How does this wonderful thing, music, which nourishes us so profoundly, actually come to us?

When I was in university I vividly remember reading a passage in Edward Elgar’s diaries—unfortunately I don’t have the reference to hand—in which he talks about the music flooding into him as if channeled from heaven. Even as a committed agnostic, this description matched how I experienced the rare and wonderful moments of composing when I was on top form, when the Muse was with me: the notes that I played just seemed to come to me, unprompted, inexplicable, wonderful in the true sense of the word. It’s an experience of elation that is very rare and very precious.

Teachers can learn a great deal from our students. I’ve tried to get all my piano students to compose at one point or another, with varying success. There’s a delicate balance to strike: too little theory leads to a hot mess; too much theory is stifling. Forrest Kinney’s gentle Pattern Play approach seems to work well with the largest number of students: through improvised duets with their teacher, most students become quite comfortable being creative, especially if you start when they are still young.

But there’s a difference between composing and inspired composing, and despite many years of trying, I’m a long way from figuring out how to guarantee the preconditions for a student to be inspired. Certain things are crucial—curiosity, focus, timing, a well-judged level of challenge—but even when all those stars align, you just can’t guarantee it. Similarly, I know that I am prone to visits from the Muse if I have listened to lots of other wonderful music in the last few days, am well-slept, and have access to a wonderful piano, but even all of these conditions together are not necessarily sufficient. Timing seems to play an important role for me, but it’s hard to define precisely what effect it has: sometimes the Muse comes very easily in the very early morning, sometimes I hit on inspiration when I have 10 minutes to spare in between teaching, sometimes anger or sadness seems to trigger something—but none of it is reliable.

In Conversations I sought out some unusual sources of inspiration by duetting with three uncommon musical partners: a non-musician (my artist partner, Anna Salzmann); a music-generating AI (Google’s Magenta); and a houseplant (attached to a MIDI Biodata Sonification Device). Each of them involved a strikingly different process, each led to some inspiration, which then led to music. Below I’ll explore a little of the process for all three, complete with musical examples.

I’ll begin with the houseplant, because it’s the one that seems to provoke the most questions. The MIDI Biodata Sonification Device reads fluctuations in galvanic conductance via electrodes attached to two of the plant’s leaves and transforms those fluctuations into MIDI data. No, I don’t really know what it means, but I like to repeatedly say it anyway whilst pulling a thoughtful face: “fluctuations in galvanic conductance”. In other words, it works “because science.”

In any case, you can use this MIDI data to control any virtual instrument, so I attached it to Pianoteq, set up a nice felted piano sound, and this is what came out:

If you’re anything like me, you’ll find that there are moments of real beauty in this (I really like 0:19-0:24, for example), but it’s too dissonant for my compositional style, so I applied a series of filters that transposed all notes into the key of Eb major and shifted the plant’s notes into the higher registers of the piano, the ones typically used for melody. I also added a load of reverb and delay. This is the result:
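For curious readers, the kind of filtering I describe can be sketched in a few lines of code. This is a hypothetical reconstruction of the idea (snap each incoming note to the nearest Eb major pitch, then lift it into the melody register), not the actual plugin chain I used:

```python
# Sketch of the plant-note filtering: snap MIDI notes to Eb major,
# then transpose them up into the piano's melody register.

EB_MAJOR = {3, 5, 7, 8, 10, 0, 2}  # pitch classes: Eb F G Ab Bb C D

def snap_to_eb_major(note: int) -> int:
    """Move a MIDI note to the nearest pitch in Eb major (ties resolve downward)."""
    for offset in (0, -1, 1, -2, 2):
        if (note + offset) % 12 in EB_MAJOR:
            return note + offset
    return note  # unreachable: Eb major has no gap wider than 2 semitones

def lift_to_melody_register(note: int, floor: int = 72) -> int:
    """Transpose up by octaves until the note reaches at least C5 (MIDI 72)."""
    while note < floor:
        note += 12
    return note

def filter_plant_notes(notes):
    """Apply both filters to a stream of raw plant-generated MIDI notes."""
    return [lift_to_melody_register(snap_to_eb_major(n)) for n in notes]
```

The threshold of C5 is my own illustrative choice; any cutoff that keeps the plant above the accompaniment range would do the same job.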

Duetting with a plant is unusual in many ways, and amongst the most practical challenges is simply that you can’t tell it when to start and stop. In this recording I simply connected the plant, listened carefully, and joined in when I felt appropriate, improvising freely, trying to react to melodies I heard. After several attempts, here’s the final version:

Duetting with the Magenta AI was a very different process. Magenta works in several ways, but I chose to use the “continue” tool, i.e. I would improvise a melody and ask Magenta to continue it. Unfortunately I don’t have the original files for the piece, but I’ve mocked up some new ones so you can get an idea of how it works. Here’s my melody:

And here are some of Magenta’s responses. I’ve ordered them by plausibility, i.e. the ones towards the top are the least weird—although as you’ll hear, most of them are pretty damn weird. In practice I frequently deleted most responses immediately!

I would then react to the AI’s own melody with a response of my own, and then ask it to react to that response. I repeated this process several times until I had a somewhat coherent melody, then composed an accompaniment to fit with it, then re-recorded my part on my lovely acoustic piano. Here it is:
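The back-and-forth structure of that process can be sketched as a simple loop. The two functions below are placeholders, not Magenta’s real API: `continue_melody` stands in for the AI’s “continue” tool, and `human_response` stands in for my improvised replies.

```python
# Sketch of the call-and-response duet loop described above.
# Both phrase generators are illustrative stand-ins.

def continue_melody(melody):
    # Placeholder for the AI's continuation: echo the last phrase up a step.
    # (The real responses were, as noted, much weirder.)
    return [n + 2 for n in melody[-4:]]

def human_response(melody):
    # Placeholder for the composer's improvised reply to the AI's phrase.
    return [n - 2 for n in melody[-4:]]

def duet(seed, rounds=3):
    """Alternate AI continuations and human responses, accumulating a melody."""
    melody = list(seed)
    for _ in range(rounds):
        melody += continue_melody(melody)
        melody += human_response(melody)
    return melody
```

In practice there was also a curation step in every round, since most machine-generated phrases were deleted before I replied to one.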

The final pair of conversations were with my partner Anna Salzmann, who does almost all of the artwork for my music. She is a real music fan and loves singing along to her favourite songs, but has never studied music formally nor played an instrument. I placed her hands on the keys of the Eb major scale, told her that Eb was the “home note” (i.e. the tonic) and that it would probably be a good idea to end on that note, and otherwise just encouraged her to move her fingers freely, listen to the sounds that came out, and try to imagine them as a song. After she’d got used to the process and was confident of being able to stay in Eb major, she started to play freely. A lot of what she did was fairly random, but every now and then she would accidentally hit on an interesting idea, and then she would extend it, repeat it or try to vary it. We played for about 30 minutes, and there were two takes which stood out to me as particularly melodic. Here they are:

A few months later, some of the tracks still seem rather lovely to me, while others feel a bit flat. I’m curious to know how you, the reader, react to them. The AI track provoked a strong negative response from one fan, which I enjoyed reading and to some extent agreed with (to their surprise!).

I’d be particularly interested to hear from fellow composers. Have you experimented with plant-generated music? What about AI? How well did you feel it worked? Does AI feel like a threat? Leave a comment below; I’d love to hear your thoughts.

You can stream and/or download Conversations on your favourite music service by following the links here: https://songwhip.com/garrethbroke/conversations 

If you enjoyed reading this, you might enjoy reading my thoughts about music more broadly: I write occasional music reviews at the excellent site acloserlisten.com

You can find more of my work at garrethbrooke.com and join my Patreon community at patreon.com/garrethbrooke.


Garreth Brooke has been composing music since his early teens. Initially it served as a source of pleasure, but it gradually became an escape from the crisis that consumed his family during his mother's severe battle with depression, which led to a suicide attempt.

After studying music at the University of Oxford, he began collaborating with German contemporary artist Anna Salzmann on a series of works that combined music and art, using the pen name Garreth Broke.

Since then he’s released music with 1631 Recordings, Bigo & Twigetti, THESIS and Moderna Records. His music has received airplay from BBC Radio 3, KEXP and NTS and has shared the stage with musicians including Hania Rani, Clemens Christian Poetzsch, and Simeon Walker.

He curated the Upright project, a charity fundraiser that brought together sheet music by a wide range of contemporary composers including Michael Price, Nat Bartsch, and Peter Broderick. In his music he explores what it means to be human in a harsh but beautiful world.

"a sensitive and profound artist" - PianoDao

"Lovely... genuinely uplifting" - Stationary Travels

“cuts through the noise of overwrought emotions and cheap platitudes, and speaks directly to the heart, reminding people through the beauty of music and art that others have been there too.” - No Dead Guys

garrethbrooke.com
Patreon / Twitter / Facebook / Instagram / Spotify / YouTube
