From lone genius to cocreator: how AI is changing the role of composers
Who is the real creator when a musician uses AI? This was the burning question for Adam Lukawski, himself a composer. During a fascinating premiere at Amare, The Hague’s cultural hub, he demonstrated what this cocreation sounds like.
Intriguing sounds fill the concert hall in Amare. The audience watches as a piano, as if enchanted, plays an experimental composition with no pianist in sight. Welcome to the world of Adam Lukawski. This Polish researcher is not only a composer but also a computer programmer.
He researched how new technologies are changing his work, and on 20 November, he premiered his latest compositions. To find out how they sound, watch the video and read more about his extraordinary PhD research at the Academy of Creative and Performing Arts.
New roles
‘My research shows that composers assume new roles when working with AI,’ says Lukawski. ‘They become managers of smart tools, curators of AI-generated material, builders of new systems and designers of a creative ecosystem in which humans and AI work together.’
At Amare, he demonstrated each of those roles. He played excerpts he designed together with his AI systems. In the first half of the concert, pre-programmed instruments performed the pieces, whereas in the second half, human musicians – Ensemble Modelo62 – took over.
Art is not a technical problem you can easily solve
Experimental music
These new roles do not mean that composing has suddenly become a matter of typing in a few simple prompts, Lukawski was keen to explain. ‘AI is good at generating popular music like rock songs. But experimental music is a different story: there’s not enough material to train an AI system.’
Art, he argues, is not a technical problem you can easily solve. ‘On the contrary – making art means asking new questions. An experimental artist invents new ways of creating each time. You can’t automate that.’
Music from plain language
For his new work, Lukawski built intelligent computer systems. For the piece Allagma, his AI program generated the musical building blocks that he used to compose the music. He also designed the AI system Chain of Thoughts.
In this program, he didn’t write music as notes but as sentences. Bar by bar, he described in plain language what should happen, and the program turned these instructions into music. ‘The funny thing is,’ says Lukawski, ‘that the AI model interprets the same instruction slightly differently each time, so the piece never sounds exactly the same.’
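The article doesn't reveal how Chain of Thoughts works internally, but the idea it describes can be illustrated with a deliberately simple toy sketch (not Lukawski's actual system): each bar of the score is a plain-language sentence, a rule-based interpreter turns the sentence into note events, and a random element stands in for the way a generative model reads the same instruction slightly differently on every run. All names and rules below are invented for illustration.

```python
import random

# Hypothetical illustration of "composing in sentences": a few keywords map to
# musical parameters, and randomness makes each interpretation unique, echoing
# Lukawski's remark that "the piece never sounds exactly the same".
REGISTERS = {"low": range(36, 48), "middle": range(60, 72), "high": range(84, 96)}
DENSITY = {"sparse": 2, "flowing": 4, "dense": 8}  # notes per bar

def interpret_bar(instruction, rng):
    """Turn one plain-language instruction into (MIDI pitch, loudness) events."""
    words = instruction.lower().split()
    register = next((REGISTERS[w] for w in words if w in REGISTERS), REGISTERS["middle"])
    count = next((DENSITY[w] for w in words if w in DENSITY), 4)
    loudness = "pp" if "soft" in words else "ff" if "loud" in words else "mf"
    # Random pitch choice within the register: same sentence, different result each run.
    return [(rng.choice(list(register)), loudness) for _ in range(count)]

def compose(score_text, seed=None):
    """Interpret a whole score, one plain-language instruction per line (bar)."""
    rng = random.Random(seed)
    return [interpret_bar(line, rng) for line in score_text.strip().splitlines()]

piece = compose("""
sparse soft low notes
dense loud high cluster
flowing middle line
""")
```

Running `compose` twice on the same text yields structurally identical but note-for-note different bars, which is the property the article highlights; a real system would of course use a trained model rather than keyword rules.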
Agents composing together
The audience at Amare, including his own PhD committee, watched in fascination and had plenty of questions afterwards. Everyone wanted to know how he feels about working with AI. Lukawski admitted that AI fundamentally changes a composer’s work. ‘The composer is no longer the sole source – the archetype of the musical genius – but a participant contributing material to a system in which multiple agents compose together.’
But, he was keen to add, that doesn’t mean he lost control. ‘I choose the samples that sound beautiful to me, and the resulting pieces end up sounding surprisingly similar to my own work. AI amplifies me.’