AI in Real Time
A celebration of the dynamic world of AI in real time, focusing on music improvisation and the joys of co-creation. By David De Roure.
It’s exciting to see AI used creatively – not only for the computer to generate realistic outputs, but also for it to take part in a co-creative process, with the human responding creatively to the AI.
Many of today’s examples incorporate AI-generated content into the practitioner’s creative workflow. For instance, I’ve been involved in several music composition projects where the AI is trained on a collection of content and then generates suggestions, which the composer selects and assembles. That training can take hours or days, learning from hundreds of thousands of examples – a reminder that the time dimension of AI is sometimes slow.
But the AI can itself be dynamic, responsive and evolving – active in “real time”, and interactive with the human. My favourite example of this is music improvisation, where the artist and AI interact together “in the moment”. This is co-creation in the act of performance. And it matters, because it offers an insight into our future lives, as humans come to interact routinely with the AI deployed pervasively around them, in “smart” everything. We will all be “living in the moment” with the AI, so let’s explore that now, and what it means to be a creative human in the face of automation.
And there are certainly things to explore – not just real-time AI but multiple AIs interacting, and interacting fast. Already we adopt approaches where one AI generates and another discriminates at speed, algorithm versus algorithm, to beneficial effect. Imagine a music performance with multiple AI and human musicians, but also AI audience members and critics – an evolving, dynamic, interactive, co-creative system. What will emerge? What are the feedback loops that guide its progress?
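As an illustrative aside, here is a toy sketch in Python of that kind of generate-and-discriminate feedback loop. It is not a real adversarial network: the “generator” is a hypothetical weighted sampler of notes from a C major scale, the “critic” is a fixed rule that rewards stepwise motion, and the feedback simply reinforces notes that appear in well-scored melodies – all of it invented here, purely to make the shape of the loop concrete.

import random

# Toy sketch of a generate-and-discriminate feedback loop (not a real
# adversarial network): a weighted sampler proposes short melodies, a
# fixed rule-based critic scores them, and well-scored melodies
# reinforce the generator's note preferences.

SCALE = [60, 62, 64, 65, 67, 69, 71, 72]  # C major, as MIDI note numbers

def generate(weights, length=8):
    """Sample a melody, favouring notes with higher weights."""
    return random.choices(SCALE, weights=weights, k=length)

def discriminate(melody):
    """Score a melody: reward stepwise motion, penalise large leaps."""
    leaps = [abs(a - b) for a, b in zip(melody, melody[1:])]
    return sum(1.0 if leap <= 2 else -1.0 for leap in leaps)

weights = [1.0] * len(SCALE)  # the generator starts with no preferences

for _ in range(200):
    melody = generate(weights)
    if discriminate(melody) > 0:
        # Feedback: reinforce the notes used in well-scored melodies.
        for note in melody:
            weights[SCALE.index(note)] += 0.1

print("learned note preferences:", [round(w, 1) for w in weights])
print("sample melody:", generate(weights))

In a real system both sides would learn, and would run fast enough to respond within a musical beat; the sketch is only meant to show the shape of the feedback loop.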
Not only art, but the process of creating art, brings insights into our hybrid human-machine future. It’s about being creative about being co-creative.