AI and music: music that adapts to you!
As part of the celebration of its 100th anniversary, Radboud University organises Radboud Sounds, a festival about the science of music featuring lectures, dance, live music and more. AI researchers Michelle Appel and Daniel Danilin are involved in a project, part of Radboud Sounds, in which Artificial Intelligence is used to make music interactive.
Dancing to music when you can't keep a beat is quite a challenge. But what if the beat could adapt to your movements? Or imagine hearing music you aren't into, and being able to change the tempo, mood, and intensity of the piece just by making gestures with your hands. It's all possible with the use of Artificial Intelligence!
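To give a feel for the idea, here is a minimal sketch of how a gesture could steer a musical parameter such as tempo. It assumes hand position is already being tracked (for instance by a computer-vision model); the function name, value ranges, and linear mapping are purely illustrative and are not the installation's actual method.

```python
def gesture_to_tempo(hand_height, min_bpm=70.0, max_bpm=140.0):
    """Map a normalised hand height (0.0 = lowest, 1.0 = highest)
    to a playback tempo in beats per minute.

    Illustrative only: a real interactive-music system would smooth
    the input over time and likely use a learned, non-linear mapping.
    """
    h = max(0.0, min(1.0, hand_height))  # clamp noisy input to [0, 1]
    return min_bpm + h * (max_bpm - min_bpm)


# Raising your hand halfway would set a mid-range tempo:
print(gesture_to_tempo(0.5))  # 105.0
```

In practice, the AI part lies in recognising the gestures and moods from raw sensor or camera data; the mapping above only shows the final, simplest step of turning a recognised gesture into a musical change.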
Michelle Appel is a PhD candidate at the Donders Institute for Brain and Cognition. She researches computer vision techniques for visual brain prostheses: these are brain chips that enable blind people to regain their sight. In addition, she is involved in research into closing the gap between simulation and reality by using simulation-based image segmentation.
Daniel Danilin is a student of Artificial Intelligence at Radboud University. In 2022, he and Michelle were part of a team that entered the AI Song Contest with their song 'Upfall'.
Their AI installation can be found at the Radboud Sounds festival on May 12th at Doornroosje. To learn more about the event and to get your tickets, click here!