21.50 – 22.05 uur | Café
Dancing to music when you can't keep a beat is quite a challenge. But what if the beat could adapt to your movements? This is not as futuristic as it sounds. Or imagine hearing music you don't like and being able to adjust its tempo, mood and intensity just by making gestures with your hands. Thanks to Artificial Intelligence, this is possible! Join the Soundbite by Artificial Intelligence researchers Michelle Appel and Daniel Danilin and learn how Artificial Intelligence can be used to control music through movement.
In the foyer you will find the AI installation of Michelle and Daniel and you can give it a try yourself.
This Soundbite will be in English.
About the speakers
Michelle Appel is a PhD candidate at the Donders Institute for Brain and Cognition. She researches computer vision techniques for visual brain prostheses: brain implants designed to help blind people regain a form of sight. In addition, she researches how to close the gap between simulation and reality using simulation-based image segmentation. In her spare time she produces music under the alias MiesFM.
Daniel Danilin is an Artificial Intelligence student at Radboud University. In 2022, he and Michelle Appel were part of a team competing in the AI Song Contest with their song ‘Upfall’.
This program is part of Radboud Sounds, the festival where science meets music, with lectures, dance, live music and more on Friday 12 May 2023 in Doornroosje, Nijmegen. Come and celebrate Radboud University’s 100th anniversary. Radboud University scientists from various disciplines shed light on music and science. Come and listen, dance, enjoy, and sharpen your mind on the effects of music. Compose your own program and see the world of music from a different viewpoint.