The Face Says it All: Gaze and Emotional Behaviors of Social Robots

Date: Wednesday 17 April 2024, 4:30 pm
PhD student: C. Mishra, MSc
Promotor(s): prof. dr. P. Hagoort, prof. dr. G. Skantze
Co-promotor(s): dr. S. Fuchs, dr. R. Verdonschot
Location: Aula

With rapid advances in artificial intelligence and robotics, social robots are poised to become more integrated into society. Because these robots are designed specifically to conduct human-like interactions, understanding and replicating key non-verbal cues, such as facial expressions and gaze, is essential for enhancing their effectiveness, human-likeness, and acceptance. Social robots are already employed in a variety of domains, including healthcare, education, and assistive roles, where their capacity to convey and interpret human emotions and intentions can significantly affect the quality of interactions. Modeling non-verbal behaviors on these robots would enable them to provide a richer user experience. For example, a social robot designed to provide companionship to the elderly could express happiness when the user is cheerful and sadness when the user is upset, strengthening the emotional connection, or look directly at the user when speaking, creating a more personalized and attentive interaction.

This research investigates methods for making human-robot interaction (HRI) more seamless and human-like by modeling non-verbal behaviors on social robots. It is centered on two key areas:
- Developing architectures to model the eye gaze and emotional behaviors of social robots.
- Evaluating how humans perceive these behaviors and how they influence HRI.


I am a Postdoctoral Researcher in the Multimodal Language Department at the Max Planck Institute for Psycholinguistics, working with Prof. Asli Özyürek and Dr. Judith Holler. I recently completed my PhD research at Furhat Robotics as a Marie Curie PhD Fellow (COBRA MSCA ITN) under the supervision of Prof. Gabriel Skantze and Prof. Peter Hagoort, where I investigated ways to automate the gaze and affective behaviors of social robots and to gauge the influence of these behaviors on human-robot interaction (HRI).
My research takes an interdisciplinary approach, combining insights from psychology, psycholinguistics, artificial intelligence, and cognitive science to find novel ways to model and automate the robot behaviors needed for seamless HRI. The broader goal I pursue through my research is to make robots easier to integrate into our society, enabling us to be more efficient and comfortable.