Lisa Zahray · RO-MAN 2020
Robot Gesture Sonification to Enhance Awareness of Robot Status and Enjoyment of Interaction
We present an approach to robot sonification with the goal of improving the quality and safety of human-robot interactions. Sonification (turning data into sound) has been underutilized in robotics, and has broad potential to convey robot movement and intentions to users without requiring visual engagement. We design and evaluate six different sonifications of movement for a robot with four degrees of freedom. Our sonification techniques include a direct mapping from each degree of freedom to pitch and timbre changes, emotion-based sound mappings, and velocity-based mappings using different types of sounds, such as motors and music. We evaluate these sonifications using metrics for ease of use, enjoyment/appeal, and conveyance of movement information. Based on our results, we make recommendations to inform future robot sonification design. We suggest that when using sonification to improve the safety of human-robot collaboration, it is necessary not only to convey sufficient information about movements, but also to convey that information in a pleasing and even social way that enhances the human-robot relationship.
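The "direct mapping" technique mentioned above could be sketched as follows. This is a minimal, hypothetical illustration, not the authors' implementation: the function name, joint ranges, and pitch range are all assumptions chosen for clarity.

```python
# Hypothetical sketch of direct-mapping sonification: each joint angle
# (one degree of freedom) is mapped linearly to a pitch in Hz.
# Joint range and pitch range below are illustrative assumptions.

def angle_to_pitch_hz(angle_deg, angle_min=-90.0, angle_max=90.0,
                      pitch_min_hz=220.0, pitch_max_hz=880.0):
    """Linearly map a joint angle (degrees) to a pitch (Hz)."""
    # Clamp the angle to the joint's range, then interpolate.
    clamped = min(max(angle_deg, angle_min), angle_max)
    t = (clamped - angle_min) / (angle_max - angle_min)
    return pitch_min_hz + t * (pitch_max_hz - pitch_min_hz)

# One pitch per degree of freedom: four joints yield four
# simultaneous pitches that change as the robot moves.
joint_angles = [0.0, 45.0, -90.0, 90.0]
pitches = [angle_to_pitch_hz(a) for a in joint_angles]
```

Timbre changes could be layered on similarly, e.g. by mapping each joint's velocity to a filter or waveform parameter of its voice.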