Available under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International Licence
Statistics; Computer Sciences; Robotics and Automatic Control; Communication Engineering and Systems
Our research seeks to develop long-lasting, high-quality engagement between users and social robots, which in turn requires a more sophisticated alignment between user and system than is currently commonly available. Successful joint activity requires close monitoring of interlocutors' states, and we argue their confusion state in particular, along with the adjustment of dialogue policies based on that state. In this paper, we present an initial study of human-robot conversation scenarios using a Pepper robot to investigate users' confusion states. We describe a Wizard-of-Oz (WoZ) HRI experiment in detail, including the stimulus strategies used to trigger confusion in interlocutors. From the collected data, we estimated emotions, head pose, and eye gaze, and analysed these features against the silence durations in the speech data and the confusion states that participants self-reported after the study. Our analysis found a significant relationship between confusion states and most of these features. We see these results as being particularly significant for multimodal situated dialogue in human-robot interaction and beyond.
Na Li and Robert Ross. 2023. Hmm, You Seem Confused! Tracking Interlocutor Confusion for Situated Task-Oriented HRI. In Proceedings of the 2023 ACM/IEEE International Conference on Human-Robot Interaction (HRI ’23), March 13–16, 2023, Stockholm, Sweden. ACM, New York, NY, USA, 11 pages. DOI: 10.1145/3568162.3576999
Science Foundation Ireland
18th Annual ACM/IEEE International Conference on Human Robot Interaction (HRI), 2023