Modeling affected user behavior during human-machine interaction


Bogdan Vlasenko, Ronald Böck and Andreas Wendemuth, Cognitive Systems, IESK, Otto-von-Guericke University, Magdeburg, Germany

Spoken human-machine interaction supported by state-of-the-art dialog systems is becoming a standard technology, and considerable effort has been invested in this kind of artificial communication interface. Nevertheless, spoken dialog systems (SDS) are still unable to offer users a natural way of communicating: most existing automated dialog systems follow a questionnaire-based strategy with sentence-by-sentence confirmation requests. This paper addresses the design and implementation of user behavior models in dialog systems for frustration detection and user intention recognition, aimed at making human-machine interaction more natural. We give an overview of our acoustic emotion classification, robust automatic speech recognition (ASR) for affected speech, and dialog management correlated with the user's emotional state. A multimodal human-machine interaction system with an integrated user behavior model has been created within the project "Neurobiologically Inspired, Multimodal Intention Recognition for Technical Communication Systems" (NIMITEK). The current NIMITEK demonstration system provides a technical demonstrator for studying user behavior modeling principles in a dedicated task, namely solving the puzzle "Towers of Hanoi". While communicating with the demonstrator, users are free to use natural language; natural language understanding and intention recognition modules provide the task control management. To show how dialog management correlated with the user's behavior improves the performance of the spoken dialog system, we present the results of a usability test of the NIMITEK demonstrator. The analysis of these results shows that our system provides more cooperative human-machine interaction and decreases the interaction time required to complete the puzzle.
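
For readers unfamiliar with the demonstrator's task, the following minimal Python sketch shows the classic recursive solution of the "Towers of Hanoi" puzzle. It is purely illustrative of the task the users are asked to solve; the function and parameter names are our own and do not reflect the NIMITEK system's internal task model.

def hanoi(n, source, target, spare, moves):
    """Append the move sequence for n disks from source to target."""
    if n == 0:
        return
    hanoi(n - 1, source, spare, target, moves)   # clear the way
    moves.append((source, target))               # move the largest disk
    hanoi(n - 1, spare, target, source, moves)   # restack on top of it

moves = []
hanoi(3, "A", "C", "B", moves)
print(len(moves))  # 2**3 - 1 = 7 moves for three disks
for src, dst in moves:
    print(f"{src} -> {dst}")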