Kanda T., ATR Intelligent Robotics and Communication Laboratory
AAAI Fall Symposium - Technical Report | Year: 2010

This paper summarizes our previous work on modeling non-verbal behaviors for natural human-robot interaction (HRI) and discusses a path for integrating them into spoken dialogs. While some non-verbal behaviors can be treated as "optional" elements added to a spoken dialog, others fundamentally require a harmonized plan that considers spoken dialog and non-verbal behavior simultaneously. The paper discusses these unique HRI features.
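As an illustration of what such a harmonized plan might look like in practice, the following sketch (not taken from the paper; all class and function names are hypothetical) schedules a pointing gesture together with the utterance segment that mentions its referent inside a single multimodal plan, rather than attaching the gesture to the speech afterwards.

# Illustrative sketch only (not from the paper): a "harmonized" multimodal plan
# in which the spoken utterance and a non-verbal behavior are decided together.
from dataclasses import dataclass
from typing import List


@dataclass
class SpeechSegment:
    text: str
    start: float   # seconds from utterance onset
    end: float


@dataclass
class NonverbalEvent:
    kind: str      # e.g. "point", "gaze_at_object", "nod"
    start: float
    end: float


@dataclass
class MultimodalPlan:
    speech: List[SpeechSegment]
    nonverbal: List[NonverbalEvent]


def plan_deictic_utterance(segments: List[SpeechSegment],
                           referent_index: int) -> MultimodalPlan:
    """Align a pointing gesture with the segment that mentions the referent,
    so both channels are produced by one plan."""
    ref = segments[referent_index]
    # Start the gesture slightly before the referring expression and hold it
    # until the segment ends (an assumed heuristic for deictic coordination).
    gesture = NonverbalEvent(kind="point",
                             start=max(0.0, ref.start - 0.3),
                             end=ref.end)
    return MultimodalPlan(speech=segments, nonverbal=[gesture])


segments = [SpeechSegment("Please take", 0.0, 0.8),
            SpeechSegment("the red box", 0.8, 1.6),
            SpeechSegment("to the desk", 1.6, 2.4)]
plan = plan_deictic_utterance(segments, referent_index=1)
print(plan.nonverbal)  # pointing gesture co-timed with "the red box"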


Ishi C.T., ATR Intelligent Robotics and Communication Laboratory | Ishiguro H., ATR Hiroshi Ishiguro Special Laboratory | Hagita N., ATR Intelligent Robotics and Communication Laboratory
Speech Communication | Year: 2014

Head motion naturally occurs in synchrony with speech and may convey paralinguistic information (such as intentions, attitudes and emotions) in dialogue communication. With the aim of verifying the relationship between head motion events and speech utterances, analyses were conducted on motion-captured data of multiple speakers during spontaneous dialogue conversations. The relationship between head motion events and dialogue acts was analyzed first. Among the head motion types, nods occurred most frequently during speech utterances, not only expressing dialogue acts of agreement or affirmation but also appearing at the end of phrases with strong boundaries (covering both turn-keeping and turn-giving dialogue act functions). Head shakes usually appeared to express negation, while head tilts appeared mostly in interjections expressing denial and in phrases with weak boundaries, where the speaker was thinking or had not finished uttering. The synchronization of head motion events and speech was also analyzed, with a focus on the timing of nods relative to the last syllable of a phrase. Results showed that nods were highly synchronized with the center portion of backchannels, whereas they were more synchronized with the end portion of the last syllable in phrases with strong boundaries. Speaker variability analyses indicated that the inter-personal relationship with the interlocutor is one factor influencing the frequency of head motion events. The frequency of nods was lower for dialogue partners with a close relationship (such as family members), where speakers do not need to express a careful attitude; on the other hand, the frequency of nods (especially multiple nods) clearly increased when the inter-personal relationship between the dialogue partners was distant. © 2013 Elsevier B.V. All rights reserved.
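To make the timing analysis concrete, the sketch below (not from the paper; the interval format, alignment window, and names are assumptions for illustration) shows one way to measure the offset of a nod relative to the end of the last syllable of a phrase.

# Illustrative sketch only: nod timing relative to the phrase-final syllable.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Interval:
    start: float  # seconds
    end: float

    @property
    def center(self) -> float:
        return (self.start + self.end) / 2.0


def nod_offset(nod: Interval, last_syllable: Interval) -> float:
    """Offset (s) of the nod center relative to the syllable end;
    negative means the nod is centered before the syllable ends."""
    return nod.center - last_syllable.end


def nearest_nod(nods: List[Interval], last_syllable: Interval,
                window: float = 0.5) -> Optional[Interval]:
    """Return the nod whose center is closest to the syllable end,
    if any falls within +/- `window` seconds (assumed threshold)."""
    candidates = [n for n in nods
                  if abs(nod_offset(n, last_syllable)) <= window]
    if not candidates:
        return None
    return min(candidates, key=lambda n: abs(nod_offset(n, last_syllable)))


# Example: a nod centered 0.05 s before the end of the phrase-final syllable.
nods = [Interval(1.2, 1.5), Interval(3.0, 3.3)]
last_syllable = Interval(3.0, 3.2)
nod = nearest_nod(nods, last_syllable)
if nod is not None:
    print(f"nod offset: {nod_offset(nod, last_syllable):+.2f} s")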


Sabelli A.M., University of Hawaii at Manoa | Kanda T., ATR Intelligent Robotics and Communication Laboratory | Hagita N., ATR Intelligent Robotics and Communication Laboratory
HRI 2011 - Proceedings of the 6th ACM/IEEE International Conference on Human-Robot Interaction | Year: 2011

This paper reports an ethnographic study on the use of a conversational robot. We placed a robot in an elderly care center for 3.5 months. To approximate a real deployment scenario, the robot was managed throughout the field trial by a single non-programmer, who teleoperated it and updated its contents. The robot was designed to engage in daily greetings and chatting with elderly people. Through the ethnographic approach, we clarified how the elderly people interacted with this conversational robot, how the deployment process used to introduce the robot was designed, and how the organization's personnel involved themselves in the deployment. Copyright 2011 ACM.


Shiomi M., ATR Intelligent Robotics and Communication Laboratory | Nakagawa K., ATR Intelligent Robotics and Communication Laboratory | Hagita N., ATR Intelligent Robotics and Communication Laboratory
Interaction Studies | Year: 2013

A change in gaze behavior at the moment of a small mistake is a natural response that acknowledges the mistake and suggests an apology to the people with whom we are working or interacting. In this paper we investigate how a robot's gaze behavior at small mistake moments changes the impressions of others. To prepare gaze behaviors for a robot, we first identified through questionnaires how human gaze behaviors change in such situations and extracted three kinds: looking at the other, looking down, and looking away. We implemented each gaze behavior, added a no-gaze behavior, and investigated how a robot's gaze behavior at a small mistake moment changes the impressions of the interacting people in a simple cooperative task. Experimental results show that the 'looking at the other' gaze behavior outperforms the other gaze behaviors in perceived apologeticness and friendliness. © 2013 John Benjamins Publishing Company.
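As a purely illustrative sketch of the four conditions compared in the study (not the authors' implementation; the robot interface and coordinates are hypothetical placeholders), the code below dispatches one of the gaze behaviors when a small mistake is detected.

# Illustrative sketch only: selecting a gaze behavior at a small mistake moment.
from enum import Enum, auto


class MistakeGaze(Enum):
    LOOK_AT_PARTNER = auto()
    LOOK_DOWN = auto()
    LOOK_AWAY = auto()
    NO_GAZE_CHANGE = auto()


class Robot:
    """Minimal stand-in so the example runs; a real robot would move its head."""
    def gaze_at(self, target):
        print(f"gazing at {target}")


def on_small_mistake(robot: Robot, condition: MistakeGaze,
                     partner_position=(1.0, 0.0, 1.5)) -> None:
    """Trigger the gaze behavior assigned to the current experimental condition."""
    if condition is MistakeGaze.LOOK_AT_PARTNER:
        robot.gaze_at(partner_position)     # meet the partner's eyes
    elif condition is MistakeGaze.LOOK_DOWN:
        robot.gaze_at((0.5, 0.0, 0.0))      # look toward the table or floor
    elif condition is MistakeGaze.LOOK_AWAY:
        robot.gaze_at((0.0, 1.5, 1.2))      # avert gaze to the side
    else:
        pass                                # keep the current gaze target


robot = Robot()
on_small_mistake(robot, MistakeGaze.LOOK_AT_PARTNER)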


Kanda T., ATR Intelligent Robotics and Communication Laboratory | Shimada M., ATR Intelligent Robotics and Communication Laboratory | Koizumi S., ATR Intelligent Robotics and Communication Laboratory
HRI'12 - Proceedings of the 7th Annual ACM/IEEE International Conference on Human-Robot Interaction | Year: 2012

We used a social robot as a teaching assistant in a class for children's collaborative learning. In the class, a group of 6th graders learned together using Lego Mindstorms. The class consisted of seven lessons with Robovie, a social robot, followed by one lesson to test the children's individual achievement. Robovie managed the class and explained how to use Lego Mindstorms. In addition to these basic class-management behaviors, we prepared social behaviors for building relationships with the children and encouraging them. The results show that the social behaviors encouraged the children to work harder in the first two lessons but did not affect them in later lessons. On the other hand, the social behaviors contributed to building relationships and attaining better social acceptance. © 2012 ACM.
