Shiomi M.,ATR Intelligent Robotics and Communication Laboratory | Nakagawa K.,ATR Intelligent Robotics and Communication Laboratory | Hagita N.,ATR Intelligent Robotics and Communication Laboratory
Interaction Studies | Year: 2013

A change of gaze behavior at the moment of a small mistake is a natural response that reveals our own mistakes and conveys an apology to those with whom we are working or interacting. In this paper we investigate how a robot's gaze behaviors at small mistake moments change the impressions of others. To prepare gaze behaviors for a robot, we first identified through questionnaires how human gaze behaviors change in such situations and extracted three kinds: looking at the other, looking down, and looking away. We implemented each gaze behavior, added a no-gaze behavior, and investigated how a robot's gaze behavior at a small mistake moment changes the impressions of interacting people in a simple cooperative task. Experimental results show that the 'looking at the other' gaze behavior outperforms the other gaze behaviors in the perceived degrees of apologeticness and friendliness. © 2013 John Benjamins Publishing Company.


Kanda T.,ATR Intelligent Robotics and Communication Laboratory
AAAI Fall Symposium - Technical Report | Year: 2010

This paper summarizes our previous work on modeling non-verbal behaviors for natural human-robot interaction (HRI) and discusses a path for integrating them into spoken dialogs. While some non-verbal behaviors can be considered "optional" elements added to a spoken dialog, others fundamentally require a harmonized plan that simultaneously considers both the spoken dialog and the non-verbal behavior. The paper discusses such unique HRI features.


Matsumoto T.,ATR Intelligent Robotics and Communication Laboratory | Satake S.,ATR Intelligent Robotics and Communication Laboratory | Kanda T.,ATR Intelligent Robotics and Communication Laboratory | Imai M.,ATR Intelligent Robotics and Communication Laboratory | Hagita N.,ATR Intelligent Robotics and Communication Laboratory
HRI'12 - Proceedings of the 7th Annual ACM/IEEE International Conference on Human-Robot Interaction | Year: 2012

We aim to develop a shopping companion robot that can share experiences with users. In this study, we focused on the shared memory acquired when a robot walks together with a user. We developed a computational model of memory recall of locations visited in a shopping mall, built from data collected from 30 participants. We found that shop size, color intensity of the facade, relative visibility, and elapsed time are the features that influence recall. The model was used in a shopping companion scenario: the robot, Robovie, autonomously follows a user while inferring the user's memory recall of shops along the visited route. When the user asks for the location of another shop, Robovie replies with a destination description that refers to known locations inferred with the model of the user's memory recall. With this scenario, we verified the effectiveness of the developed computational model of memory recall. The evaluation experiment revealed that the model outputs shops that participants are likely to recall, which makes the given directions easier to understand. © 2012 ACM.
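
A minimal sketch of how such a recall model might be scored, assuming a logistic-regression form over the four reported features (shop size, facade color intensity, relative visibility, elapsed time). The weights, threshold, and helper names below are illustrative assumptions, not the model fitted in the paper.

import math
from dataclasses import dataclass

@dataclass
class ShopObservation:
    size: float             # normalized shop size (0..1)
    color_intensity: float  # normalized facade color intensity (0..1)
    visibility: float       # relative visibility from the walked route (0..1)
    elapsed_min: float      # minutes since the shop was passed

def recall_probability(obs: ShopObservation,
                       w=(1.2, 0.8, 1.0, -0.05), bias=-1.0) -> float:
    """Illustrative logistic model: larger, more salient, more visible shops
    are more likely to be recalled, and recall decays with elapsed time.
    The weights are placeholders, not the published values."""
    z = (bias
         + w[0] * obs.size
         + w[1] * obs.color_intensity
         + w[2] * obs.visibility
         + w[3] * obs.elapsed_min)
    return 1.0 / (1.0 + math.exp(-z))

shops = {
    "bookstore": ShopObservation(size=0.9, color_intensity=0.4, visibility=0.8, elapsed_min=5),
    "kiosk": ShopObservation(size=0.1, color_intensity=0.2, visibility=0.3, elapsed_min=25),
}
# Refer only to shops the user is likely to recall as landmarks in directions.
landmarks = [name for name, obs in shops.items() if recall_probability(obs) > 0.5]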


Kanda T.,ATR Intelligent Robotics and Communication Laboratory | Shimada M.,ATR Intelligent Robotics and Communication Laboratory | Koizumi S.,ATR Intelligent Robotics and Communication Laboratory
HRI'12 - Proceedings of the 7th Annual ACM/IEEE International Conference on Human-Robot Interaction | Year: 2012

We used a social robot as a teaching assistant in a class for children's collaborative learning. In the class, a group of 6th graders learned together using Lego Mindstorms. The class consisted of seven lessons with Robovie, a social robot, followed by one lesson to test the children's individual achievement. Robovie managed the class and explained how to use Lego Mindstorms. In addition to such basic class-management behaviors, we prepared social behaviors for building relationships with the children and encouraging them. The results show that the social behaviors encouraged children to work more in the first two lessons but did not affect them in later lessons. On the other hand, the social behaviors contributed to building relationships and attaining better social acceptance. © 2012 ACM.


Sabelli A.M.,University of Hawaii at Manoa | Kanda T.,ATR Intelligent Robotics and Communication Laboratory | Hagita N.,ATR Intelligent Robotics and Communication Laboratory
HRI 2011 - Proceedings of the 6th ACM/IEEE International Conference on Human-Robot Interaction | Year: 2011

This paper reports an ethnographic study on the use of a conversational robot. We placed a robot in an elderly care center for 3.5 months. To approximate a real deployment scenario, the robot was managed during the field trial by a single non-programmer, who teleoperated it and updated its contents. The robot was designed to engage in daily greetings and chat with elderly people. Through the ethnographic approach, we clarified how the elderly people interacted with this conversational robot, how the process adopted to introduce the robot was designed, and how the organization's personnel involved themselves in the deployment. Copyright 2011 ACM.


Hato Y.,ATR Intelligent Robotics and Communication Laboratory | Satake S.,ATR Intelligent Robotics and Communication Laboratory | Kanda T.,ATR Intelligent Robotics and Communication Laboratory | Imai M.,ATR Intelligent Robotics and Communication Laboratory | Hagita N.,ATR Intelligent Robotics and Communication Laboratory
5th ACM/IEEE International Conference on Human-Robot Interaction, HRI 2010 | Year: 2010

In daily conversation, we sometimes observe deictic interactions that refer to a region in space, such as saying "please put it over there" while pointing. How can such an interaction be made possible with a robot? Is it enough to simulate people's behaviors, such as utterances and pointing? Instead, we highlight the importance of simulating human cognition. In the first part of our study, we empirically demonstrate the importance of simulating human cognition of regions when a robot engages in a deictic interaction by referring to a region in space. The experiments indicate that a robot with simulated cognition of regions improves the efficiency of its deictic interactions. In the second part, we present a method for a robot to computationally simulate cognition of regions. © 2010 IEEE.
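
As an illustration of what resolving a deictic reference to a region might involve, the sketch below picks, among pre-segmented regions of a room, the one whose center best matches a pointing direction. The region representation, the scoring rule, and all names are hypothetical and are not the method proposed in the paper.

import math

# Hypothetical pre-segmented regions of a room: label -> center (x, y) in meters.
regions = {
    "near the door": (0.5, 4.0),
    "by the table": (3.0, 1.5),
    "corner shelf": (5.0, 4.5),
}

def resolve_over_there(pointer_pos, pointing_angle_rad, regions):
    """Pick the region whose center lies closest to the pointing ray.
    pointer_pos: (x, y) of the person; pointing_angle_rad: pointing direction."""
    best_label, best_deviation = None, float("inf")
    for label, (cx, cy) in regions.items():
        bearing = math.atan2(cy - pointer_pos[1], cx - pointer_pos[0])
        # Angular deviation between the pointing direction and the region center.
        deviation = abs(math.atan2(math.sin(bearing - pointing_angle_rad),
                                   math.cos(bearing - pointing_angle_rad)))
        if deviation < best_deviation:
            best_label, best_deviation = label, deviation
    return best_label

target = resolve_over_there((2.0, 2.0), math.radians(45), regions)  # -> "corner shelf"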


Shiomi M.,ATR Intelligent Robotics and Communication Laboratory | Kanda T.,ATR Intelligent Robotics and Communication Laboratory | Ishiguro H.,ATR Intelligent Robotics and Communication Laboratory | Hagita N.,ATR Intelligent Robotics and Communication Laboratory
5th ACM/IEEE International Conference on Human-Robot Interaction, HRI 2010 | Year: 2010

Tour guidance is a common task for social robots. Such a robot must be able to encourage the participation of people who are not directly interacting with it. We are particularly interested in encouraging people to overhear its interaction with others, since it has often been observed that even people who hesitate to interact with a robot are willing to observe its activity. To encourage such participation as bystanders, we developed a robot that walks backwards, based on observations of human tour guides. The developed system uses robust human tracking, which enables the robot to guide people while walking forward or backward and allows us to scrutinize people's behavior after the experiment. We conducted a field experiment to compare the ratios of overhearing in the "walking forward" and "walking backward" conditions. The experimental results revealed that people do in fact overhear the robot's conversation more often in the "walking backward" condition. © 2010 IEEE.


Ishi C.T.,ATR Intelligent Robotics and Communication Laboratory | Ishiguro H.,ATR Hiroshi Ishiguro Special Laboratory | Hagita N.,ATR Intelligent Robotics and Communication Laboratory
Speech Communication | Year: 2014

Head motion naturally occurs in synchrony with speech and may convey paralinguistic information (such as intentions, attitudes, and emotions) in dialogue communication. With the aim of verifying the relationship between head motion events and speech utterances, we analyzed motion-captured data of multiple speakers during spontaneous dialogue conversations. The relationship between head motion events and dialogue acts was analyzed first. Among the head motion types, nods occurred most frequently during speech utterances, not only to express dialogue acts of agreement or affirmation, but also at the end of phrases with strong boundaries (including both turn-keeping and turn-giving dialogue act functions). Head shakes usually appeared to express negation, while head tilts appeared mostly in interjections expressing denial and in phrases with weak boundaries, where the speaker is thinking or has not finished the utterance. The synchronization of head motion events and speech was also analyzed, with a focus on the timing of nods relative to the last syllable of a phrase. The results showed that nods were highly synchronized with the center portion of backchannels, whereas they were more synchronized with the end portion of the last syllable in phrases with strong boundaries. Speaker variability analyses indicated that the inter-personal relationship with the interlocutor is one factor influencing the frequency of head motion events. The frequency of nods was lower for dialogue partners with a close relationship (such as family members), with whom speakers do not have to express careful attitudes. On the other hand, the frequency of nods (especially of multiple nods) clearly increased when the inter-personal relationship between the dialogue partners was distant. © 2013 Elsevier B.V. All rights reserved.
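
A rough sketch of the kind of timing measurement described above, assuming nod events and phrase-final syllables are available as (start, end) intervals in seconds. The alignment metric here (offset of the nod midpoint from the syllable center and end) is an illustration, not the paper's exact procedure.

def nod_alignment(nod, syllable):
    """Offsets (s) of a nod's temporal midpoint from the center and the end
    of a phrase-final syllable. Negative values mean the nod leads.
    Both arguments are (start, end) tuples in seconds."""
    nod_mid = (nod[0] + nod[1]) / 2.0
    syl_center = (syllable[0] + syllable[1]) / 2.0
    return nod_mid - syl_center, nod_mid - syllable[1]

# Example: a nod spanning 2.10-2.45 s against a last syllable at 2.00-2.30 s.
offset_to_center, offset_to_end = nod_alignment((2.10, 2.45), (2.00, 2.30))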


Iwamura Y.,ATR Intelligent Robotics and Communication Laboratory | Shiomi M.,ATR Intelligent Robotics and Communication Laboratory | Kanda T.,ATR Intelligent Robotics and Communication Laboratory | Ishiguro H.,ATR Intelligent Robotics and Communication Laboratory | Hagita N.,ATR Intelligent Robotics and Communication Laboratory
HRI 2011 - Proceedings of the 6th ACM/IEEE International Conference on Human-Robot Interaction | Year: 2011

Assistive robots can be perceived in two main ways: as tools or as partners. In past research, assistive robots that offer physical assistance to the elderly have often been designed in the context of a tool metaphor. This paper investigates the effect of two design considerations for assistive robots under a partner metaphor: conversation and robot type. The former concerns whether robots should converse with people even when the conversation is not germane to completing the task. The latter concerns whether people prefer a communication-oriented or function-oriented design for assistive robots. To test these design considerations, we selected a shopping-assistance situation in which a robot carries a shopping basket for elderly people, one typical scenario for assistive robots. A field experiment was conducted in a real supermarket in Japan, where 24 elderly participants shopped with robots. The experimental results revealed that they preferred a conversational humanoid as a shopping-assistant partner. Copyright 2011 ACM.


Kitade T.,ATR Intelligent Robotics and Communication Laboratory | Kitade T.,Keio University | Satake S.,ATR Intelligent Robotics and Communication Laboratory | Kanda T.,ATR Intelligent Robotics and Communication Laboratory | And 2 more authors.
ACM/IEEE International Conference on Human-Robot Interaction | Year: 2013

This study addresses a robot that waits for users while they shop. In order to wait, the robot needs to understand which locations are appropriate for waiting. We investigated how people choose waiting locations and found that they are concerned with 'disturbing pedestrians' and 'disturbing shop activities'. Using these criteria, we developed a classifier of waiting locations. 'Disturbing pedestrians' is estimated from statistics of pedestrian trajectories, which are observed with a human-tracking system based on laser range finders. 'Disturbing shop activities' is estimated based on shop visibility. We evaluated this autonomous waiting behavior in a shopping-assist scenario. The experimental results revealed that users found that the autonomous robot chose more appropriate waiting locations than a robot that chose randomly or one controlled manually by the users themselves. © 2013 IEEE.
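
As a sketch of how the two reported criteria might be combined into a simple rule-based classifier, the snippet below scores candidate waiting cells by local pedestrian traffic (a stand-in for 'disturbing pedestrians', derived from trajectory statistics) and by shop visibility (a stand-in for 'disturbing shop activities'). The field names and thresholds are assumptions, not the classifier trained in the paper.

from dataclasses import dataclass

@dataclass
class Candidate:
    x: float
    y: float
    pedestrian_density: float  # observed trajectories per m^2 per minute at this cell
    shop_visibility: float     # fraction of nearby shop frontage visible from here (0..1)

def is_good_waiting_spot(c: Candidate,
                         max_density: float = 0.2,
                         max_visibility: float = 0.3) -> bool:
    """Accept a cell only if it neither blocks pedestrian flow nor stands
    conspicuously in front of shops. Thresholds are illustrative."""
    return (c.pedestrian_density <= max_density
            and c.shop_visibility <= max_visibility)

candidates = [Candidate(1.0, 2.0, 0.05, 0.1), Candidate(4.0, 0.5, 0.6, 0.2)]
waiting_spots = [c for c in candidates if is_good_waiting_spot(c)]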
