A Robot Model That Obeys a Norm of a Human Group by Participating in the Group and Interacting with Its Members

Yotaro FUSE  Hiroshi TAKENOUCHI  Masataka TOKUMARU  

IEICE TRANSACTIONS on Information and Systems   Vol.E102-D   No.1   pp.185-194
Publication Date: 2019/01/01
Online ISSN: 1745-1361
DOI: 10.1587/transinf.2018EDP7077
Type of Manuscript: PAPER
Category: Kansei Information Processing, Affective Information Processing
Keywords: social robot, group norm, reinforcement learning, human-robot interaction


Herein, we propose a robot model that obeys the norm of a group by interacting with the group's members. Using this model, a robot system learns the group's norm as a group member itself. People with individual differences form a group and, with it, a characteristic norm that reflects the members' personalities. When robots join a group that includes humans, they must obey this characteristic norm: the group norm. We investigated whether the robot system can generate a decision-making criterion that obeys group norms by learning from interactions through reinforcement learning. In our experiment, the human group members and the robot system answered the same simple quizzes, each of which admitted several vague answers. When the group members initially answered differently from one another, we investigated whether they subsequently answered the quizzes while considering the group norm. To avoid bias toward the system's answers, one participant in each group simply relayed the system's decisions, while the other participants were unaware of the system. Our experiments revealed that a group comprising the participants and the robot system forms group norms. The proposed model enables a social robot to make decisions socially, adjusting its behavior to the common sense not only of a large human society but also of smaller human groups, e.g., local communities. We therefore presume that such robots can join human groups by interacting with their members and can adapt to those groups by adjusting their own behavior. However, further studies are required to determine whether the robots' answers affect people, and whether participants can form a group norm based on a robot's answers even when they recognize that a real robot is part of the group. Moreover, in our setup the other participants did not know that one participant merely relayed the system's decisions, pretending to answer the questions in order to prevent biased answers.
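The kind of learning loop described above can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes a simple bandit-style learner (hypothetical `NormLearner` class, with made-up learning-rate and exploration parameters) that picks one of several plausible quiz answers and is rewarded according to how many group members gave the same answer, so that the agent's choices drift toward the group's emerging norm:

```python
import random
from collections import defaultdict

class NormLearner:
    """Hypothetical agent: learns which quiz answer matches the group norm."""

    def __init__(self, n_answers, alpha=0.1, epsilon=0.2):
        self.q = defaultdict(float)   # estimated value of each candidate answer
        self.n_answers = n_answers
        self.alpha = alpha            # learning rate (assumed value)
        self.epsilon = epsilon        # exploration rate (assumed value)

    def choose(self):
        # Epsilon-greedy: occasionally explore a random answer.
        if random.random() < self.epsilon:
            return random.randrange(self.n_answers)
        return max(range(self.n_answers), key=lambda a: self.q[a])

    def update(self, answer, group_answers):
        # Reward = fraction of group members who gave the same answer.
        reward = group_answers.count(answer) / len(group_answers)
        self.q[answer] += self.alpha * (reward - self.q[answer])

# Simulated group whose members mostly prefer answer 2.
random.seed(0)
agent = NormLearner(n_answers=3)
for _ in range(500):
    a = agent.choose()
    group = [2, 2, random.choice([0, 1, 2])]
    agent.update(a, group)

best = max(range(3), key=lambda x: agent.q[x])  # the answer the agent converged on
```

In this toy setup the agent's preferred answer converges to the group majority; the paper's actual model and reward design would differ.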