We are currently working on a museum guide robot with an emphasis on "friendly" human-robot interaction expressed through nonverbal behaviors. In this paper, we focus on head gestures during explanations of exhibits. The outline of our research is as follows. First, we examined human head gestures through an experimental, sociological approach. From this research, we discovered how human guides coordinate their head movements with their talk when explaining exhibits. Second, we developed a robot system based on these findings. Third, we evaluated the resulting human-robot interaction, again using an experimental, sociological approach, and then modified the robot based on the results. Our experimental results suggest that robot head turning may lead to heightened engagement of museum visitors with the robot. Based on these preliminary findings, we describe a museum guide robot that works autonomously by default and, when necessary, can switch to a remote-control mode operated by a human operator.