
ICRA 2003, IEEE

Generating whole body motions for a biped humanoid robot from captured human dances

The goal of this study is a system that enables a robot to imitate human dances. This paper describes the process of generating whole body motions that can be performed by an actual biped humanoid robot. Human dance motions are acquired through a motion capture system. We then extract a symbolic representation made up of primitive motions: essential postures for the arm motions and step primitives for the leg motions. A joint angle sequence for the robot is generated from these primitive motions, and the joint angles are then modified to satisfy the mechanical constraints of the robot. For balance control, the waist trajectory is adjusted to achieve dynamic consistency based on the desired ZMP. The generated motion is tested on the OpenHRP dynamics simulator. In our test, the Japanese folk dance 'Jongara-bushi' was successfully performed by HRP-1S.
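
The two robot-specific steps in the abstract, clamping joint angles to the robot's mechanical limits and shifting the waist so a desired ZMP is tracked, can be illustrated with a minimal Python sketch. The joint limits, control period, and the cart-table ZMP approximation below are placeholder assumptions for illustration only, not the actual HRP-1S parameters or the authors' implementation.

import numpy as np

# Hypothetical joint limits (rad) and velocity limit -- placeholders, not HRP-1S values.
JOINT_LOWER = np.radians([-90.0, -45.0, -120.0])
JOINT_UPPER = np.radians([90.0, 45.0, 10.0])
MAX_VELOCITY = np.radians(180.0)  # rad/s per joint, assumed
DT = 0.005                        # control period in seconds, assumed

def enforce_mechanical_constraints(trajectory):
    # Clamp a (T x N) joint-angle trajectory to position limits, then limit
    # the per-step change so joint velocities stay within bounds.
    traj = np.clip(np.asarray(trajectory, dtype=float), JOINT_LOWER, JOINT_UPPER)
    for t in range(1, len(traj)):
        step = np.clip(traj[t] - traj[t - 1], -MAX_VELOCITY * DT, MAX_VELOCITY * DT)
        traj[t] = traj[t - 1] + step
    return traj

def adjust_waist_for_zmp(waist_xy, desired_zmp_xy, com_height=0.7, g=9.81):
    # Crude cart-table balance correction: treat the waist as the CoM,
    # compute zmp = com - (z_c / g) * com_acceleration, and shift the waist
    # trajectory by the ZMP tracking error.
    waist = np.asarray(waist_xy, dtype=float)
    acc = np.gradient(np.gradient(waist, DT, axis=0), DT, axis=0)
    zmp = waist - (com_height / g) * acc
    return waist + (np.asarray(desired_zmp_xy, dtype=float) - zmp)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    raw = np.cumsum(0.05 * rng.standard_normal((200, 3)), axis=0)  # toy captured angles
    safe = enforce_mechanical_constraints(raw)
    print("max deviation from raw trajectory (deg):", np.degrees(np.abs(safe - raw).max()))

The paper itself derives the waist trajectory from the robot's full dynamics and validates it on the OpenHRP simulator; the proportional correction above is only a stand-in for the idea of moving the waist to satisfy a desired ZMP.
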
Added 04 Jul 2010
Updated 04 Jul 2010
Type Conference
Year 2003
Where ICRA
Authors Shinichiro Nakaoka, Atsushi Nakazawa, Kazuhito Yokoi, Hirohisa Hirukawa, Katsushi Ikeuchi