
ICONIP 2007

Making a Robot Dance to Music Using Chaotic Itinerancy in a Network of FitzHugh-Nagumo Neurons

We propose a technique to make a robot execute free, solitary dance movements to music, in a manner that simulates the dynamic alternation between synchronisation and autonomy typically observed in human behaviour. In contrast with previous approaches, we preprogram neither the dance patterns nor their alternation; instead, we build basic dynamics into the robot and let the behaviour emerge in a seemingly autonomous manner. The robot's motor commands are generated in real time by converting the output of a neural network that processes a sequence of pulses corresponding to the beats of the music being danced to. The spiking behaviour of individual neurons is governed by a biologically inspired model (FitzHugh-Nagumo). Under appropriate parameters, the network generates chaotic itinerant behaviour among low-dimensional local attractors. A robot controlled this way exhibits a variety of motion styles, some being periodic and strongly coupled to the musical rhythm and others being more ...
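The spiking model named in the abstract is the standard FitzHugh-Nagumo system, dv/dt = v - v³/3 - w + I and dw/dt = ε(v + a - bw). As a minimal sketch (not the paper's implementation), a single such neuron can be Euler-integrated and driven by brief current pulses standing in for the musical beat; the parameter values and pulse shape below are illustrative assumptions:

```python
import numpy as np

def simulate_fhn(I_ext, a=0.7, b=0.8, eps=0.08, dt=0.05):
    """Euler-integrate one FitzHugh-Nagumo neuron driven by the
    external current sequence I_ext:
        dv/dt = v - v**3/3 - w + I
        dw/dt = eps * (v + a - b*w)
    Parameter values are common textbook choices, not the paper's.
    """
    n = len(I_ext)
    v = np.zeros(n)
    w = np.zeros(n)
    v[0], w[0] = -1.2, -0.6  # near the resting state for these parameters
    for t in range(n - 1):
        v[t + 1] = v[t] + dt * (v[t] - v[t] ** 3 / 3 - w[t] + I_ext[t])
        w[t + 1] = w[t] + dt * eps * (v[t] + a - b * w[t])
    return v, w

# A "beat" train: brief current pulses at regular intervals, a stand-in
# for the beat pulses extracted from the music in the paper's setup.
steps = 4000
I = np.zeros(steps)
for onset in range(0, steps, 800):
    I[onset:onset + 40] = 0.8  # each pulse lasts 40 integration steps

v, w = simulate_fhn(I)
print(v.max() > 1.0)  # sufficiently strong pulses trigger full spikes
```

A sub-threshold pulse decays back to rest, while a supra-threshold one produces a full excursion of `v`; coupling many such neurons, as the paper does, is what allows itinerant switching between attractors rather than simple entrainment.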
Jean-Julien Aucouturier, Yuta Ogai, Takashi Ikegami
Added 29 Oct 2010
Updated 29 Oct 2010
Type Conference
Year 2007
Where ICONIP
Authors Jean-Julien Aucouturier, Yuta Ogai, Takashi Ikegami