
Learning a Table Soccer Robot a New Action Sequence by Observing and Imitating

Star-Kick is a commercially available, fully automatic table soccer (foosball) robot that plays table soccer games against human players at a competitive level. One of our research goals is to teach this table soccer robot skillful actions, similar to those of a human player, from a moderate number of trials. Two independent learning algorithms are employed to learn a new lock and slide-kick action sequence by observing and imitating the actions of a human player. The experiments with Star-Kick show that an effective action sequence can be learned in approximately 20 trials.
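The listing does not describe the two learning algorithms themselves; purely as an illustrative sketch, the Python snippet below shows one simple way an action sequence could be parameterized from roughly 20 observed demonstrations. All names (KickParameters, observe_human_trial, learn_by_imitation) and numeric values are hypothetical assumptions, not taken from the paper.

```python
import random
from dataclasses import dataclass

# Hypothetical parameterization of a "lock and slide-kick" sequence:
# the rod first blocks (locks) the ball, then slides it sideways and kicks.
@dataclass
class KickParameters:
    lock_offset: float      # rod offset (mm) used to trap the ball (assumed)
    slide_distance: float   # lateral slide distance (mm) before the kick (assumed)
    kick_delay: float       # pause (s) between slide and kick (assumed)


def observe_human_trial() -> KickParameters:
    """Stand-in for observing one human demonstration.

    A real system would extract these parameters from ball/rod tracking;
    here we only synthesize plausible noisy observations.
    """
    return KickParameters(
        lock_offset=random.gauss(5.0, 1.0),
        slide_distance=random.gauss(40.0, 5.0),
        kick_delay=random.gauss(0.15, 0.03),
    )


def learn_by_imitation(num_trials: int = 20) -> KickParameters:
    """Average the observed parameters over a moderate number of trials."""
    trials = [observe_human_trial() for _ in range(num_trials)]
    n = len(trials)
    return KickParameters(
        lock_offset=sum(t.lock_offset for t in trials) / n,
        slide_distance=sum(t.slide_distance for t in trials) / n,
        kick_delay=sum(t.kick_delay for t in trials) / n,
    )


if __name__ == "__main__":
    learned = learn_by_imitation(20)
    print("Learned action-sequence parameters:", learned)
```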
Type: Conference
Year: 2007
Where: AIIDE
Authors: Dapeng Zhang 0002, Bernhard Nebel