
ICPR 2004 (IEEE)

Hand Gesture Recognition: Self-Organising Maps as a Graphical User Interface for the Partitioning of Large Training Data Sets

Gesture recognition is a difficult task in computer vision due to the numerous degrees of freedom of a human hand. Fortunately, human gesture covers only a small part of the theoretical "configuration space" of a hand, so an appearance-based representation of human gesture becomes tractable. A major problem, however, is the acquisition of suitable labelled image data from which an appearance-based representation can be built. In this paper we apply self-organising maps to visualise large sets of segmented hand images showing pointing gestures. A graphical interface built on the map makes labelling the data set straightforward. The labelled set is used to train a neural classification system, which is itself embedded in a larger architecture for the recognition of gestural reference to objects.
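
To illustrate the underlying idea (this is a minimal sketch, not the authors' implementation), the following Python/NumPy code trains a small self-organising map on flattened hand-image feature vectors and assigns each sample to its best-matching grid cell, so that all samples in one cell can be inspected and labelled together. The grid size, learning schedule, feature dimensionality, and the random placeholder data are assumptions made for the example.

    # Minimal self-organising map for grouping hand-image descriptors
    # prior to labelling. Grid size, schedules and data are illustrative.
    import numpy as np

    def train_som(data, grid=(8, 8), iters=5000, lr0=0.5, sigma0=3.0, seed=0):
        """Train a rectangular SOM; returns the weight grid of shape (gx, gy, dim)."""
        rng = np.random.default_rng(seed)
        gx, gy = grid
        dim = data.shape[1]
        weights = rng.random((gx, gy, dim))
        # Grid coordinates, used by the neighbourhood function.
        coords = np.stack(np.meshgrid(np.arange(gx), np.arange(gy),
                                      indexing="ij"), axis=-1)
        for t in range(iters):
            x = data[rng.integers(len(data))]
            # Best-matching unit: node whose weight vector is closest to the sample.
            dists = np.linalg.norm(weights - x, axis=2)
            bmu = np.unravel_index(np.argmin(dists), (gx, gy))
            # Exponentially decaying learning rate and neighbourhood radius.
            lr = lr0 * np.exp(-t / iters)
            sigma = sigma0 * np.exp(-t / iters)
            grid_dist = np.linalg.norm(coords - np.array(bmu), axis=2)
            h = np.exp(-(grid_dist ** 2) / (2 * sigma ** 2))
            weights += lr * h[..., None] * (x - weights)
        return weights

    def assign_cells(data, weights):
        """Map every sample to its best-matching grid cell for grouped labelling."""
        flat = weights.reshape(-1, weights.shape[-1])
        idx = np.argmin(np.linalg.norm(data[:, None, :] - flat[None, :, :], axis=2),
                        axis=1)
        return np.stack(np.unravel_index(idx, weights.shape[:2]), axis=1)

    # Usage: 1000 hypothetical 64-dimensional hand descriptors; label per cell.
    features = np.random.rand(1000, 64)
    som = train_som(features)
    cells = assign_cells(features, som)   # (1000, 2) array of grid coordinates

In a labelling interface of this kind, each grid cell is typically shown as a thumbnail of its prototype, and a single label can then be assigned to every sample mapped to that cell.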
Added: 09 Nov 2009
Updated: 09 Nov 2009
Type: Conference
Year: 2004
Where: ICPR
Authors: Axel Saalbach, Gunther Heidemann, Holger Bekel, Ingo Bax