This contribution presents our approach to an instrumented automatic gesture recognition system for use in Augmented Reality that is able to differentiate static and dynamic gestures. Based on an infrared tracking system, infrared targets mounted on the user's thumbs and index fingers are used to retrieve the position and orientation of each finger. Our system receives this information and extracts static gestures using distance classifiers and dynamic gestures using statistical models. The recognized gesture is provided to any connected application. We introduce a small demonstration as the basis for a short evaluation, in which we compare interaction in a real environment, Augmented Reality with mouse and keyboard, and our gesture recognition system with respect to properties such as task execution time and intuitiveness of interaction. The results show that tasks are executed faster with our gesture recognition system than with mouse and keyboard. However, this enhance...
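As a hedged illustration of the static-gesture path only (the paper's actual feature set, templates, and thresholds are not given here, so all names and values below are assumptions), a nearest-template distance classifier over fingertip features might be sketched as:

```python
import math

# Hypothetical sketch: nearest-template distance classification of static
# gestures. Feature vectors and template values are illustrative only.

# Each template maps a gesture name to a feature vector, e.g. distances
# (in cm) derived from the tracked thumb and index fingertip positions.
TEMPLATES = {
    "pinch": [1.0, 12.0],   # small thumb-index distance
    "point": [9.0, 12.0],   # fingers spread apart
}

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify_static(features, templates=TEMPLATES, threshold=4.0):
    """Return the closest gesture template, or None if all are too far."""
    name, dist = min(
        ((n, euclidean(features, t)) for n, t in templates.items()),
        key=lambda p: p[1],
    )
    return name if dist <= threshold else None

print(classify_static([1.5, 11.0]))    # near the "pinch" template
print(classify_static([50.0, 50.0]))   # no template within threshold
```

The dynamic-gesture path would instead feed a time series of such feature vectors into a statistical sequence model; that part is omitted here.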