We propose a two-stage recognition system for detecting arm gestures related to typical meal intake. Information retrieved from such a system can be used for automatic dietary monitoring in the domain of behavioural medicine. We demonstrate that arm gestures can be clustered and detected using inertial sensors, which can be integrated unobtrusively into normal clothing. To validate our method, we present experimental results comprising 384 gestures from two subjects. Using isolated HMM-based discrimination, an accuracy of 95% is achieved. When spotting the gestures in continuous movement data, an accuracy of up to 87% is reached.
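As a minimal sketch of the isolated HMM-based discrimination stage mentioned above, the following Python snippet trains one Gaussian HMM per gesture class and assigns an isolated gesture to the class whose model scores it highest. It assumes the hmmlearn library and a frame-wise feature representation of the inertial data; the class labels, feature dimensionality, and model sizes are illustrative choices, not taken from the paper.

```python
# Sketch: isolated gesture discrimination with one HMM per class.
# Assumes each gesture is an array of shape (T, D) of inertial features.
import numpy as np
from hmmlearn import hmm

GESTURE_CLASSES = ["cutlery", "spoon", "drink"]  # hypothetical labels

def train_models(train_data, n_states=5):
    """Fit one Gaussian HMM per gesture class.

    train_data: dict mapping class label -> list of (T_i, D) feature arrays.
    """
    models = {}
    for label, sequences in train_data.items():
        X = np.concatenate(sequences)          # stack all frames of this class
        lengths = [len(s) for s in sequences]  # per-sequence lengths for fitting
        model = hmm.GaussianHMM(n_components=n_states,
                                covariance_type="diag", n_iter=50)
        model.fit(X, lengths)
        models[label] = model
    return models

def classify(models, sequence):
    """Label an isolated gesture by maximum log-likelihood over class HMMs."""
    return max(models, key=lambda label: models[label].score(sequence))
```

In this setup the spotting stage would first segment candidate gesture sections from the continuous movement stream and then pass each candidate to `classify`; the segmentation itself is not shown here.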