We describe the implementation of an interaction technique that allows users to store and retrieve information and computational functionality on different parts of their body. We present a dynamic systems approach to gestural interaction using Dynamic Movement Primitives, which model a gesture as a second-order dynamic system followed by a learned nonlinear transformation. We demonstrate that it is possible to learn models, even from a single example, that can both simulate and classify the gestures required by the Body Space project, running on a PocketPC with a 3-degree-of-freedom linear accelerometer.
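The Dynamic Movement Primitive idea mentioned above can be illustrated with a minimal one-dimensional sketch in the standard Ijspeert-style formulation: a critically damped spring-damper "transformation system" driven toward the gesture's goal, plus a phase-dependent nonlinear forcing term fit from a single demonstration by locally weighted regression. The function names, gains, and basis-function placement below are illustrative assumptions, not the paper's actual implementation, which operates on 3-axis accelerometer data.

```python
import numpy as np

# Illustrative gains (assumed, not from the paper): critically damped spring-damper.
ALPHA, BETA, ALPHA_X = 25.0, 25.0 / 4.0, 3.0

def learn_dmp(demo, dt, n_basis=20):
    """Fit the nonlinear forcing term of a 1-D DMP from a single demonstration."""
    y = np.asarray(demo, dtype=float)
    tau = (len(y) - 1) * dt                      # movement duration
    yd = np.gradient(y, dt)                      # demo velocity
    ydd = np.gradient(yd, dt)                    # demo acceleration
    y0, g = y[0], y[-1]                          # start and goal
    t = np.arange(len(y)) * dt
    x = np.exp(-ALPHA_X * t / tau)               # canonical phase, decays 1 -> ~0
    # Rearranging the transformation system
    #   tau^2 * ydd = ALPHA * (BETA * (g - y) - tau * yd) + f(x)
    # gives the forcing term each demonstration sample implies:
    f_target = tau**2 * ydd - ALPHA * (BETA * (g - y) - tau * yd)
    # Gaussian basis functions spaced evenly in time (exponentially in phase).
    c = np.exp(-ALPHA_X * np.linspace(0.0, 1.0, n_basis))
    h = n_basis**1.5 / c / ALPHA_X               # common width heuristic
    psi = np.exp(-h[None, :] * (x[:, None] - c[None, :])**2)   # (T, n_basis)
    # Locally weighted regression: one weight per basis function.
    w = (psi.T @ (x * f_target)) / (psi.T @ (x**2) + 1e-10)
    return dict(w=w, c=c, h=h, y0=y0, g=g, tau=tau)

def simulate_dmp(dmp, dt, n_steps):
    """Integrate the learned DMP with Euler steps; returns the generated trajectory."""
    w, c, h = dmp["w"], dmp["c"], dmp["h"]
    g, tau = dmp["g"], dmp["tau"]
    y, yd, x = dmp["y0"], 0.0, 1.0
    traj = [y]
    for _ in range(n_steps):
        psi = np.exp(-h * (x - c)**2)
        f = (psi @ w) * x / psi.sum()            # forcing vanishes as the phase decays
        ydd = (ALPHA * (BETA * (g - y) - tau * yd) + f) / tau**2
        yd += ydd * dt
        y += yd * dt
        x += -ALPHA_X * x / tau * dt
        traj.append(y)
    return np.array(traj)

# Learn from one smooth demonstration stroke (0 -> 1) and replay it.
dt = 0.01
t = np.linspace(0.0, 1.0, 101)
demo = t - np.sin(2 * np.pi * t) / (2 * np.pi)
dmp = learn_dmp(demo, dt)
traj = simulate_dmp(dmp, dt, 100)
```

Because the forcing term is a function of the decaying phase rather than of time, the same learned model can be replayed at different speeds or toward different goals, and the per-basis regression weights give a compact representation suitable for comparing (classifying) gestures.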