In the context of a larger project dealing with kansei analysis of movement, we present a basic method for applying real-time filters to human motion capture data in order to modify the perceived emotional affect of the movement. By employing a commercial real-time 3D package, we have been able to quickly prototype interfaces to an as-yet non-existent system. Filters are represented as physical objects whose proximity to an animated dancing human figure determines how much they modify the movement.
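As a rough illustration of this proximity mapping, the sketch below blends an original pose with a filtered pose by a weight derived from the distance between a filter object and the figure. It is a minimal sketch, not the paper's implementation: the linear falloff, the `radius` parameter, and all function names are illustrative assumptions.

```python
import math

# Hypothetical sketch: a filter modifies the motion more strongly
# the closer its physical object is to the dancing figure.
# The linear falloff and all names here are assumptions for illustration.

def filter_weight(filter_pos, figure_pos, radius):
    """Weight falls off linearly from 1 at the figure's position
    to 0 at `radius` away, so proximity controls filter strength."""
    distance = math.dist(filter_pos, figure_pos)
    return max(0.0, 1.0 - distance / radius)

def apply_filter(pose, filtered_pose, weight):
    """Interpolate each joint value between the original pose and
    the fully filtered pose according to the proximity weight."""
    return [(1.0 - weight) * p + weight * f
            for p, f in zip(pose, filtered_pose)]

# Usage: a filter 1.5 units from the figure, with a 4-unit radius,
# contributes at a weight of 1 - 1.5/4 = 0.625.
pose = [0.0, 0.5, 1.0]         # original joint values (e.g. angles)
filtered = [0.2, 0.9, 0.7]     # same joints after the emotion filter
w = filter_weight((1.5, 0.0, 0.0), (0.0, 0.0, 0.0), radius=4.0)
print(apply_filter(pose, filtered, w))
```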