Abstract— Robust robotic manipulation and perception remain a difficult challenge, particularly in unstructured environments. To address this challenge, we propose to couple manipulation and perception: the robot observes its own deliberate interactions with the world. These interactions reveal sensory information that would otherwise remain hidden and facilitate the interpretation of perceptual data. To demonstrate the effectiveness of interactive perception, we present a skill for the manipulation of articulated objects. We show how UMan, our mobile manipulation platform, obtains a kinematic model of an unknown object; this model then enables the robot to perform purposeful manipulation. Our algorithm is highly robust and requires no prior knowledge of the object; it is insensitive to lighting, texture, color, specularities, and background, and it is computationally efficient.