Abstract. Using atomic force microscopes (AFMs) to manipulate nano-objects is a current challenge for surface scientists. Basic haptic interfaces between the AFM and the experimentalist have already been implemented, and multi-sensory renderings (seeing, hearing, and feeling), studied from a cognitive point of view, increase the efficiency of such interfaces. To allow the experimentalist to feel and touch the nano-world, we introduce a mixed-reality coupling between an AFM and a force-feedback device, enriching the direct connection with a modeling engine. In this paper we present the first results of real-time remote control of an AFM through our Force Feedback Gestural Device, illustrated by the example of the approach-retract curve.
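
The approach-retract curve mentioned above can be illustrated with a minimal quasi-static model. The sketch below is an assumption for illustration only, not the paper's modeling engine: a point tip on a linear cantilever spring is attracted to the surface by a van der Waals term and repelled by a stiff linear contact below a cutoff distance, and the support is swept toward and away from the surface. The snap-in on approach and the larger snap-off (adhesion dip) on retract produce the characteristic hysteresis of the curve. All quantities are in arbitrary, scaled units.

```python
# Quasi-static sketch of an AFM approach-retract curve.
# Hypothetical parameters in scaled units; not the paper's actual model.

K_CANT = 1.0       # cantilever stiffness
A_VDW = 0.01       # van der Waals strength (force = -A_VDW / x**2)
A0 = 0.1           # contact cutoff distance
K_CONTACT = 100.0  # repulsive contact stiffness
MU = 0.01          # overdamped relaxation step

def tip_surface_force(x):
    """Force on the tip from the surface at tip height x (> 0)."""
    if x >= A0:
        return -A_VDW / x**2                       # attractive van der Waals
    return -A_VDW / A0**2 + K_CONTACT * (A0 - x)   # linear contact repulsion

def relax(x, z, tol=1e-10, max_iter=20000):
    """Overdamped relaxation of the tip to equilibrium for support height z."""
    for _ in range(max_iter):
        f = K_CANT * (z - x) + tip_surface_force(x)
        if abs(f) < tol:
            break
        x += MU * f
    return x

def approach_retract(z_far=1.5, z_near=0.05, dz=0.005):
    """Sweep the support down then up; return (z, deflection) curves."""
    approach, retract = [], []
    x = z = z_far
    while z > z_near:          # approach: support moves toward surface
        x = relax(x, z)
        approach.append((z, x - z))   # deflection = tip height - support height
        z -= dz
    while z <= z_far:          # retract: support pulls away
        x = relax(x, z)
        retract.append((z, x - z))
        z += dz
    return approach, retract

approach, retract = approach_retract()
min_app = min(d for _, d in approach)
min_ret = min(d for _, d in retract)
print(f"deepest approach deflection: {min_app:.3f}")
print(f"deepest retract deflection:  {min_ret:.3f}")  # adhesion dip at snap-off
```

Plotting deflection against support height for both sweeps shows the two branches separating between snap-in and snap-off; in the haptic loop, this deflection signal is what a force-feedback device would render to the experimentalist's hand.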