Data on the coordination and mutual modulation of visual information, gaze direction, and arm-reaching movements in primates are analyzed from a computational point of view. The goal of the analysis is to construct a model of the mechanisms that allow humans and other primates to build dynamical representations of their peripersonal space through active interaction with nearby objects. Applying the model to robotic systems will allow artificial agents to improve their ability to explore the space around them.
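As a minimal illustration of the kind of computation such a model must perform (not the model described here), the sketch below shows one standard ingredient of gaze-arm coordination: remapping a visually sensed, eye-centered target position into body-centered coordinates using the current gaze direction, so that it can serve as a reach target. The function names (`gaze_rotation`, `eye_to_body`), the azimuth/elevation gaze parameterization, and the `eye_offset` parameter are all assumptions introduced for this example.

```python
# Illustrative sketch only: an eye-to-body coordinate transformation,
# one candidate building block for a peripersonal-space representation
# that combines visual input with gaze direction.
import numpy as np

def gaze_rotation(azimuth: float, elevation: float) -> np.ndarray:
    """Rotation matrix for a gaze direction given as azimuth/elevation (radians)."""
    ca, sa = np.cos(azimuth), np.sin(azimuth)
    ce, se = np.cos(elevation), np.sin(elevation)
    # Rotate about the vertical axis (azimuth), then the lateral axis (elevation).
    r_az = np.array([[ca, 0.0, sa], [0.0, 1.0, 0.0], [-sa, 0.0, ca]])
    r_el = np.array([[1.0, 0.0, 0.0], [0.0, ce, -se], [0.0, se, ce]])
    return r_az @ r_el

def eye_to_body(target_eye: np.ndarray, azimuth: float, elevation: float,
                eye_offset: np.ndarray) -> np.ndarray:
    """Map an eye-centered target position to body-centered coordinates.

    target_eye : 3D target position in eye-centered coordinates (metres)
    eye_offset : assumed position of the eye relative to the body frame (metres)
    """
    return gaze_rotation(azimuth, elevation) @ target_eye + eye_offset

# Example: a target 0.4 m along the line of sight, gaze rotated 20 degrees.
target_body = eye_to_body(np.array([0.0, 0.0, 0.4]),
                          azimuth=np.deg2rad(20.0), elevation=0.0,
                          eye_offset=np.array([0.0, 0.35, 0.05]))
print(target_body)  # body-centered reach target for an arm controller
```

In a robotic agent, a transformation of this kind would sit between the visual front end and the arm controller, updating the body-centered target whenever gaze shifts during active exploration.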