This article presents a contribution to the visual tracking of objects using all the degrees of freedom of an Aibo ERS-7 robot. We approach the problem in a principled way, applying ideas from visual servoing. State-of-the-art visual tracking solutions for this kind of robot that are inspired by the visual servoing approach are either restricted to the head effectors or build the kinematics matrix by inductive learning from experimental data. In this work we take into account all the effectors that can affect the extrinsic parameters of the robot's camera, and therefore the captured image. We construct the robot's kinematic matrix from its kinematic description. Visual servoing is then performed by computing the pseudoinverse of this matrix.
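The pseudoinverse-based control step mentioned above can be sketched as follows. This is a minimal illustration of the classical visual-servoing law, not the paper's actual implementation; the function name, gain value, and toy Jacobian are all assumptions introduced here for clarity.

```python
import numpy as np

def servo_step(J, s_current, s_desired, gain=0.5):
    """One visual-servoing update: map the image-feature error to a
    joint-velocity command via the pseudoinverse of the kinematic
    (image Jacobian) matrix J:  dq = gain * pinv(J) @ (s* - s)."""
    error = s_desired - s_current
    return gain * np.linalg.pinv(J) @ error

# Toy example (illustrative numbers): 2 image features, 3 joints.
J = np.array([[1.0, 0.0, 0.5],
              [0.0, 1.0, 0.5]])
dq = servo_step(J,
                s_current=np.array([0.0, 0.0]),
                s_desired=np.array([1.0, 1.0]))
# dq is the minimum-norm joint velocity that reduces the feature error.
```

Because the pseudoinverse yields the minimum-norm least-squares solution, redundant degrees of freedom (more joints than image features) are resolved without extra machinery.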