We present a new selection technique that enables natural hand gestures for manipulating virtual objects in 3D. Our method relies on 3D imaging to track the user's body and therefore requires no handheld devices that would restrict the manipulative capabilities of the user's hands. The key contribution of our work is the novel use of characteristic behavioral cues of general goal-directed movement to infer which object the user is targeting during selection. The resulting technique lets us select objects whose largest dimension is smaller than the sensing resolution of our system, despite body-tracking uncertainties and hand placement faults. Furthermore, by means of intention inference, our method automatically adapts to the user's subjective need for variable levels of tolerance to hand placement faults, jitter, or tracking noise.
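The abstract does not specify which behavioral cues are used or how the tolerance is adapted. Purely as an illustration of the general idea, the sketch below (in Python with NumPy) scores candidate objects by two heuristics commonly associated with goal-directed reaching, heading alignment and deceleration on approach, and widens a selection tolerance when the trajectory is jittery. The function name, the particular heuristics, and the weighting are assumptions for illustration, not the paper's method.

```python
import numpy as np

def infer_target(hand_positions, dt, candidates, base_tolerance=0.05):
    """Illustrative sketch (not the paper's algorithm): rank candidate
    objects by how consistently the recent hand trajectory heads toward
    them and slows down on approach, with a tolerance that grows when
    the trajectory is noisy.

    hand_positions: (T, 3) array of recent hand samples, T >= 2 (meters)
    dt:             sampling interval (seconds)
    candidates:     (N, 3) array of candidate object centers
    base_tolerance: nominal selection radius (meters)
    """
    velocities = np.diff(hand_positions, axis=0) / dt            # (T-1, 3)
    speeds = np.linalg.norm(velocities, axis=1) + 1e-9
    headings = velocities / speeds[:, None]                      # unit directions

    # Adapt tolerance to trajectory noise: jittery tracking relaxes the
    # required placement accuracy instead of demanding exact positioning.
    tolerance = base_tolerance * (1.0 + np.std(speeds) / (np.mean(speeds) + 1e-9))

    scores = []
    last_pos = hand_positions[-1]
    for center in candidates:
        to_obj = center - hand_positions[:-1]                    # vectors toward object
        to_obj /= np.linalg.norm(to_obj, axis=1)[:, None] + 1e-9
        alignment = np.mean(np.sum(headings * to_obj, axis=1))   # heading agreement
        deceleration = speeds[0] - speeds[-1]                    # slowing near the goal
        proximity = -np.linalg.norm(center - last_pos) / tolerance
        scores.append(alignment + 0.5 * np.tanh(deceleration) + proximity)

    return int(np.argmax(scores)), tolerance
```

Under these assumptions, the highest-scoring object is returned even when it is smaller than the adapted tolerance radius, which mirrors the abstract's claim of selecting objects below the sensing resolution despite tracking noise.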