When we recognize objects, information from multiple sensory modalities (e.g., visual, auditory, and haptic) is fused. For example, both the eyes and the hands provide relevant information about an object's shape. We investigate how such sensory stimuli interact with each other. For this purpose, we developed a system that produces haptic/visual sensory fusion using a mixed reality technique. Our experiments show that haptic perception appears to be affected by the visual stimulus when a discrepancy exists between the visual and haptic stimuli.