In this paper, we outline the design of a visual support system that conveys three-dimensional visual information through three-dimensional virtual sound. The three-dimensional information is obtained by analyzing images captured by stereo cameras and recognizing the objects relevant to the visually impaired user. A three-dimensional virtual acoustic display based on Head-Related Transfer Functions (HRTFs) then informs the user of the locations and movements of those objects. Because the system uses a bone-conduction headset, which does not block environmental sound, the user's natural hearing is not impeded. The proposed system is expected to be useful where supporting infrastructure is incomplete and where the situation changes in real time. We plan to evaluate it experimentally, for example by guiding users while walking and playing sports.
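To illustrate the core of an HRTF-based virtual acoustic display, the sketch below renders a mono signal to binaural stereo by convolving it with left- and right-ear head-related impulse responses (HRIRs). This is only a minimal illustration, not the system described above: the HRIRs here are hypothetical stand-ins built from a simple interaural time and level difference, whereas a real display would use measured HRTF data for each source direction.

```python
import numpy as np

def spatialize(mono, hrir_left, hrir_right):
    """Render a mono signal to binaural stereo by convolving it with
    the left- and right-ear head-related impulse responses (HRIRs)."""
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    return np.stack([left, right], axis=1)  # shape: (samples, 2 channels)

fs = 44100  # sample rate in Hz

# Hypothetical stand-in HRIRs for a source to the listener's right:
# the near (right) ear receives the sound directly, the far (left) ear
# receives it delayed (~0.6 ms ITD) and attenuated (ILD). Measured HRTF
# datasets would replace these in practice.
itd_samples = int(0.0006 * fs)
hrir_right = np.zeros(64)
hrir_right[0] = 1.0
hrir_left = np.zeros(64)
hrir_left[itd_samples] = 0.5

# One second of a 440 Hz test tone as the source signal.
t = np.arange(fs) / fs
tone = 0.5 * np.sin(2 * np.pi * 440 * t)

stereo = spatialize(tone, hrir_left, hrir_right)
```

Moving sources could be handled by updating the HRIR pair as the object's estimated position changes, re-rendering short blocks of audio with the impulse responses for the new direction.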