The paper discusses a panoramic vision system for autonomous navigation. It describes a method for integrating data from multiple camera sources in real time. The views from adjacent cameras are combined into a panorama of the scene using a modified correlation-based stitching algorithm. A separate operator is presented with the slice of the panorama that matches the operator's viewing direction. In addition, a simulated environment is created in which the operator can choose to augment the video by simultaneously viewing an artificial 3D view of the scene. Potential applications of this system include enhancing the quality and range of visual cues and enabling navigation under hostile circumstances where a direct view of the environment is not possible or desirable.
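To illustrate the kind of correlation-based stitching the abstract refers to, the following is a minimal sketch, not the paper's actual algorithm: it estimates the horizontal overlap between two adjacent grayscale camera views by maximizing normalized cross-correlation over candidate overlap widths, then blends the overlapping strip. The function names (estimate_overlap, stitch_pair), the grayscale/horizontal-overlap assumption, and the simple averaging blend are all illustrative assumptions.

```python
# Hypothetical sketch of correlation-based stitching for two adjacent views.
import numpy as np


def estimate_overlap(left: np.ndarray, right: np.ndarray, max_overlap: int) -> int:
    """Return the overlap width (pixels) maximizing normalized cross-correlation
    between the right edge of `left` and the left edge of `right`."""
    best_overlap, best_score = 1, -np.inf
    for w in range(1, max_overlap + 1):
        a = left[:, -w:].astype(np.float64).ravel()
        b = right[:, :w].astype(np.float64).ravel()
        a -= a.mean()
        b -= b.mean()
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        if denom == 0:
            continue
        score = float(np.dot(a, b) / denom)  # normalized cross-correlation
        if score > best_score:
            best_score, best_overlap = score, w
    return best_overlap


def stitch_pair(left: np.ndarray, right: np.ndarray, max_overlap: int = 100) -> np.ndarray:
    """Stitch two horizontally adjacent views by averaging their estimated overlap."""
    w = estimate_overlap(left, right, max_overlap)
    blend = (left[:, -w:].astype(np.float64) + right[:, :w].astype(np.float64)) / 2.0
    return np.hstack([left[:, :-w], blend.astype(left.dtype), right[:, w:]])


if __name__ == "__main__":
    # Synthetic example: two views of the same scene sharing a 40-pixel overlap.
    rng = np.random.default_rng(0)
    scene = rng.integers(0, 256, size=(120, 300), dtype=np.uint8)
    left_view, right_view = scene[:, :180], scene[:, 140:]
    pano = stitch_pair(left_view, right_view)
    print(pano.shape)  # (120, 300): the full scene width is recovered
```

In a multi-camera setup, a full panorama could be built by applying such a pairwise step across each pair of adjacent views; the paper's modified algorithm would additionally have to meet the real-time constraint mentioned above.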