This paper presents a new method for rendering views of large-scale scenes, such as broad city landscapes. The main contribution of our method is that it can easily render a view from an arbitrary point on the ground, looking in an arbitrary direction, within a virtual environment. Our method belongs to the family of work that employs plenoptic functions; unlike other works of this type, however, it can render a novel view from almost any point on the plane on which the input images were taken, whereas previous methods impose restrictions on the area they can reconstruct. Our method therefore has a significant advantage when synthesizing a large-scale virtual environment such as a city. One application of our method is a driving simulator in the ITS domain: we can generate a view from any lane of a road using images captured while driving along just one lane. Our method, using an omni-directional camera or a measuring device of a similar ty...