
ICCV 2003 (IEEE)

Polarization-based Inverse Rendering from a Single View

This paper presents a method to estimate geometric, photometric, and environmental information about a single-viewed object in one integrated framework, under a fixed viewing position and a fixed illumination direction. These three types of information are needed to render a photorealistic image of a real object: photometric information represents the texture and surface roughness of the object, while geometric and environmental information represent its 3D shape and the illumination distribution, respectively. The proposed method estimates the 3D shape by computing surface normals from polarization data, calculates the object's texture from the diffuse-only reflection component, determines the illumination directions from the positions of the brightest intensities in the specular reflection component, and finally computes the object's surface roughness using the estimated illumination distribution.
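The first step, recovering surface normals from polarization data, rests on measuring how intensity varies as a linear polarizer rotates in front of the camera. As a rough illustration (not the paper's actual pipeline, which fits the full transmitted-radiance sinusoid over many polarizer angles), the standard three-angle Stokes recovery yields the degree of polarization and the phase angle, which constrain the surface normal's zenith and azimuth; all function names below are my own:

```python
import math

def polarization_params(i0, i45, i90):
    """Recover the degree of linear polarization and the phase angle
    from intensities measured through a linear polarizer oriented at
    0, 45, and 90 degrees (illustrative sketch only)."""
    # Linear Stokes parameters from the three measurements.
    s0 = i0 + i90                  # total intensity
    s1 = i0 - i90
    s2 = 2.0 * i45 - s0
    rho = math.hypot(s1, s2) / s0  # degree of polarization (zenith cue)
    phi = 0.5 * math.atan2(s2, s1) # phase angle (azimuth cue)
    return rho, phi

def transmitted(theta, imax=1.0, imin=0.25, phase=math.radians(30)):
    """Synthetic pixel: transmitted radiance as the polarizer rotates."""
    return (imax + imin) / 2 + (imax - imin) / 2 * math.cos(2 * (theta - phase))

rho, phi = polarization_params(transmitted(0.0),
                               transmitted(math.radians(45)),
                               transmitted(math.radians(90)))
print(round(rho, 3), round(math.degrees(phi), 1))  # 0.6 30.0
```

On the synthetic pixel, the recovered degree of polarization matches (Imax - Imin)/(Imax + Imin) = 0.75/1.25 = 0.6, and the recovered phase matches the 30-degree ground truth.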
Type Conference
Year 2003
Where ICCV
Authors Daisuke Miyazaki, Robby T. Tan, Kenji Hara, Katsushi Ikeuchi