In this paper, we present a new method for recovering an approximation of the bidirectional reflectance distribution function (BRDF) of the surfaces in a real or synthetic scene, using a single photograph and a 3D geometric model of the scene. The result is a full model of the reflectance properties of all surfaces, which can be rendered under novel illumination conditions, for example with a modified viewpoint or with new synthetic objects added. Our technique produces a reflectance model with a small number of parameters, which nevertheless approximate the BRDF and allow the recovery of the photometric properties of diffuse, specular, isotropic, or anisotropic textured objects. The input data are a geometric model of the scene, including the light source positions and the camera properties, together with a single captured image. We present several synthetic images that are compared to the original ones, as well as possible applications in augmented reality such as relighting and the addition of synthetic objects.
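To make the idea of parametric reflectance recovery concrete, the following is a minimal, hypothetical sketch (not the paper's actual algorithm): under a simple diffuse-plus-specular model with known geometry (surface normals), a known light direction, and a known view direction, the observed pixel intensity is linear in the diffuse and specular coefficients once the specular exponent is fixed, so those parameters can be estimated from image intensities by least squares. All names and the shading model here are illustrative assumptions.

```python
import numpy as np

def shading(normals, light, view, kd, ks, n):
    """Per-pixel intensity under a simple Lambertian + Phong-style model.

    This model is an illustrative stand-in, not the BRDF model of the paper.
    """
    diffuse = np.clip(normals @ light, 0.0, None)
    # Mirror reflection of the light direction about each normal.
    refl = 2.0 * diffuse[:, None] * normals - light
    specular = np.clip(refl @ view, 0.0, None) ** n
    return kd * diffuse + ks * specular

rng = np.random.default_rng(0)

# Synthetic "photograph": random unit surface normals, known light and camera.
normals = rng.normal(size=(500, 3))
normals /= np.linalg.norm(normals, axis=1, keepdims=True)
light = np.array([0.0, 0.0, 1.0])
view = np.array([0.0, 0.6, 0.8])
true_kd, true_ks, exponent = 0.7, 0.3, 20.0
observed = shading(normals, light, view, true_kd, true_ks, exponent)

# Least-squares recovery of (kd, ks), with the specular exponent held fixed
# so the problem stays linear in the unknowns.
diffuse = np.clip(normals @ light, 0.0, None)
refl = 2.0 * diffuse[:, None] * normals - light
specular = np.clip(refl @ view, 0.0, None) ** exponent
A = np.column_stack([diffuse, specular])
(kd_est, ks_est), *_ = np.linalg.lstsq(A, observed, rcond=None)
print(kd_est, ks_est)  # recovers approximately 0.7 and 0.3
```

In a full inverse-rendering pipeline, the design matrix would be built from the scene's 3D model and the calibrated camera, and nonlinear parameters (such as the specular exponent, or anisotropy) would require an iterative solver rather than a single linear solve.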
Samuel Boivin and Andre Gagalowicz, "Inverse Rendering from a Single Image," in Proc. IS&T CGIV 2002: First European Conference on Colour in Graphics, Imaging, and Vision, 2002, pp. 268-277, https://doi.org/10.2352/CGIV.2002.1.1.art00059