In this paper we introduce a new approach to characterizing image quality: visual equivalence. Images are visually equivalent if they convey the same information about object appearance, even if they are visibly different. In a series of psychophysical experiments we explore how object geometry, material, and illumination interact to produce images that are visually equivalent, and we identify how two kinds of transformations on illumination fields (blurring and warping) influence observers' judgments of equivalence. We use the results of the experiments to derive metrics that can serve as visual equivalence predictors (VEPs), and we generalize these metrics so they can be applied to novel objects and scenes. Finally, we validate the predictors in a confirmatory study and show that they reliably predict observers' judgments of equivalence. Visual equivalence is a significant new approach to measuring image quality that goes beyond existing visible difference metrics by leveraging the fact that some kinds of image differences do not matter to human observers. By taking advantage of higher-order aspects of visual object coding, visual equivalence metrics should enable the development of powerful new classes of image capture, compression, rendering, and display algorithms.
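To make the two illumination-field transformations concrete, here is a minimal, hypothetical sketch (not the paper's actual implementation) of blurring and warping applied to a toy 2D illumination map; the function names, the map representation as a NumPy array, and the choice of a horizontal rotation as the warp are all illustrative assumptions.

```python
import numpy as np

def blur_illumination(env_map, sigma):
    """Gaussian-blur an illumination map with a separable kernel.

    Wrap-around padding mimics the horizontal periodicity of a
    panoramic environment map (an assumption for this sketch).
    """
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-x**2 / (2 * sigma**2))
    kernel /= kernel.sum()  # normalize so overall brightness is preserved
    # Convolve along rows, then along columns.
    blurred = np.apply_along_axis(
        lambda r: np.convolve(np.pad(r, radius, mode="wrap"), kernel, mode="valid"),
        1, env_map)
    blurred = np.apply_along_axis(
        lambda c: np.convolve(np.pad(c, radius, mode="wrap"), kernel, mode="valid"),
        0, blurred)
    return blurred

def warp_illumination(env_map, shift):
    """Warp the illumination field by rotating it horizontally."""
    return np.roll(env_map, shift, axis=1)

# Toy illumination map: rows = elevation, columns = azimuth.
env = np.random.rand(64, 128)
blurred = blur_illumination(env, 2.0)   # low-pass version of the lighting
warped = warp_illumination(env, 10)     # same lighting, rotated in azimuth
```

In the experiments described above, observers would compare renderings of objects lit by the original and transformed illumination fields; a VEP-style metric would then try to predict whether such pairs are judged equivalent.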
James A. Ferwerda, Ganesh Ramanarayanan, Bruce Walter, and Kavita Bala, "Visual Equivalence: an Object-based Approach to Image Quality," in Proc. IS&T 16th Color and Imaging Conference, 2008, pp. 347-354. https://doi.org/10.2352/CIC.2008.16.1.art00066