Perceptual quality assessment of color images using adaptive signal representation

U Rajashekar, Z Wang and E P Simoncelli

Published in Proc. SPIE Conf. on Human Vision and Electronic Imaging XV, vol. 7527, Jan 2010.

DOI: 10.1117/12.845312

Download:

  • Reprint (pdf)
  • Official (pdf)

Perceptual image distortion measures can play a fundamental role in evaluating and optimizing imaging systems and image processing algorithms. Many existing measures are formulated to represent "just noticeable differences" (JNDs), as measured in psychophysical experiments on human subjects. But some image distortions, such as those arising from small changes in the intensity of the ambient illumination, are far more tolerable to human observers than those that disrupt the spatial structure of intensities and colors. Here, we introduce a framework in which we quantify these perceptual distortions in terms of "just intolerable differences" (JIDs). We first construct a set of spatio-chromatic basis functions to approximate (as a first-order Taylor series) a set of "non-structural" distortions that result from changes in lighting/imaging/viewing conditions. These basis functions are defined on local image patches, and are adaptive, in that they are computed as functions of the undistorted reference image. This set is then augmented with a complete basis arising from a linear approximation of the CIELAB color space. Each basis function is weighted by a scale factor to convert it into units corresponding to JIDs. Each patch of the error image is represented using this weighted overcomplete basis, and the overall distortion metric is computed by summing the squared coefficients over all such (overlapping) patches. We implement an example of this metric, incorporating invariance to small changes in the viewing and lighting conditions, and demonstrate that the resulting distortion values are more consistent with human perception than those produced by CIELAB or S-CIELAB.
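To make the patch-wise computation in the abstract concrete, below is a minimal single-channel sketch, not the authors' implementation. It reduces the adaptive "non-structural" directions to two illustrative terms (a mean-luminance shift and a contrast rescaling of the reference patch), uses an identity basis as a stand-in for the linearized-CIELAB basis, and replaces the calibrated JID scale factors with arbitrary weights; the function name jid_metric and all parameter values are hypothetical.

```python
import numpy as np

def jid_metric(ref, dist, patch=8, step=4,
               w_nonstructural=10.0, w_structural=1.0):
    """Toy 'just intolerable difference'-style metric (sketch only).

    For each overlapping patch, the error (dist - ref) is represented in an
    overcomplete basis made of:
      * adaptive, reference-dependent vectors approximating non-structural
        distortions (here: a uniform luminance shift and a contrast
        rescaling of the reference patch), and
      * a complete identity basis, standing in for the linearized color
        basis used in the paper.
    Each basis vector is scaled by a weight playing the role of a JID scale
    factor (larger weight = more tolerable direction).  The squared
    minimum-norm coefficients are summed over all patches.
    """
    total = 0.0
    for i in range(0, ref.shape[0] - patch + 1, step):
        for j in range(0, ref.shape[1] - patch + 1, step):
            r = ref[i:i+patch, j:j+patch].ravel().astype(float)
            e = dist[i:i+patch, j:j+patch].ravel().astype(float) - r

            n = r.size
            # Adaptive "non-structural" directions (first-order terms),
            # computed from the reference patch itself.
            shift = np.ones(n)            # uniform intensity change
            contrast = r - r.mean()       # rescaling of local contrast
            # Weighted overcomplete dictionary: non-structural columns
            # plus a complete structural (identity) basis.
            B = np.column_stack([w_nonstructural * shift,
                                 w_nonstructural * contrast,
                                 w_structural * np.eye(n)])
            # Minimum-norm representation of the error in this basis.
            c, *_ = np.linalg.lstsq(B, e, rcond=None)
            total += np.sum(c ** 2)
    return total

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ref = rng.uniform(0, 255, size=(64, 64))
    shifted = ref + 5.0                          # global luminance shift
    noisy = ref + rng.normal(0, 5.0, ref.shape)  # equal-energy structural noise
    print(jid_metric(ref, shifted))  # much smaller than...
    print(jid_metric(ref, noisy))    # ...the structural distortion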