Content created in High Dynamic Range (HDR) and Wide Color Gamut (WCG) is becoming increasingly common, driving the need for reliable tools for evaluating quality across the imaging ecosystem. One of the simplest ways to assess the quality of a video system is to measure its color errors. Traditional color difference metrics such as ΔE00, as well as the newer HDR-specific metrics ΔEZ and ΔEITP, compute color differences on a pixel-by-pixel basis and therefore do not account for the spatial effects (optical) and active processing (neural) performed by the human visual system. In this work, we improve upon the per-pixel ΔEITP color difference metric with a spatial extension similar to the one used in the design of S-CIELAB. We quantified performance using four standard evaluation procedures on four publicly available HDR and WCG image databases and found that the proposed metric correlates markedly better with subjective scores than existing per-pixel color difference metrics.
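The idea of a spatial extension can be sketched as follows. The per-pixel ΔEITP difference follows Rec. ITU-R BT.2124 (ΔEITP = 720·√(ΔI² + ΔT² + ΔP²) on ITP values). The spatial pre-filter here is a plain separable Gaussian applied per channel; this is an illustrative stand-in for the channel-specific contrast-sensitivity filters that S-CIELAB uses, and the `sigma` parameter is an assumption, not a value from the work described above.

```python
import numpy as np

def delta_e_itp(itp_ref, itp_test):
    # Per-pixel colour difference per Rec. ITU-R BT.2124:
    # dE_ITP = 720 * sqrt(dI^2 + dT^2 + dP^2), on (I, T, P) triplets.
    d = np.asarray(itp_ref, float) - np.asarray(itp_test, float)
    return 720.0 * np.sqrt(np.sum(d * d, axis=-1))

def _gaussian_kernel(sigma, radius):
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def _blur(channel, sigma):
    # Separable Gaussian blur with edge padding -- a simplified stand-in
    # for the CSF-derived spatial filters used in S-CIELAB.
    radius = max(1, int(3 * sigma))
    k = _gaussian_kernel(sigma, radius)
    padded = np.pad(channel, radius, mode="edge")
    rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode="valid"), 1, padded)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="valid"), 0, rows)

def spatial_delta_e_itp(itp_ref, itp_test, sigma=2.0):
    # Hypothetical spatial extension: low-pass each ITP channel before
    # taking the per-pixel difference, in the spirit of S-CIELAB.
    f_ref = np.stack([_blur(np.asarray(itp_ref, float)[..., c], sigma) for c in range(3)], axis=-1)
    f_test = np.stack([_blur(np.asarray(itp_test, float)[..., c], sigma) for c in range(3)], axis=-1)
    return delta_e_itp(f_ref, f_test)
```

For a spatially uniform error the filter has no effect, so the spatial metric reduces to the per-pixel one; the two diverge only around edges and fine texture, which is exactly where per-pixel metrics over-penalize differences the eye cannot resolve.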
There is an increasing number of databases describing subjective quality responses to HDR (high dynamic range) imagery with various distortions. The dominant distortions across these databases arise from video compression; they are primarily perceived as achromatic, but some chromatic distortions are introduced by 4:2:2 and other chroma sub-sampling schemes. Tone mapping from the source HDR levels to various levels of reduced-capability SDR (standard dynamic range) is also included in these databases. While most of these distortions are achromatic, tone mapping can cause changes in saturation and hue angle when saturated colors lie in the upper hull of the color space. In addition, one database specifically examined color distortions in an HDR-WCG (wide color gamut) space. With these databases we can test whether well-known quality metrics improve when applied in the newly developed perceptual color spaces (i.e., representations) specifically designed for HDR and WCG. We present results comparing the subjective scores in these databases against quality computed in the new color spaces Jzazbz and ICTCP, as well as the commonly used SDR color space CIELAB.
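Comparing computed quality against such databases typically comes down to correlating metric outputs with the recorded subjective scores. A minimal sketch of the two most common figures of merit, Pearson (linear) and Spearman (rank-order) correlation, is below; the simple double-argsort ranking assumes no tied scores, which real databases may violate (a tie-aware ranking, as in `scipy.stats.spearmanr`, would then be needed).

```python
import numpy as np

def pearson(x, y):
    # Pearson linear correlation coefficient (PLCC) between metric
    # outputs x and subjective scores y.
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    xm, ym = x - x.mean(), y - y.mean()
    return float(np.sum(xm * ym) / np.sqrt(np.sum(xm ** 2) * np.sum(ym ** 2)))

def spearman(x, y):
    # Spearman rank-order correlation (SROCC): Pearson on the ranks.
    # Double argsort gives 0-based ranks; assumes no ties.
    rank = lambda v: np.argsort(np.argsort(v))
    return pearson(rank(np.asarray(x)), rank(np.asarray(y)))
```

SROCC is insensitive to any monotonic mapping between metric and score, which is why evaluations often report it alongside PLCC (the latter usually after a monotonic logistic fit).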
A gamut compression algorithm (GCA) and a gamut extension algorithm (GEA) were proposed based on the concept of vividness. Their performance was investigated in two psychophysical experiments alongside several other commonly used gamut mapping algorithms (GMAs). In addition, different uniform colour spaces (UCSs) were evaluated in the experiments, including CIELAB, CAM02-UCS, and the newly proposed Jzazbz. The results show that the new GCA and GEA outperformed all the other GMAs and that Jzazbz is a promising UCS for gamut mapping.
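To make the vividness idea concrete: vividness can be taken as the Euclidean distance of a colour from the black point, and a compression algorithm can pull out-of-gamut colours back along that line toward black. The sketch below is a hypothetical illustration only, not the GCA described above: the soft-knee exponential roll-off, the `knee` parameter, and the assumption that the gamut boundary is supplied as a single vividness limit `v_boundary` are all simplifications for clarity.

```python
import numpy as np

def vividness(lab):
    # Vividness as Euclidean distance from the black point (the origin
    # of the Lab-like coordinate system).
    return float(np.linalg.norm(np.asarray(lab, float)))

def compress_vividness(lab, v_boundary, knee=0.9):
    # Hypothetical vividness-based gamut compression: colours below
    # knee * v_boundary pass through unchanged; beyond the knee, the
    # colour is rescaled toward black so its vividness asymptotically
    # approaches v_boundary, preserving its direction (hue and the
    # lightness/chroma balance).
    lab = np.asarray(lab, float)
    v = vividness(lab)
    v0 = knee * v_boundary
    if v <= v0:
        return lab
    span = v_boundary - v0
    v_new = v0 + span * (1.0 - np.exp(-(v - v0) / span))
    return lab * (v_new / v)
```

Scaling the whole vector, rather than chroma alone, is what distinguishes a vividness-based mapping from classic chroma clipping: highly vivid colours lose both lightness and chroma together, which tends to preserve the perceived colourfulness ordering of the image.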