Real-time processing constraints dictate that non-linear color transforms be implemented using multi-dimensional look-up tables (LUTs). Because storage and memory budgets keep the LUT from being prohibitively large, it is built with a sparse sampling of input nodes, and the question of where to place those nodes becomes important. The standard approach places the nodes on a uniform regular lattice spanning the entire input color space. Such uniform placement ignores two crucial factors, namely the curvature of the non-linear color transform and the statistical distribution of the input, and often results in high approximation errors, or equivalently objectionable visual artifacts, in images processed through the LUT. In this paper, we formulate the underlying cost measure that node placement algorithms should seek to minimize. At the heart of this measure is a significance function that quantifies the relative importance of input variables. We argue that techniques which select truly optimal nodes with respect to this cost measure are computationally infeasible. We then propose an efficient algorithm based on selecting nodes that lie at maxima of the significance function, which is iteratively damped to prevent degenerate node selection. Experimental results across a variety of scenarios demonstrate significant improvements in transform accuracy over classical uniform node spacing.
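The abstract does not spell out the algorithm's internals, so the following is only a rough one-dimensional sketch of significance-driven node selection with damping, not the paper's actual method. The function name `build_nonuniform_lut`, the choice of curvature-times-density as the significance function, and the damping parameters are all illustrative assumptions.

```python
import numpy as np

def build_nonuniform_lut(f, x_grid, pdf, n_nodes, damp_radius=4, damp_factor=0.5):
    """Greedy 1-D node placement (hypothetical sketch): repeatedly pick the
    grid point with the highest significance, then damp significance near
    the chosen node so later picks spread out rather than cluster."""
    fx = f(x_grid)
    # Assumed significance: local curvature of the transform weighted
    # by the input probability density.
    curvature = np.abs(np.gradient(np.gradient(fx, x_grid), x_grid))
    sig = curvature * pdf

    nodes = [0, len(x_grid) - 1]            # always keep the domain endpoints
    sig = sig.copy()
    sig[nodes] = 0.0
    while len(nodes) < n_nodes:
        k = int(np.argmax(sig))             # select: highest remaining significance
        nodes.append(k)
        lo, hi = max(0, k - damp_radius), min(len(sig), k + damp_radius + 1)
        sig[lo:hi] *= damp_factor           # damp: discourage clustered picks
        sig[k] = 0.0                        # never pick the same node twice
    nodes = np.sort(np.array(nodes))
    return x_grid[nodes], fx[nodes]         # LUT node positions and values
```

For a gamma-like transform with a uniform input distribution, this sketch concentrates nodes where curvature is highest (near zero) and the LUT is then evaluated by ordinary piecewise-linear interpolation:

```python
x = np.linspace(0.0, 1.0, 1024)
pdf = np.ones_like(x)                       # assume a uniform input distribution
xs, ys = build_nonuniform_lut(lambda t: t ** (1 / 2.2), x, pdf, n_nodes=17)
approx = np.interp(x, xs, ys)               # piecewise-linear LUT evaluation
```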
Vishal Monga and Raja Bala, "Sort-Select-Damp: An efficient strategy for color look-up table lattice design," in Proc. IS&T 16th Color and Imaging Conf., 2008, pp. 247-253. https://doi.org/10.2352/CIC.2008.16.1.art00047