Given a single reference stimulus, test stimuli can be sorted with respect to perceptual similarity to this anchor stimulus. Aggregated ranks can then be computed from multiple sort sequences. This ordinal scaling provides an estimate of perceptible differences and can be used to develop and test predictive models. In this paper we propose the use of graph-based methods for visualizing experimental data and computing aggregated ranks. Specifically, perceptual similarity is expressed as a sort sequence graph in which nodes are stimuli and edge weights are the frequencies of the corresponding ranks. This graph is also oriented in that it has a start, the reference stimulus, and an end, the least similar stimulus. The Schulze method, or 'strongest path' computation, is used for rank aggregation. This analysis is explored in the context of two appearance experiments: the first using solid colors and the second using renderings of 3D printed stimuli varying in multiple appearance attributes. For the second experiment with the renderings of 3D printed stimuli, we then use Kendall τb values to assess a simple model based on mean CIELAB color differences. We find that the underlying sorting task is efficient and intuitive. Furthermore, the graph-based formulation of perceptual similarity allows the application of network analysis and graph theory to the study of visual appearance. New analyses are also possible, such as outlier detection using the sort sequences that are the inverse of the Schulze solution, or approximately the 'wrongest path'.
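The rank aggregation step described above can be sketched in a few lines. The following is a minimal illustration of the Schulze 'strongest path' method, not the authors' implementation: it assumes each sort sequence is a list of stimulus labels ordered from most to least similar to the anchor, builds pairwise preference counts, computes strongest path strengths with a widest-path variant of Floyd-Warshall, and ranks each stimulus by the number of others it beats.

```python
# Hedged sketch of Schulze rank aggregation over sort sequences.
# Assumption: each sequence lists stimuli from most to least similar
# to the anchor; this is an illustration, not the paper's code.

def schulze_ranking(sequences):
    items = sorted({x for seq in sequences for x in seq})
    # d[(a, b)]: number of sequences that rank a before (closer than) b
    d = {(a, b): 0 for a in items for b in items if a != b}
    for seq in sequences:
        for i, a in enumerate(seq):
            for b in seq[i + 1:]:
                d[(a, b)] += 1
    # p[(a, b)]: strength of the strongest path from a to b
    p = {(a, b): d[(a, b)] if d[(a, b)] > d[(b, a)] else 0
         for a in items for b in items if a != b}
    # Widest-path (maximin) variant of Floyd-Warshall
    for i in items:
        for j in items:
            if j == i:
                continue
            for k in items:
                if k in (i, j):
                    continue
                p[(j, k)] = max(p[(j, k)], min(p[(j, i)], p[(i, k)]))
    # Aggregate rank: sort by how many other stimuli each one beats
    wins = {a: sum(p[(a, b)] > p[(b, a)] for b in items if b != a)
            for a in items}
    return sorted(items, key=lambda a: -wins[a])
```

For example, with three observers producing the sequences `['A','B','C']`, `['A','B','C']`, and `['B','A','C']`, the aggregated rank is `['A','B','C']`, since A is ranked before B in two of three sequences and both are always ranked before C.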
N. Moroney, I. Tastl, M. Gottwals, M. Ludwig, G. Meyer, "Single Anchor Sorting of Visual Appearance as an Oriented Graph," in Proc. IS&T 26th Color and Imaging Conf., 2018, pp. 365-370, https://doi.org/10.2352/ISSN.2169-2629.2018.26.365