Human skin contains two primary chromophores: melanin, the pigment in the epidermis that gives skin its color; and hemoglobin, the pigment in the red blood cells of the vascular network within the dermis. The relative concentrations of these chromophores provide a vital indicator
of skin health and appearance. We present a technique to automatically estimate chromophore maps from RGB images of human faces captured with mobile devices such as smartphones. The ultimate goal is to provide a diagnostic aid for individuals to monitor and improve the quality of their facial
skin. A previous method approaches the problem as one of blind source separation and applies Independent Component Analysis (ICA) in camera RGB space to estimate the chromophores. We extend this technique in two important ways. First, we observe that models of light transport in skin call
for source separation to be performed in log spectral reflectance coordinates rather than in RGB. We therefore transform camera RGB to a spectral reflectance space before applying ICA. This step uses a linear camera model and Principal Component Analysis to represent skin spectral
reflectance as a low-dimensional manifold. The camera model requires knowledge of the incident illuminant, which we obtain via a novel technique that uses the human lip as a calibration object. Second, we address an inherent limitation of ICA: the ordering of the separated signals is
random and ambiguous. We incorporate a domain-specific prior model for human chromophore spectra as a constraint in solving the ICA problem. Results on a dataset of mobile camera images show high-quality, unambiguous recovery of the chromophores.
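
The following is a minimal Python sketch, not the authors' implementation, of the pipeline the abstract outlines: map camera RGB onto a low-dimensional skin-reflectance model, move to negative log reflectance, where Beer-Lambert-style light transport makes chromophore densities mix approximately linearly, and then apply ICA. The inputs rgb_to_pca, pca_basis, and pca_mean are hypothetical precomputed stand-ins for the linear camera model (which depends on the estimated illuminant) and the PCA model of skin reflectance; plain FastICA also retains the ordering and sign ambiguity that the paper resolves with a prior on chromophore spectra.

import numpy as np
from sklearn.decomposition import FastICA

def estimate_chromophore_maps(rgb, rgb_to_pca, pca_basis, pca_mean):
    """Separate two chromophore-like maps from an H x W x 3 skin image.

    rgb        : H x W x 3 array of linear camera RGB values (skin pixels).
    rgb_to_pca : 3 x K matrix mapping camera RGB to K reflectance PCA
                 coefficients (illuminant- and camera-dependent; assumed
                 precomputed from a linear camera model).
    pca_basis  : K x N matrix of PCA basis spectra (N wavelength samples).
    pca_mean   : length-N mean skin reflectance spectrum.
    """
    h, w, _ = rgb.shape
    pixels = rgb.reshape(-1, 3)

    # Linear camera model: camera RGB -> low-dimensional reflectance coefficients.
    coeffs = pixels @ rgb_to_pca                      # (H*W, K)

    # Reconstruct spectral reflectance on the low-dimensional skin manifold.
    reflectance = coeffs @ pca_basis + pca_mean       # (H*W, N)
    reflectance = np.clip(reflectance, 1e-4, None)    # keep the log well defined

    # Beer-Lambert-style light transport: chromophore densities mix roughly
    # linearly in negative log reflectance, so separate sources there.
    log_absorbance = -np.log(reflectance)

    # Two independent components (melanin- and hemoglobin-like sources).
    # Plain FastICA leaves their order and sign ambiguous; the paper resolves
    # this with a prior on human chromophore spectra.
    ica = FastICA(n_components=2, random_state=0)
    sources = ica.fit_transform(log_absorbance)       # (H*W, 2)
    return sources[:, 0].reshape(h, w), sources[:, 1].reshape(h, w)

Under these assumptions, the two returned maps correspond, up to the ordering and sign ambiguity noted above, to melanin-like and hemoglobin-like density images.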