Estimating skin color from an uncontrolled facial image is a challenging task. Many factors, such as illumination, camera, and shading variations, directly affect the appearance of skin color in the image. Furthermore, using a color calibration target to correct the image pixels leads to a complex user experience. We propose a skin color estimation method for images in the wild, taken with an unknown camera, under unknown lighting, and without a calibration target. While prior methods relied on explicit intermediate steps of color correction of image pixels and skin region segmentation, we propose an end-to-end color regression model named LabNet, in which color correction and skin region segmentation are learnt implicitly. Our method is based on a convolutional neural network trained on a dataset of smartphone images labeled with L*a*b* measures of skin color. We compared our method with standard skin color estimation approaches and found that it outperforms these models while removing the need for a color calibration target.
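
To make the end-to-end regression formulation concrete, the sketch below shows a minimal CNN in PyTorch that maps a face crop directly to the three L*a*b* components. The architecture, layer sizes, and training loop are illustrative assumptions; the abstract does not specify LabNet's actual design.

```python
# A minimal sketch of an end-to-end L*a*b* regression CNN.
# Layer sizes, names, and the training step are illustrative
# assumptions, not the actual LabNet architecture.
import torch
import torch.nn as nn

class LabNetSketch(nn.Module):
    def __init__(self):
        super().__init__()
        # Convolutional backbone: can implicitly learn which pixels
        # matter (skin region) and how to compensate for
        # illumination/camera variation, with no explicit steps.
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # Regression head: predicts the three L*a*b* components directly.
        self.head = nn.Linear(128, 3)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

model = LabNetSketch()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.MSELoss()  # regression loss against measured L*a*b* labels

# One training step on a dummy batch standing in for smartphone face
# crops paired with instrument-measured L*a*b* skin color values.
images = torch.randn(8, 3, 128, 128)  # placeholder face crops
labels = torch.randn(8, 3)            # placeholder L*a*b* measurements
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
```

Because the loss is computed only against the measured L*a*b* values, any color correction or skin-region weighting the network needs is learnt inside the backbone rather than performed as a separate preprocessing stage.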