Lightness Illusions (Contrast, Assimilation, and Natural Scenes with Edges and Gradients) show that Lightness appearances do not correlate with the light sent from the scene to the eye. Illusions modify “the-rest-of-the-scene” to make two identical-luminance Gray segments appear different from each other. Scene segments have two properties in human vision: apparent Lightness and apparent Uniformity. Models of vision have two scene-dependent processes that spatially transform scene luminances. The first is optical veiling glare, which reduces the sharpness of edges and replaces uniform scene segments with low-slope gradients. The second is neural spatial processing. This transformation has many tasks to perform in generating appearances: making edges appear sharp; making gradients in scene segments appear uniform; and compensating for glare’s many local redistributions of light. In short, neural spatial processing does an excellent job of ignoring glare’s distortions of scene luminance. In fact, it overcompensates for glare in a way that generates the appearances reported in Contrast Illusions, B&W Mondrians, and Checkershadow Illusions.
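The two scene-dependent transformations can be illustrated with a minimal one-dimensional sketch. The code below is an illustration only, not the measured glare-spread function or the author’s model: the kernel shape, its parameters, and the threshold-and-reintegrate step are assumed for demonstration. It shows how convolution with a long-tailed glare kernel blurs edges and turns uniform scene segments into low-slope retinal gradients, and how a crude edge-based spatial reconstruction restores sharp edges and uniform segments (two of the three tasks listed above; it does not capture the full compensation, or overcompensation, that generates Contrast Illusions).

```python
import numpy as np

def glare_kernel(size=201, tail=0.02):
    """Assumed glare-spread function: a narrow optical core plus a long,
    shallow scatter tail (illustrative values, not CIE data)."""
    x = np.arange(size) - size // 2
    core = np.exp(-0.5 * (x / 1.5) ** 2)       # sharp optical core
    scatter = tail / (1.0 + (x / 10.0) ** 2)   # long-tailed intraocular scatter
    k = core + scatter
    return k / k.sum()                         # normalize: glare redistributes light, it does not add light

def add_veiling_glare(scene):
    """Retinal luminance = scene luminance convolved with the glare-spread kernel."""
    return np.convolve(scene, glare_kernel(), mode="same")

def edge_ratio_reconstruction(retinal, edge_threshold=0.05):
    """Toy stand-in for neural spatial processing: keep only large log-luminance
    steps (edges), discard the shallow glare gradients within segments, then
    reintegrate.  Segments come out uniform and edges stay sharp."""
    log_l = np.log(retinal)
    steps = np.diff(log_l)
    steps[np.abs(steps) < edge_threshold] = 0.0   # drop glare's low-slope gradients
    return np.exp(np.concatenate(([log_l[0]], log_l[0] + np.cumsum(steps))))

# Two identical-luminance Gray segments: one on a White surround, one on a Black surround.
scene = np.concatenate([
    np.full(100, 90.0),  # white surround
    np.full(50, 50.0),   # gray segment on white
    np.full(100, 90.0),  # white surround
    np.full(100, 10.0),  # black surround
    np.full(50, 50.0),   # gray segment on black
    np.full(100, 10.0),  # black surround
])
retinal = add_veiling_glare(scene)
recovered = edge_ratio_reconstruction(retinal)

# The grays send identical scene luminances, but glare makes their retinal
# luminances differ; within a segment, glare leaves a shallow gradient that
# the edge-based reconstruction flattens back to uniform.
print("scene grays (identical):       ", scene[125], scene[375])
print("retinal grays (shifted by glare):",
      round(retinal[125], 1), round(retinal[375], 1))
print("retinal gray-1 spread (gradient):", round(float(np.ptp(retinal[110:140])), 2))
print("recovered gray-1 spread (uniform):", round(float(np.ptp(recovered[110:140])), 2))
```

Running the sketch shows the direction of both effects: the retinal luminances of the two identical Gray segments are pulled apart by scattered light from their surrounds, and the glare-induced gradient inside each segment disappears after the edge-based reconstruction, while the edges remain sharp.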