Material appearance design usually requires an unintuitive selection of parameters for an analytical BRDF (bidirectional reflectance distribution function) formula or the time-consuming acquisition of a BRDF table from a physical material sample. We propose a material design system that takes visual input in the form of images of an object with known geometry and lighting and produces a plausible BRDF table within a class of BRDFs. We use the principal components of a large dataset of BRDFs to reconstruct a full BRDF table from a sparse set of BRDF values at pairs of incoming and outgoing light directions. To obtain visual input, we let the user provide their own object, from which we generate guide images under selected lighting directions. Once the user shades the guide images, we construct the reflectance table and allow the user to iteratively refine the material appearance. We present preliminary results for image-based design and discuss the open issues that remain before the approach is practical for mainstream use.
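The core reconstruction step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a synthetic stand-in for a measured-BRDF dataset (each row a flattened BRDF table), computes a PCA basis via the SVD, and fits the principal-component coefficients of a new material from a sparse subset of its table entries by least squares. All names, sizes, and the synthetic data are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a dataset of measured BRDFs: each row is one
# material's BRDF table, flattened over direction pairs (hypothetical sizes).
n_materials, n_dirs, n_factors = 100, 500, 5
latent = rng.normal(size=(n_materials, n_factors))
basis_true = rng.normal(size=(n_factors, n_dirs))
dataset = latent @ basis_true + 0.01 * rng.normal(size=(n_materials, n_dirs))

# PCA of the dataset: mean table plus top-k principal components.
mean = dataset.mean(axis=0)
_, _, Vt = np.linalg.svd(dataset - mean, full_matrices=False)
k = 5
components = Vt[:k]                       # (k, n_dirs) orthonormal rows

# A "new" material, observed only at a sparse set of direction pairs
# (in the system, these values would come from user-shaded guide images).
target = rng.normal(size=n_factors) @ basis_true
idx = rng.choice(n_dirs, size=50, replace=False)

# Least-squares fit of the PCA coefficients to the sparse observations,
# then reconstruction of the full BRDF table.
coeffs, *_ = np.linalg.lstsq(components[:, idx].T,
                             target[idx] - mean[idx], rcond=None)
recon = mean + coeffs @ components

err = np.linalg.norm(recon - target) / np.linalg.norm(target)
```

With enough sparse samples relative to the number of retained components, the relative reconstruction error `err` stays small, which is what makes a handful of user-shaded pixels sufficient to pin down a full table.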