In this paper we present a method of texture synthesis that removes the need for users to set, or even understand, the parameters that affect the synthesized output. We accomplish this by first classifying each input texture sample into one of three texture types: regular, irregular, and stochastic. We found that textures within a class were synthesized well with similar parameters, so knowing the input texture class lets us provide a good starting set of parameters for the synthesis algorithm. Instead of requiring the user to select parameters manually, we simply ask whether the synthesized texture is satisfactory. If it is not, we adjust the parameters and try again until the user is happy with the output. In this implementation we use image quilting, a patch-based texture synthesis algorithm, together with texture classification. With small adjustments our method can be applied to other texture synthesis methods.
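The classify-then-refine loop described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names (`classify`, `quilt`, `is_satisfactory`, `adjust`) and the per-class preset values are hypothetical placeholders for the paper's actual classifier, quilting routine, and parameter schedule.

```python
# Hypothetical per-class starting parameters; the paper's actual values differ.
PRESETS = {
    "regular":    {"patch_size": 64, "overlap": 16, "tolerance": 0.05},
    "irregular":  {"patch_size": 32, "overlap": 8,  "tolerance": 0.10},
    "stochastic": {"patch_size": 16, "overlap": 4,  "tolerance": 0.20},
}

def synthesize_with_feedback(sample, classify, quilt, is_satisfactory,
                             adjust, max_rounds=5):
    """Pick starting parameters from the texture class, then synthesize and
    refine until the user accepts the output or the round budget runs out."""
    params = dict(PRESETS[classify(sample)])   # class -> starting parameters
    for _ in range(max_rounds):
        output = quilt(sample, **params)       # run the synthesis algorithm
        if is_satisfactory(output):            # yes/no feedback from the user
            return output, params
        params = adjust(params)                # e.g. shrink patch size
    return output, params
```

The user-facing interaction is reduced to the single yes/no answer supplied by `is_satisfactory`; everything parameter-related stays inside the loop.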
Kyle Ziga, Judy Bagchi, Jan P. Allebach, Fengqing Zhu, "Non-Parametric Texture Synthesis Using Texture Classification," in Proc. IS&T Int'l. Symp. on Electronic Imaging: Computational Imaging XV, 2017, pp. 136-141. https://doi.org/10.2352/ISSN.2470-1173.2017.17.COIMG-436