We propose a novel tool for re-rendering objects in indoor scene images with new textures. It addresses the extensive manual positioning and alignment otherwise required when applying a new texture to an object surface in an indoor scene image. The tool's algorithm is based on establishing a 2D projective transformation between texture images and planar object surfaces in scene images. To find this transformation, we use a rectangular texture pattern sampled from a large synthesized planar texture and a planar quadrangle derived from an estimate of the object surface orientation, generated by a geometric orientation hypothesis framework. The tool also adjusts the texture scaling and reduces artifacts in the re-rendered textures. We present re-rendering results for ceilings, walls, floors, etc. that naturally correspond to the room's geometric layout.
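The core mapping described above is a planar homography between the sampled texture rectangle and the estimated surface quadrangle. As a minimal sketch (not the authors' implementation), the hypothetical helpers below solve the standard eight-unknown linear system for a four-point correspondence and apply the resulting transform to a point; the corner coordinates are made-up illustrative values.

```python
import numpy as np

def estimate_homography(src_pts, dst_pts):
    """Estimate the 3x3 projective transform (homography) mapping four
    src_pts onto four dst_pts, with H[2,2] fixed to 1. Hypothetical
    helper illustrating the texture-to-surface mapping."""
    A, b = [], []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        # Two linear equations per point correspondence.
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def warp_point(H, pt):
    """Apply homography H to a 2D point (with homogeneous divide)."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)

# Sampled texture rectangle (unit square) mapped onto a scene quadrangle
# whose corners would come from the surface-orientation estimate
# (coordinates here are made up for illustration).
texture_corners = [(0, 0), (1, 0), (1, 1), (0, 1)]
surface_quad = [(120, 80), (400, 60), (430, 300), (100, 320)]
H = estimate_homography(texture_corners, surface_quad)
```

In practice every texture pixel would be warped this way (or, equivalently, the inverse transform would be used to sample the texture for each surface pixel), which is what removes the manual alignment step.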
Tongyang Liu, Chun-Jung Tai, Fengqing Zhu, Judy Bagchi, Jan P. Allebach, "Texture re-rendering tool for re-mixing indoor scene images," in Proc. IS&T Int'l. Symp. on Electronic Imaging: Imaging and Multimedia Analytics in a Web and Mobile World, 2017, pp. 86-92, https://doi.org/10.2352/ISSN.2470-1173.2017.10.IMAWM-177