Analysis and Controlled Synthesis of Inhomogeneous Textures

EUROGRAPHICS 2017

Yang Zhou1,2     Huajie Shi2      Dani Lischinski3      Minglun Gong4      Johannes Kopf5      Hui Huang1,2*

1Shenzhen University        2SIAT         3The Hebrew University of Jerusalem       4Memorial University of Newfoundland       5Facebook


Figure 1: Given an inhomogeneous and anisotropic texture exemplar (a), we automatically extract the corresponding source guidance map (b), comprising a scalar progression channel (rendered in pseudo-color) and a direction field (overlaid in red). A target guidance map (d) may then be used to synthesize a texture for a specific 3D object (c). The final texture-mapped result is shown in (e).


Abstract 

Many interesting real-world textures are inhomogeneous and/or anisotropic. An inhomogeneous texture is one where various visual properties exhibit significant changes across the texture's spatial domain. Examples include perceptible changes in surface color, lighting, local texture pattern and/or its apparent scale, and weathering effects, which may vary abruptly or in a continuous fashion. An anisotropic texture is one where the local patterns exhibit a preferred orientation, which may also vary across the spatial domain. While many example-based texture synthesis methods can be highly effective when synthesizing uniform (stationary) isotropic textures, synthesizing highly non-uniform textures, or ones with spatially varying orientation, is a considerably more challenging task, which so far has remained underexplored. In this paper, we propose a new method for automatic analysis and controlled synthesis of such textures. Given an input texture exemplar, our method generates a source guidance map comprising: (i) a scalar progression channel that attempts to capture the low-frequency spatial changes in color, lighting, and local pattern combined, and (ii) a direction field that captures the local dominant orientation of the texture. Having augmented the texture exemplar with this guidance map, users can exercise better control over the synthesized result by providing easily specified target guidance maps, which are used to constrain the synthesis process.

Categories and Subject Descriptors (according to ACM CCS): I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism—color, shading, shadowing, and texture; I.4.7 [Image Processing and Computer Vision]: Feature Measurement—texture
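The guidance-map extraction above is the paper's own analysis pipeline. As a rough, hedged illustration of the two channels it produces (not the paper's actual algorithm), the following Python/NumPy sketch estimates a direction field from the local structure tensor and a progression proxy from heavily smoothed luminance. The function name and the smoothing sigmas are illustrative assumptions.

import numpy as np
from scipy.ndimage import gaussian_filter, sobel

def analyze_exemplar(gray, tensor_sigma=8.0, progression_sigma=25.0):
    # gray: 2D float array in [0, 1] (grayscale exemplar).
    # Image gradients.
    gx = sobel(gray, axis=1)
    gy = sobel(gray, axis=0)

    # Smoothed structure tensor components.
    Jxx = gaussian_filter(gx * gx, tensor_sigma)
    Jxy = gaussian_filter(gx * gy, tensor_sigma)
    Jyy = gaussian_filter(gy * gy, tensor_sigma)

    # Dominant gradient orientation; the texture direction lies
    # perpendicular to it, hence the +pi/2.
    theta = 0.5 * np.arctan2(2.0 * Jxy, Jxx - Jyy) + np.pi / 2.0
    direction = np.stack([np.cos(theta), np.sin(theta)], axis=-1)

    # Crude progression proxy: low-frequency brightness variation,
    # rescaled to [0, 1].
    prog = gaussian_filter(gray, progression_sigma)
    prog = (prog - prog.min()) / (prog.max() - prog.min() + 1e-8)
    return prog, direction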

Figure 2: Simultaneous control of progression and orientation: the top row shows two input exemplars with their corresponding extracted source guidance maps (the scalar channel is overlaid with the detected directions). The bottom part shows four different target guidance maps (c) that control progression and direction independently. The corresponding synthesized results are shown in (b) and (d). For self-validation, we repeat our texture analysis on these results; the recovered guidance channels are shown in (a) & (e). It may be seen that these recovered channels are indeed qualitatively close to the target guidance. Note that exact recovery should not be expected here, since the target guidance only serves as a soft constraint in our synthesis process.
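As the caption notes, the target guidance acts only as a soft constraint. A common way to realize a soft constraint in patch-based synthesis, shown here as a simplified stand-in for the paper's optimization with an illustrative weight, is to add a weighted guidance term to the appearance distance:

import numpy as np

def patch_cost(src_rgb, tgt_rgb, src_guide, tgt_guide, lambda_g=2.0):
    # Appearance term: squared color difference between source and target patches.
    appearance = np.sum((src_rgb - tgt_rgb) ** 2)
    # Guidance term: squared difference of the progression/direction channels.
    guidance = np.sum((src_guide - tgt_guide) ** 2)
    # A finite lambda_g keeps the guidance a preference rather than a hard rule:
    # the synthesizer can still deviate where exemplar fidelity demands it.
    return appearance + lambda_g * guidance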


Figure 3: Controlled texturing of 3D objects with automatically generated target guidance maps. The target scalar progression maps are automatically computed from geometric features; the specific features used are: height in (a), (b) & (h); shape diameter function in (c) & (d); normal orientation in (e); and curvature in (f). Automatically generated vector fields are also used in (g) & (h) to control both the non-uniform appearance and the local orientation.
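For reference, turning a geometric feature into a target progression channel can be as simple as normalizing a per-vertex quantity to [0, 1]. The sketch below does this for vertex height; the shape-diameter-function, normal-orientation and curvature variants in the figure would substitute a different per-vertex feature. This is an illustration under stated assumptions, not the exact procedure used for the figure.

import numpy as np

def height_progression(vertices):
    # vertices: (N, 3) array of mesh vertex positions; assume +Z is the up axis.
    height = vertices[:, 2]
    lo, hi = height.min(), height.max()
    # Normalize to [0, 1] to obtain a per-vertex scalar progression value.
    return (height - lo) / (hi - lo + 1e-8)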



Reference

[To reference our ALGORITHM, API, CODE or DATA in any publication, please include the BibTeX below and a link to this webpage.]

We have released our code on GitHub.

The code package itself is stable, but beginners in texture synthesis may need additional instructions. We will keep updating the README files to make our tool more user-friendly.



Acknowledgments

We thank the anonymous reviewers for their constructive comments, and are grateful to Su Xue and Xin Tong for providing their code for testing. This work was supported in part by NSFC (61602461, 61522213, 61402459), 973 Program (2015CB352501), Guangdong Science and Technology Program (2014TX01X033, 2015A030312015, 2016A050503036), Shenzhen Innovation Program (JCYJ20151015151249564, JCYJ20150630114942295), NSERC (293127) and the Israel Science Foundation.


BibTex

@article {Texsyn17,
    title = {Analysis and Controlled Synthesis of Inhomogeneous Textures},
    author = {Yang Zhou and Huajie Shi and Dani Lischinski and Minglun Gong and Johannes Kopf and Hui Huang},
    journal = {Computer Graphics Forum (Proc. of Eurographics 2017)},
    volume = {36},
    number = {2},
    year = {2017},
}
