Statistical Invariance for Texture Synthesis

Estimating illumination and deformation fields on textures is essential for both analysis and application purposes. Traditional methods for such estimation usually require complicated and sometimes labor-intensive processing. In this paper, we propose a new perspective on this problem and suggest a novel statistical approach that is much simpler and more efficient. Our experiments show that many textures in daily life are statistically invariant in terms of colors and gradients, and that variations of these statistics can be attributed to illumination and deformation. This implies that we can inversely estimate the spatially varying illumination and deformation from the variation of the texture statistics. It enables us to decompose a texture photo into an illumination field, a deformation field, and an implicit texture that is illumination- and deformation-free, within a short period of time and with minimal user input. By processing and recombining these components, a variety of synthesis effects, such as exemplar preparation, texture replacement, surface relighting, and geometry modification, can be achieved. Convincing results demonstrate the effectiveness of the proposed method.
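The abstract describes inferring a spatially varying illumination (and deformation) field from deviations of local texture statistics away from a globally invariant baseline. As a loose illustration of the illumination part only, and not the authors' actual algorithm, the sketch below estimates a per-patch illumination gain by comparing local luminance means against the global mean, assuming illumination acts multiplicatively on an otherwise statistically invariant texture; the function and parameter names are hypothetical.

```python
import numpy as np

def estimate_illumination_field(texture, patch=32, eps=1e-6):
    # texture: H x W x 3 float array in [0, 1]
    lum = texture.mean(axis=2)          # per-pixel luminance proxy
    global_mean = lum.mean()            # baseline under statistical invariance
    field = np.ones_like(lum)
    h, w = lum.shape
    for y in range(0, h, patch):
        for x in range(0, w, patch):
            block = lum[y:y + patch, x:x + patch]
            # local mean / global mean ~ local multiplicative illumination gain
            field[y:y + patch, x:x + patch] = block.mean() / (global_mean + eps)
    return field

# A rough "implicit" (illumination-free) texture would then be
# texture / field[..., None], up to a global scale factor; the deformation
# field would require analogous reasoning on gradient statistics.
```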


Bibliographic Details
Main Authors: Liu, Xiaopei; Jiang, Lei; Wong, Tien-Tsin; Fu, Chi-Wing
Other Authors: School of Computer Engineering; Game Lab
Format: Article
Language: English
Published: 2013
Published in: IEEE Transactions on Visualization and Computer Graphics, 18(11), 1836-1848, 2012
ISSN: 1077-2626
DOI: 10.1109/TVCG.2012.75
Rights: © 2012 IEEE
Subjects: DRNTU::Engineering::Computer science and engineering
Online Access: https://hdl.handle.net/10356/100819
http://hdl.handle.net/10220/16492
Institution: Nanyang Technological University