Spatially-invariant style-codes controlled makeup transfer
Transferring makeup from a misaligned reference image is challenging. Previous methods overcome this barrier by computing pixel-wise correspondences between the two images, which is inaccurate and computationally expensive. In this paper, we take a different perspective to break down the makeup transfer...
Main Authors: DENG, Han; HAN, Chu; CAI, Hongmin; HAN, Guoqiang; HE, Shengfeng
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2021
Subjects: Databases and Information Systems
Online Access: https://ink.library.smu.edu.sg/sis_research/8526
https://ink.library.smu.edu.sg/context/sis_research/article/9529/viewcontent/Spatially_Invariant_Style_Codes_Controlled_Makeup_Transfer.pdf
Institution: Singapore Management University
id: sg-smu-ink.sis_research-9529
record_format: dspace
date: 2021-06-01T07:00:00Z
format: text; application/pdf
url: https://ink.library.smu.edu.sg/sis_research/8526
doi: info:doi/10.1109/CVPR46437.2021.00648
fulltext: https://ink.library.smu.edu.sg/context/sis_research/article/9529/viewcontent/Spatially_Invariant_Style_Codes_Controlled_Makeup_Transfer.pdf
license: http://creativecommons.org/licenses/by-nc-nd/4.0/
collection: Research Collection School Of Computing and Information Systems
language: eng
publisher: Institutional Knowledge at Singapore Management University
keywords: Break down; Component wise; Features extraction; Reference image; Spatial informations; Spatial misalignments; Spatially invariants; Three-component; Transfer problems; Two-step extraction; Databases and Information Systems
institution: Singapore Management University
building: SMU Libraries
continent: Asia
country: Singapore
content_provider: SMU Libraries
collection: InK@SMU
language: English
topic: Break down; Component wise; Features extraction; Reference image; Spatial informations; Spatial misalignments; Spatially invariants; Three-component; Transfer problems; Two-step extraction; Databases and Information Systems
description: Transferring makeup from a misaligned reference image is challenging. Previous methods overcome this barrier by computing pixel-wise correspondences between the two images, which is inaccurate and computationally expensive. In this paper, we take a different perspective to break down the makeup transfer problem into a two-step extraction-assignment process. To this end, we propose a Style-based Controllable GAN model consisting of three components, corresponding to target style-code encoding, face identity feature extraction, and makeup fusion, respectively. In particular, a Part-specific Style Encoder encodes the component-wise makeup style of the reference image into a style-code in an intermediate latent space W. The style-code discards spatial information and is therefore invariant to spatial misalignment. At the same time, the style-code embeds component-wise information, enabling flexible partial makeup editing from multiple references. This style-code, together with source identity features, is fed into a Makeup Fusion Decoder equipped with multiple AdaIN layers to generate the final result. Our method demonstrates great flexibility in makeup transfer, supporting makeup removal, shade-controllable makeup transfer, and part-specific makeup transfer, even under large spatial misalignment. Extensive experiments demonstrate the superiority of our approach over state-of-the-art methods.
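The abstract's Makeup Fusion Decoder injects the style-code through AdaIN (Adaptive Instance Normalization) layers. The sketch below shows the standard AdaIN operation itself, not the paper's implementation: it re-normalizes each channel of a content feature map to the per-channel statistics of a style feature map. The `alpha` blending parameter is an added assumption, included only to illustrate one simple way interpolating the target statistics could yield shade-controllable transfer.

```python
import numpy as np

def adain(content, style, alpha=1.0, eps=1e-5):
    """Adaptive Instance Normalization over (C, H, W) feature maps.

    Normalizes each channel of `content` by its own mean/std, then
    re-scales and re-shifts it with the per-channel statistics of
    `style`. `alpha` blends between the content's own statistics (0.0)
    and the style's statistics (1.0).
    """
    c_mean = content.mean(axis=(1, 2), keepdims=True)
    c_std = content.std(axis=(1, 2), keepdims=True) + eps
    s_mean = style.mean(axis=(1, 2), keepdims=True)
    s_std = style.std(axis=(1, 2), keepdims=True) + eps
    # Interpolate the target statistics for partial ("shaded") transfer.
    t_mean = alpha * s_mean + (1 - alpha) * c_mean
    t_std = alpha * s_std + (1 - alpha) * c_std
    return t_std * (content - c_mean) / c_std + t_mean
```

Because AdaIN operates only on channel-wise statistics, the transferred style carries no spatial layout, which is the property the abstract exploits to cope with misaligned references.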
format: text
author: DENG, Han; HAN, Chu; CAI, Hongmin; HAN, Guoqiang; HE, Shengfeng
author_sort: DENG, Han
title: Spatially-invariant style-codes controlled makeup transfer
publisher: Institutional Knowledge at Singapore Management University
publishDate: 2021
url: https://ink.library.smu.edu.sg/sis_research/8526
https://ink.library.smu.edu.sg/context/sis_research/article/9529/viewcontent/Spatially_Invariant_Style_Codes_Controlled_Makeup_Transfer.pdf