Generating human faces by generative adversarial network

Bibliographic Details
Main Author: Tao, Weijing
Other Authors: Chen Change Loy
Format: Final Year Project
Language: English
Published: Nanyang Technological University 2021
Subjects:
Online Access: https://hdl.handle.net/10356/153248
Institution: Nanyang Technological University
Description
Summary: Style transfer is the process of merging the content of one image with the style of another to create a stylized image. In this work, I first study popular style transfer techniques such as Neural Style Transfer and AdaIN. However, current style transfer techniques do not allow fine-grained control over the features of the stylized image. Next, I study the state-of-the-art StyleGAN and the network blending algorithm in detail and accomplish style transfer using transfer learning. I provide a total of seven styles for style transfer, available in different image sizes. In particular, I propose an improved version of Justin Pinkney's Toonification model, in which realistic human textures can be generated alongside toonified structural features. In addition, I implement style mixing on the Toonification model, which allows control over the high-level fine features of the generated toonified images. The refined model can be extended to perform real-time arbitrary style transfer, where users can easily alter specific features (such as hair colour and glasses) of their toonified images regardless of input image size. Finally, I conclude with a discussion of future directions for improvement.
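The AdaIN operation mentioned in the summary has a simple closed form: content features are re-normalized, per channel, to match the mean and standard deviation of the style features. The sketch below is a simplified NumPy illustration of that formula, not the implementation studied in this project; the array shapes and `eps` value are assumptions for the example.

```python
import numpy as np

def adain(content, style, eps=1e-5):
    """Adaptive Instance Normalization: shift and scale the content
    features so their per-channel mean/std match those of the style
    features. Inputs are feature maps of shape (C, H, W)."""
    c_mean = content.mean(axis=(1, 2), keepdims=True)
    c_std = content.std(axis=(1, 2), keepdims=True)
    s_mean = style.mean(axis=(1, 2), keepdims=True)
    s_std = style.std(axis=(1, 2), keepdims=True)
    # Normalize content to zero mean / unit std, then apply style stats.
    return s_std * (content - c_mean) / (c_std + eps) + s_mean

rng = np.random.default_rng(0)
content = rng.normal(0.0, 1.0, size=(3, 8, 8))   # "content" feature map
style = rng.normal(2.0, 0.5, size=(3, 8, 8))     # "style" feature map
out = adain(content, style)
# The output's per-channel statistics now track the style features,
# while its spatial structure still comes from the content features.
```

Because the operation only matches first- and second-order statistics, it is cheap enough to run per layer, which is what makes AdaIN-based arbitrary style transfer fast at inference time.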