Style transfer between different illumination, weather and seasonal conditions

Bibliographic Details
Main Author: Zhu, Fangzheng
Other Authors: Wang Dan Wei
Format: Thesis-Master by Coursework
Language: English
Published: Nanyang Technological University 2022
Online Access:https://hdl.handle.net/10356/155514
Institution: Nanyang Technological University
Description
Summary: Autonomous mobile robots are a key direction of robotics research, and visual localization is central to robot autonomy. The appearance bias caused by different illumination, weather, and seasonal conditions can undermine a robot's perception and lead to imprecise localization, and style transfer is an effective way to compensate for it. A recent class of style transfer models enables realistic translation of images between visual domains with comparatively little training data and without paired data. In this work, I study style transfer methods based on Generative Adversarial Networks (GANs) and apply them to image retrieval and visual localization. I implement the ToDayGAN model, which transfers the style of images between different illumination, weather, and seasonal conditions. After reviewing how changing conditions affect state-of-the-art visual localization methods, I apply the style transfer model within a hierarchical localization pipeline: NetVLAD extracts global image-wide descriptors for retrieval, SuperPoint extracts dense local descriptors for matching, and the SolvePnPRansac pose estimation algorithm recovers a more accurate 6-DoF pose. Evaluated with several standard metrics, this approach improves localization performance over current visual localization methods, showing that applying style transfer to visual localization is highly effective under contrasting visual conditions.
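
The final pose-recovery step mentioned in the abstract can be illustrated with OpenCV's cv2.solvePnPRansac. The sketch below is not the thesis's implementation: the 2D-3D correspondences, image size, and camera intrinsics are placeholder assumptions standing in for the matches that SuperPoint descriptors would provide against a 3D scene model.

    # Illustrative sketch of the 6-DoF pose-recovery step, using OpenCV's
    # solvePnPRansac. All data here are placeholders, not the thesis's pipeline.
    import numpy as np
    import cv2

    # Hypothetical 2D-3D correspondences, standing in for matches between
    # SuperPoint keypoints in the query image and points of a 3D scene model.
    object_points = np.random.rand(100, 3).astype(np.float32)               # 3D points (world frame)
    image_points = (np.random.rand(100, 2) * [640, 480]).astype(np.float32) # 2D keypoints (pixels)

    # Assumed pinhole camera intrinsics (focal length and principal point).
    K = np.array([[700.0,   0.0, 320.0],
                  [  0.0, 700.0, 240.0],
                  [  0.0,   0.0,   1.0]])
    dist_coeffs = np.zeros(4)  # assume no lens distortion

    # RANSAC-based PnP: estimates rotation (rvec) and translation (tvec)
    # while rejecting outlier correspondences.
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        object_points, image_points, K, dist_coeffs,
        reprojectionError=8.0, iterationsCount=1000)

    if ok:
        R, _ = cv2.Rodrigues(rvec)  # rotation vector -> 3x3 rotation matrix
        print("Estimated rotation:\n", R)
        print("Estimated translation:", tvec.ravel())
        print("Inlier matches:", 0 if inliers is None else len(inliers))

With real correspondences, rvec and tvec together give the camera's 6-DoF pose relative to the scene model; the RANSAC loop makes the estimate robust to mismatched descriptors.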