Deep generative model for remote sensing
Remote sensing is the practice of identifying and monitoring an area's physical features from a distance, typically from satellite or aircraft, by detecting its reflected and transmitted radiation. Researchers can "sense" characteristics of the Earth by using specialised cameras to acquire remotely sensed imagery. While satellite imagery typically brings Electro-Optical (EO) Red-Green-Blue (RGB) images to mind, remote sensing also relies on many other important computational imaging systems, such as synthetic aperture radar (SAR) imaging, multispectral image fusion, and infra-red imaging. These non-EO-RGB systems each have unique advantages and properties; the most common among them is SAR imaging, which is the focus of this project. SAR imaging is particularly useful because it can capture images of the Earth's surface at any time of day and in any weather condition. This contrasts with EO imagery, whose quality depends on solar illumination and on weather conditions such as cloud cover. However, unlike EO images, which are widely available thanks to large commercial projects (e.g., Google Maps), SAR image data is scarcer and more expensive to obtain.

In this project, image-to-image translation is performed on EO images to transfer them to the SAR domain, providing more data for machine learning models trained on SAR datasets. To transfer large-scale optical RGB images to the desired imaging modality, i.e., SAR, a series of GAN-based image-to-image translation techniques were tested, including Pix2Pix and CycleGAN. Testing showed that GAN-based image-to-image translation of satellite imagery is feasible, but requires further refinement to adequately capture all the features of the source domain.
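For readers unfamiliar with the two techniques named above: Pix2Pix requires pixel-aligned EO/SAR training pairs and combines an adversarial loss with an L1 reconstruction loss, whereas CycleGAN trains on unpaired images by enforcing cycle consistency between two generators. Below is a minimal sketch of the cycle-consistency term, assuming PyTorch; the toy generators and tensor shapes are hypothetical stand-ins, not the architectures or data used in the project.

```python
# Sketch of the CycleGAN cycle-consistency objective for unpaired
# EO -> SAR translation. The tiny generators are placeholders
# (hypothetical), not the networks used in the project.
import torch
import torch.nn as nn


def make_generator() -> nn.Sequential:
    """Toy fully-convolutional generator: 3-channel image in, 3-channel out."""
    return nn.Sequential(
        nn.Conv2d(3, 16, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
        nn.Conv2d(16, 3, kernel_size=3, padding=1),
        nn.Tanh(),
    )


G_eo2sar = make_generator()  # G: EO -> SAR
G_sar2eo = make_generator()  # F: SAR -> EO
l1 = nn.L1Loss()

# Unpaired batches (random stand-ins for real EO and SAR tiles).
eo = torch.rand(4, 3, 64, 64)
sar = torch.rand(4, 3, 64, 64)

# Cycle consistency: translating to the other domain and back should
# reproduce the input. This constraint is what lets CycleGAN train
# without pixel-aligned EO/SAR pairs.
cycle_loss = l1(G_sar2eo(G_eo2sar(eo)), eo) + l1(G_eo2sar(G_sar2eo(sar)), sar)

# In full CycleGAN training this term is weighted (lambda = 10 in the
# original paper) and added to the adversarial losses from the two
# domain discriminators, which are omitted here for brevity.
print(f"cycle-consistency loss: {cycle_loss.item():.4f}")
```

Pix2Pix replaces the cycle term with a direct L1 loss against the paired ground-truth SAR image, which is why it needs aligned EO/SAR pairs that are comparatively hard to collect.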
Main Author: Kok, Melvin Xinwei
Other Authors: Wen Bihan
Format: Final Year Project
Language: English
Published: Nanyang Technological University, 2022
Subjects: Engineering::Electrical and electronic engineering
Online Access: https://hdl.handle.net/10356/158052
Institution: Nanyang Technological University