Unlocking the capabilities of explainable few‑shot learning in remote sensing
Saved in:
Main Authors: | Lee, Gao Yu; Dam, Tanmoy; Md Meftahul Ferdaus; Poenar, Daniel Puiu; Duong, Vu N. |
---|---|
Other Authors: | School of Mechanical and Aerospace Engineering; School of Electrical and Electronic Engineering; Air Traffic Management Research Institute |
Format: | Article |
Language: | English |
Published: | 2024 |
Subjects: | Computer and Information Science; Engineering; Deep learning; Explainable Artificial Intelligence (XAI); Few-shot learning; Remote sensing; Unmanned Aerial Vehicles (UAVs) |
Online Access: | https://hdl.handle.net/10356/178366 |
Institution: | Nanyang Technological University |
Description: |
Recent advancements have significantly improved the efficiency and effectiveness of deep
learning methods for image-based remote sensing tasks. However, the requirement for
large amounts of labeled data can limit the applicability of deep neural networks to existing
remote sensing datasets. To overcome this challenge, few-shot learning has emerged as
a valuable approach for enabling learning with limited data. While previous research has
evaluated the effectiveness of few-shot learning methods on satellite-based datasets, little
attention has been paid to exploring the applications of these methods to datasets obtained
from Unmanned Aerial Vehicles (UAVs), which are increasingly used in remote sensing
studies. In this review, we provide an up-to-date overview of both existing and newly proposed
few-shot classification techniques, along with appropriate datasets that are used for
both satellite-based and UAV-based data. We demonstrate that few-shot learning can effectively
handle the diverse perspectives in remote sensing data. As an example application, we
evaluate state-of-the-art approaches on a UAV disaster scene dataset, yielding promising
results. Furthermore, we highlight the significance of incorporating explainable AI (XAI)
techniques into few-shot models. In remote sensing, where decisions based on model predictions
can have significant consequences, such as in natural disaster response or environmental
monitoring, the transparency provided by XAI is crucial. Techniques like attention
maps and prototype analysis can help clarify the decision-making processes of these complex
models, enhancing their reliability. We identify key challenges including developing
flexible few-shot methods to handle diverse remote sensing data effectively. This review
aims to equip researchers with an improved understanding of few-shot learning’s capabilities
and limitations in remote sensing, while pointing out open issues to guide progress in
efficient, reliable and interpretable data-efficient techniques. |
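
The abstract refers to few-shot classification and to prototype analysis as an XAI aid. As a minimal illustrative sketch (not drawn from the paper), the snippet below shows prototypical-network-style few-shot classification on random stand-in embeddings: class prototypes are the mean of each class's support embeddings, a query is assigned to the nearest prototype, and the per-class distances are the kind of evidence a prototype-based explanation would inspect. All names, dimensions, and data here are illustrative assumptions, not values from the reviewed work.

```python
# Minimal sketch (assumed, not from the paper) of prototypical-network-style
# few-shot classification. Embeddings are random stand-ins for features that
# would come from any backbone applied to UAV or satellite images.
import numpy as np

rng = np.random.default_rng(0)
n_way, k_shot, dim = 5, 3, 64           # a 5-way, 3-shot episode with 64-d embeddings

# Support set: k_shot embeddings per class.
support = rng.normal(size=(n_way, k_shot, dim))
prototypes = support.mean(axis=1)        # one prototype per class, shape (n_way, dim)

query = rng.normal(size=(dim,))          # embedding of a single query image
dists = np.linalg.norm(prototypes - query, axis=1)   # distance to each class prototype

# Softmax over negative distances gives class probabilities; the distances
# themselves are what prototype-analysis explanations examine.
probs = np.exp(-dists) / np.exp(-dists).sum()
print("predicted class:", int(np.argmin(dists)), "probabilities:", np.round(probs, 3))
```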
Citation: | Lee, G. Y., Dam, T., Md Meftahul Ferdaus, Poenar, D. P. & Duong, V. N. (2024). Unlocking the capabilities of explainable few‑shot learning in remote sensing. Artificial Intelligence Review, 57, 169. https://dx.doi.org/10.1007/s10462-024-10803-5 |
Journal: | Artificial Intelligence Review, vol. 57, article 169 |
ISSN: | 0269-2821 |
DOI: | 10.1007/s10462-024-10803-5 |
Type: | Journal Article (published version) |
Funding: | Civil Aviation Authority of Singapore (CAAS); Nanyang Technological University. This research/project is supported by the Civil Aviation Authority of Singapore and Nanyang Technological University, Singapore under their collaboration in the Air Traffic Management Research Institute. |
Rights: | © 2024 The Author(s). This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. |
File Format: | application/pdf |