AV-FDTI: audio-visual fusion for drone threat identification

Bibliographic Details
Main Authors: Yang, Yizhuo, Yuan, Shenghai, Yang, Jianfei, Nguyen, Thien Hoang, Cao, Muqing, Nguyen, Thien-Minh, Wang, Han, Xie, Lihua
Other Authors: School of Electrical and Electronic Engineering
Format: Article
Language: English
Published: 2024
Online Access:https://hdl.handle.net/10356/181381
Institution: Nanyang Technological University
Description
Summary: In response to the evolving challenges posed by small unmanned aerial vehicles (UAVs), which have the potential to transport harmful payloads or cause significant damage, we present AV-FDTI, an innovative Audio-Visual Fusion system designed for Drone Threat Identification. AV-FDTI leverages the fusion of audio and omnidirectional camera feature inputs, providing a comprehensive solution to enhance the precision and resilience of drone classification and 3D localization. Specifically, AV-FDTI employs a CRNN to capture vital temporal dynamics within the audio domain and utilizes a pretrained ResNet50 model for image feature extraction. Furthermore, we adopt a visual-information-entropy and cross-attention-based mechanism to enhance the fusion of visual and audio data. Notably, our system is trained on automated Leica tracking annotations, offering ground-truth data with millimeter-level accuracy. Comprehensive comparative evaluations demonstrate the superiority of our solution over existing systems. In our commitment to advancing this field, we will release the open-source code and the wearable AV-FDTI design, contributing valuable resources to the research community.
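
The abstract describes the pipeline only at a high level. The following is a minimal, hypothetical PyTorch sketch of such an audio-visual fusion network, not the authors' released implementation: a CRNN audio branch, a pretrained ResNet50 visual branch, and a cross-attention fusion feeding classification and 3D-localization heads. Module layouts, feature dimensions, the class count, and the omission of the visual-information-entropy weighting are all assumptions made for illustration.

# Hypothetical sketch of an audio-visual fusion model in the spirit of AV-FDTI.
# All layer sizes and names are assumptions; the entropy-based weighting
# mentioned in the abstract is omitted for brevity.
import torch
import torch.nn as nn
import torchvision.models as models

class AudioCRNN(nn.Module):
    """Assumed CRNN audio branch: conv layers over a spectrogram, then a GRU."""
    def __init__(self, feat_dim=256):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.gru = nn.GRU(input_size=64 * 16, hidden_size=feat_dim, batch_first=True)

    def forward(self, spec):                   # spec: (B, 1, 64, T) log-mel spectrogram
        x = self.conv(spec)                    # (B, 64, 16, T/4)
        x = x.permute(0, 3, 1, 2).flatten(2)   # (B, T/4, 64*16) time-major sequence
        _, h = self.gru(x)                     # final hidden state: (1, B, feat_dim)
        return h.squeeze(0)                    # (B, feat_dim)

class AVFusionNet(nn.Module):
    """Assumed fusion: ResNet50 image features and CRNN audio features combined
    with cross-attention, then classification and 3D-position heads."""
    def __init__(self, feat_dim=256, num_classes=4):
        super().__init__()
        resnet = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
        resnet.fc = nn.Linear(resnet.fc.in_features, feat_dim)  # project to feat_dim
        self.visual = resnet
        self.audio = AudioCRNN(feat_dim)
        self.cross_attn = nn.MultiheadAttention(feat_dim, num_heads=4, batch_first=True)
        self.cls_head = nn.Linear(feat_dim, num_classes)  # drone class logits
        self.loc_head = nn.Linear(feat_dim, 3)             # 3D position (x, y, z)

    def forward(self, image, spec):
        v = self.visual(image).unsqueeze(1)    # (B, 1, feat_dim) visual query
        a = self.audio(spec).unsqueeze(1)      # (B, 1, feat_dim) audio key/value
        fused, _ = self.cross_attn(v, a, a)    # visual feature attends over audio
        fused = fused.squeeze(1)
        return self.cls_head(fused), self.loc_head(fused)

Under these assumptions, a forward pass would take a (B, 3, 224, 224) image tensor and a (B, 1, 64, T) spectrogram tensor and return class logits together with an (x, y, z) position estimate.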