An FPGA-based real-time drone detection system
Format: Thesis-Master by Coursework
Language: English
Published: Nanyang Technological University, 2024
Online Access: https://hdl.handle.net/10356/178105
Institution: Nanyang Technological University
Summary: Drone target tracking is currently widely applied in critical sectors such as firefighting and military operations and has become a significant research topic in the field of computer vision. Existing drone target detection algorithms are mostly based on traditional RGB cameras. However, these methods have shortcomings, including motion blur under fast motion, limited dynamic range, and comparatively high latency. This study instead focuses on drone detection using event-based cameras, which offer advantages over traditional cameras, including low latency, high dynamic range, low power consumption, and high temporal resolution.
The work in this thesis mainly consists of two parts. The first is the software aspect: the algorithmic implementation of drone detection. Here, we adopt a row-scanning method that processes the data captured by the event camera line by line. We introduce the concepts of "event slices" and "event lists," and compare the event slices detected in each row against the event list. This comparison drives the update of the event list and of the drone's bounding box. The approach significantly reduces computation and data transmission during drone detection: without compromising detection accuracy, it shortens detection time and improves efficiency.
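The row-scanning idea above can be sketched in software. This is a minimal illustrative sketch, not the thesis's actual implementation: the names (`Slice`, `Box`, `update_boxes`) and the overlap rule (a slice joins a box when their column spans intersect and the slice lies on or just below the box's bottom row) are assumptions made for illustration.

```python
# Hypothetical sketch of row-scanning event-slice matching.
# All names and the overlap criterion are assumed, not taken from the thesis.
from dataclasses import dataclass


@dataclass
class Slice:
    """A run of events detected in one row of the event image."""
    row: int
    x_min: int
    x_max: int


@dataclass
class Box:
    """A candidate drone bounding box built up from matched slices."""
    x_min: int
    x_max: int
    y_min: int
    y_max: int


def overlaps(s: Slice, b: Box) -> bool:
    # Assumed rule: column spans intersect, and the slice sits on or
    # directly below the box's current bottom row.
    return s.x_min <= b.x_max and s.x_max >= b.x_min and s.row <= b.y_max + 1


def update_boxes(boxes: list, s: Slice) -> None:
    """Merge the slice into the first matching box, or open a new box."""
    for b in boxes:
        if overlaps(s, b):
            b.x_min = min(b.x_min, s.x_min)
            b.x_max = max(b.x_max, s.x_max)
            b.y_max = max(b.y_max, s.row)
            return
    boxes.append(Box(s.x_min, s.x_max, s.row, s.row))
```

Feeding slices row by row, each comparison touches only the compact box list rather than the full event stream, which is the source of the claimed savings in computation and data transmission.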
The second part focuses on hardware implementation, specifically the hardware realization of the drone detection algorithm. In the hardware domain, this thesis employs an image-processing architecture with 32 parallel processing elements to handle the detected event slices. Based on their detection outcomes, these processing elements decide which boxes should be merged: they determine the indexes of the boxes to merge, update the boundaries of those boxes, and finally transmit the drone box data to the computer via the UART port. The box boundaries are then drawn on the event graph, completing the drone detection.
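The merging step performed by the processing-element array can be illustrated with a software analogue. This is a hedged sketch under an assumed merge criterion (boxes merge whenever their bounds overlap); the actual hardware rule and the PE-level index bookkeeping are not specified in the abstract.

```python
# Software analogue of the box-merging step, with an assumed criterion:
# any two boxes whose (x_min, x_max, y_min, y_max) bounds overlap are
# merged into their combined bounding box, repeated until stable.
def merge_all(boxes):
    """Repeatedly merge overlapping (x_min, x_max, y_min, y_max) boxes."""
    boxes = list(boxes)
    merged = True
    while merged:
        merged = False
        out = []
        for b in boxes:
            for i, o in enumerate(out):
                # Axis-aligned overlap test in both x and y.
                if (b[0] <= o[1] and b[1] >= o[0]
                        and b[2] <= o[3] and b[3] >= o[2]):
                    out[i] = (min(o[0], b[0]), max(o[1], b[1]),
                              min(o[2], b[2]), max(o[3], b[3]))
                    merged = True
                    break
            else:
                out.append(b)
        boxes = out
    return boxes
```

In the described hardware, each of the 32 processing elements would evaluate such overlap decisions in parallel and emit only the merged box boundaries over UART, rather than streaming raw events to the host.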
The performance of the entire system has been tested through extensive simulation and analysis, from the design stage to the testing stage. The experimental results indicate that the system can rapidly and accurately detect the position of drones in both single-frame and multi-frame event images.