Touchless human-computer interface (HCI) with 60 GHz radar sensor and machine learning


Bibliographic Details
Main Author: Xian, Wei
Other Authors: Lu Yilong
Format: Final Year Project
Language: English
Published: Nanyang Technological University 2023
Subjects:
Online Access:https://hdl.handle.net/10356/165019
Institution: Nanyang Technological University
Description
Summary: The Singaporean government has declared the end of the acute phase of the COVID-19 pandemic, but the lessons learned from the pandemic serve as a reminder and a warning. Contactless control technology has become crucial due to the threat of contact-borne viruses, and adopting a variety of signal-receiving devices is essential for meeting different requirements. While voice control has gained significant popularity in recent years, radar-based gesture control is also emerging as a powerful tool for remote-control applications. By using radar to track hand movements, gesture control offers users a more intuitive and precise way to interact with their devices without physical touch. This technology has several advantages, including the ability to operate in low-light environments and robustness to ambient acoustic noise, to which voice control is vulnerable. Gesture control also offers a more natural and ergonomic way of interacting with devices, particularly in situations where voice control may be impractical or disruptive. The primary objective of this project is to develop a gesture-sensing system that combines millimeter-wave radar with machine learning and data-processing techniques. Through a rigorous testing and selection process, four gestures were chosen as the primary experimental targets: holding, pushing, swiping, and waving. These gestures proved relatively distinct and easily identifiable in the data extracted from the system. The project is structured into two phases. The first phase establishes a connection between the radar and a computer to enable data storage and transmission. The second phase applies machine learning algorithms to process and analyze the collected data, culminating in a demonstration of the system's performance in different scenarios.
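The second phase described above (classifying the four gestures from radar-derived features) can be illustrated with a minimal sketch. The feature layout, centroid values, and the nearest-centroid rule here are purely hypothetical assumptions for illustration; the project's actual machine-learning pipeline and feature extraction are not specified in this record.

```python
# Hypothetical sketch: classifying the four project gestures (hold, push,
# swipe, wave) from per-gesture radar feature vectors with a simple
# nearest-centroid rule. All feature values below are illustrative, not
# taken from the project's data.
import math

GESTURES = ["hold", "push", "swipe", "wave"]

# Toy centroids: (mean Doppler, range change, lateral motion) per gesture.
CENTROIDS = {
    "hold":  (0.0, 0.0, 0.0),
    "push":  (1.5, -0.8, 0.0),
    "swipe": (0.8, 0.0, 1.2),
    "wave":  (1.0, 0.0, 0.5),
}

def classify(features):
    """Return the gesture whose centroid is nearest in Euclidean distance."""
    def dist(centroid):
        return math.sqrt(sum((f - c) ** 2 for f, c in zip(features, centroid)))
    return min(GESTURES, key=lambda g: dist(CENTROIDS[g]))

# A feature vector with almost no motion lands nearest the "hold" centroid.
print(classify((0.1, 0.05, -0.1)))  # → hold
```

In practice a trained classifier (e.g., a neural network or SVM over range-Doppler maps) would replace the fixed centroids, but the structure is the same: extract features from each radar capture, then map them to one of the four gesture labels.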