Activity sensing using WiFi CSI
Format: Thesis-Master by Coursework
Language: English
Published: Nanyang Technological University, 2024
Online Access: https://hdl.handle.net/10356/175612
Institution: Nanyang Technological University
Summary: Behavior recognition plays a crucial role in fields such as environmental monitoring, smart healthcare, smart furniture, and human-computer interaction, and has long been a focus of academic research. WiFi-based human behavior recognition is a novel direction in this domain. Compared with traditional approaches such as sensor-based, image-based, or UWB recognition, WiFi-based methods avoid drawbacks such as the need for wearable devices, sensitivity to lighting conditions, and high cost, while offering high perceptual sensitivity and recognition accuracy. These properties make them well suited to indoor behavior recognition and promising for transforming how smart furniture is controlled.
This thesis proposes a WiFi-based behavior recognition scheme that collects Channel State Information (CSI) data, preprocesses it, and extracts actions for classification. The main contributions of this work are as follows:
Through experimental design based on Fresnel zone theory, stable and reliable CSI data were obtained. A series of preprocessing methods were employed, including removing DC components, outlier detection and correction, and signal filtering, to mitigate the impact of random noise in the environment on received data. For the preprocessed data, a method based on moving variance was proposed to determine the start and end of actions using a variable sliding window. Experimental results demonstrate that this action extraction method achieves a balance between high accuracy and efficiency.
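The preprocessing and action-extraction pipeline described above can be sketched as follows. This is a minimal illustration rather than the thesis's actual code: the sampling rate, filter order and cutoff, window lengths, and the variance threshold are all assumed values, the input is taken to be a single CSI amplitude stream, and a fixed-length sliding window stands in for the thesis's variable sliding window.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def preprocess_csi(amplitude, fs=100, cutoff=10, hampel_window=5, n_sigma=3):
    """Clean a 1-D CSI amplitude stream: remove the DC component, correct
    outliers with a Hampel-style median filter, then low-pass filter.
    All parameter values here are illustrative guesses."""
    x = amplitude - np.mean(amplitude)            # remove DC component

    # Hampel-style outlier detection and correction over a sliding window
    k = 1.4826                                    # MAD scale factor for Gaussian data
    for i in range(len(x)):
        lo, hi = max(0, i - hampel_window), min(len(x), i + hampel_window + 1)
        window = x[lo:hi]
        med = np.median(window)
        mad = k * np.median(np.abs(window - med))
        if mad > 0 and abs(x[i] - med) > n_sigma * mad:
            x[i] = med                            # replace outlier with the local median

    # low-pass Butterworth filter to suppress high-frequency noise
    b, a = butter(4, cutoff / (fs / 2), btype="low")
    return filtfilt(b, a, x)

def extract_action(x, win=50, step=10, threshold=None):
    """Locate the start and end of an action by thresholding the moving
    variance computed over a sliding window."""
    centers, variances = [], []
    for start in range(0, len(x) - win + 1, step):
        centers.append(start + win // 2)
        variances.append(np.var(x[start:start + win]))
    variances = np.asarray(variances)
    if threshold is None:                         # simple adaptive threshold (assumption)
        threshold = variances.min() + 0.2 * (variances.max() - variances.min())
    idx = np.where(variances > threshold)[0]
    if idx.size == 0:
        return None                               # no action detected
    return centers[idx[0]] - win // 2, centers[idx[-1]] + win // 2
```

Here the Hampel-style median correction and the Butterworth low-pass filter stand in for the outlier correction and signal filtering mentioned in the abstract; the moving-variance thresholding mirrors the idea that variance rises while an action is being performed and stays low when the environment is static.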
Compared with common behavior recognition methods, an improved CNN for behavior recognition is proposed. It raises the average accuracy of recognizing 7 behaviors in an interference-free environment to 90.74%, with the best-recognized action reaching 99% accuracy and the worst-recognized action still reaching 83%.
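The abstract does not describe the improved CNN architecture in detail, so the sketch below is only a generic small convolutional classifier over segmented CSI windows with 7 output classes; the input shape (30 subcarriers x 200 packets), layer widths, and dropout rate are placeholders, not the thesis's design.

```python
import torch
import torch.nn as nn

class CsiCnn(nn.Module):
    """Small 2-D CNN over a (subcarriers x time) CSI amplitude window.
    Layer sizes and the assumed input shape are illustrative only."""
    def __init__(self, n_classes=7):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.BatchNorm2d(16), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.BatchNorm2d(32), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.BatchNorm2d(64), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),              # global pooling: head is shape-agnostic
        )
        self.classifier = nn.Sequential(
            nn.Flatten(), nn.Dropout(0.3), nn.Linear(64, n_classes)
        )

    def forward(self, x):                         # x: (batch, 1, subcarriers, time)
        return self.classifier(self.features(x))

model = CsiCnn(n_classes=7)
dummy = torch.randn(8, 1, 30, 200)                # batch of 8 segmented CSI windows
logits = model(dummy)                             # -> (8, 7) class scores
```

Global average pooling keeps the classifier head independent of the exact window length, which is convenient when extracted action segments vary in duration.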