Human activity recognition in a smart living environment
Main Author:
Other Authors:
Format: Final Year Project
Language: English
Published: Nanyang Technological University, 2024
Subjects:
Online Access: https://hdl.handle.net/10356/176894
Institution: Nanyang Technological University
Summary: In recent times, smartphones equipped with advanced sensors such as gyroscopes and accelerometers have become capable of capturing users' movements in fine detail and from multiple perspectives. This wealth of data paves the way for Human Activity Recognition (HAR) within smart living environments, offering significant benefits for enhancing the quality of daily life and promoting physical health. Using a range of machine learning methods, the raw sensor data can be transformed into detailed insights about user activities and locations.
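As a rough illustration of how such raw inertial streams are usually prepared for a classifier, the sketch below segments accelerometer and gyroscope readings into fixed-length overlapping windows. The window length, step size, six-channel layout, and majority-label assignment are assumptions for illustration; the record does not describe the project's actual preprocessing.

```python
import numpy as np

def make_windows(signals, labels, window_size=128, step=64):
    """Slice a (timesteps, channels) sensor stream into overlapping windows.

    signals: array of shape (T, 6) holding 3-axis accelerometer + 3-axis
             gyroscope samples (the channel layout is an assumption here).
    labels:  per-timestep activity labels of shape (T,).
    Returns windows of shape (n, window_size, 6) and one label per window
    (the most frequent label inside each window).
    """
    windows, window_labels = [], []
    for start in range(0, len(signals) - window_size + 1, step):
        end = start + window_size
        windows.append(signals[start:end])
        # Assign the majority label in the window as its class.
        values, counts = np.unique(labels[start:end], return_counts=True)
        window_labels.append(values[np.argmax(counts)])
    return np.array(windows), np.array(window_labels)

# Example with synthetic data: 10 seconds of 50 Hz readings, 6 channels.
raw = np.random.randn(500, 6)
lab = np.random.randint(0, 6, size=500)
X, y = make_windows(raw, lab)
print(X.shape, y.shape)  # (n_windows, 128, 6), (n_windows,)
```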
This project develops a comprehensive information system for analyzing both simple and complex activities performed by individuals in smart living spaces. It applies diverse machine learning techniques to classify and accurately recognize various human activities. Among deep learning approaches, Convolutional Neural Networks (CNN) and Long Short-Term Memory (LSTM) networks stand out, and this project builds a hybrid CNN-LSTM model aimed at improving recognition performance. Results have been promising: the model achieves an accuracy of up to 92.47% with a loss of 0.4007.
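The following is a minimal sketch of one possible hybrid CNN-LSTM architecture, written with Keras. The framework, layer sizes, window shape, and number of activity classes are assumptions for illustration; the record does not disclose the project's exact model configuration.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_cnn_lstm(window_size=128, n_channels=6, n_classes=6):
    """Hybrid model: 1-D convolutions extract local motion features,
    an LSTM layer models their temporal ordering, and a softmax head
    classifies the activity (all hyperparameters are illustrative)."""
    model = models.Sequential([
        layers.Input(shape=(window_size, n_channels)),
        layers.Conv1D(64, kernel_size=3, activation="relu"),
        layers.Conv1D(64, kernel_size=3, activation="relu"),
        layers.MaxPooling1D(pool_size=2),
        layers.LSTM(100),
        layers.Dropout(0.5),
        layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_cnn_lstm()
model.summary()
# Training would then follow the usual Keras pattern, e.g.:
# model.fit(X_train, y_train, epochs=30, batch_size=64,
#           validation_data=(X_val, y_val))
```

The convolutional front end reduces each window to a shorter sequence of learned features, which the LSTM then reads in order; this division of labour is the usual motivation for pairing the two layer types in HAR models.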