Human activities recognition in smart living environment
In recent times, smartphones, outfitted with advanced sensors including gyroscopes and accelerometers, have become adept at capturing the nuanced movements of users from various perspectives. This wealth of data paves the way for Human Activity Recognition (HAR) within smart living environments, offering significant benefits for enhancing the quality of daily life and promoting physical health. Leveraging a range of machine learning methodologies, the raw sensor data can be transformed into detailed insights regarding user activities and locations.
This project is dedicated to developing a comprehensive information system capable of analyzing both simple and complex activities performed by individuals in smart living spaces. By employing diverse machine learning techniques, it endeavors to classify and accurately recognize various human activities. Among the plethora of deep learning strategies, Convolutional Neural Networks (CNN) and Long Short-Term Memory (LSTM) networks stand out. This project innovates by creating a hybrid CNN-LSTM model, aimed at improving the efficiency of HAR. Results have been promising, with the model demonstrating an impressive accuracy rate of up to 92.47% and maintaining a relatively low loss of 0.4007.
Main Author: Yao, Hengji
Other Authors: Soh Yeng Chai
Format: Final Year Project
Language: English
Published: Nanyang Technological University, 2024
Subjects: Engineering; Human activities recognition; Deep learning
Online Access: https://hdl.handle.net/10356/176894
Institution: Nanyang Technological University
Language: English
id: sg-ntu-dr.10356-176894
record_format: dspace
spelling: sg-ntu-dr.10356-176894; 2024-05-24T15:45:03Z; Human activities recognition in smart living environment; Yao, Hengji; Soh Yeng Chai; School of Electrical and Electronic Engineering; EYCSOH@ntu.edu.sg; Engineering, Human activities recognition, Deep learning; Bachelor's degree; 2024-05-21T04:30:02Z; 2024; Final Year Project (FYP); Yao, H. (2024). Human activities recognition in smart living environment. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/176894; en; A1091-231; application/pdf; Nanyang Technological University
institution: Nanyang Technological University
building: NTU Library
continent: Asia
country: Singapore
content_provider: NTU Library
collection: DR-NTU
language: English
topic: Engineering; Human activities recognition; Deep learning
author2: Soh Yeng Chai
format: Final Year Project
author: Yao, Hengji
title: Human activities recognition in smart living environment
publisher: Nanyang Technological University
publishDate: 2024
url: https://hdl.handle.net/10356/176894
_version_: 1814047313787617280
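The abstract above reports a hybrid CNN-LSTM classifier trained on windowed smartphone accelerometer and gyroscope data. The sketch below is an illustrative example of that general architecture only, not the model used in this project: the window length, channel count, layer sizes, and number of activity classes are assumptions chosen for demonstration.

```python
# Minimal sketch of a generic CNN-LSTM activity classifier for smartphone
# sensor windows. All hyperparameters below are illustrative assumptions,
# not values taken from the project described in this record.
import numpy as np
from tensorflow.keras import layers, models

N_TIMESTEPS = 128   # assumed samples per sliding window
N_CHANNELS = 6      # assumed: 3-axis accelerometer + 3-axis gyroscope
N_CLASSES = 6       # assumed number of activity labels

model = models.Sequential([
    layers.Input(shape=(N_TIMESTEPS, N_CHANNELS)),
    # 1D convolutions extract short-range motion features within each window
    layers.Conv1D(64, kernel_size=3, activation="relu"),
    layers.Conv1D(64, kernel_size=3, activation="relu"),
    layers.MaxPooling1D(pool_size=2),
    layers.Dropout(0.3),
    # LSTM models longer-range temporal dependencies across the window
    layers.LSTM(100),
    layers.Dense(64, activation="relu"),
    layers.Dense(N_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Example usage with random stand-in data; real use would feed windowed,
# labelled sensor recordings instead.
X = np.random.randn(32, N_TIMESTEPS, N_CHANNELS).astype("float32")
y = np.random.randint(0, N_CLASSES, size=(32,))
model.fit(X, y, epochs=1, batch_size=16, verbose=0)
```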