Gesture recognition using a bioinspired learning architecture that integrates visual data with somatosensory data from stretchable sensors

Gesture recognition using machine learning methods is valuable in the development of advanced cybernetics, robotics, and healthcare systems, and typically relies on images or videos. To improve recognition accuracy, such visual data can be fused with data from other sensors, but this approach is limited by the quality of the sensor data and the incompatibility of the datasets. Here, we report a bioinspired data fusion architecture that can perform human gesture recognition by integrating visual data with somatosensory data from skin-like stretchable strain sensors. The learning architecture uses a convolutional neural network for visual processing, and then implements a sparse neural network for sensor data fusion and recognition. Our approach can achieve a recognition accuracy of 100%, and maintain recognition accuracy with noisy, under- or over-exposed images. We also show that our architecture can be implemented for robot navigation using hand gestures with a small error, even in the dark.
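The abstract describes a two-stage design: a convolutional neural network extracts visual features, and a second network fuses those features with strain-sensor readings to classify gestures. The snippet below is a minimal, hypothetical sketch of that fusion step only, not the authors' implementation: the feature dimensions, the number of gesture classes, and the dense (rather than sparse) random-weight classifier are all placeholder assumptions for illustration.

```python
import math
import random

random.seed(0)

def softmax(scores):
    # Numerically stable softmax over a list of class scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical dimensions: 8 visual features (standing in for a CNN's
# penultimate-layer output) and 5 strain-sensor channels (e.g. one
# stretchable sensor per finger).
visual_feat = [random.random() for _ in range(8)]
strain_feat = [random.random() for _ in range(5)]

# Fusion step: concatenate both modalities into one feature vector,
# then score each gesture class with a single dense layer. The weights
# here are random stand-ins; in a trained system they would be learned,
# and the paper's fusion network is sparse rather than dense.
fused = visual_feat + strain_feat
n_classes = 10  # assumed number of gesture classes
W = [[random.gauss(0.0, 0.1) for _ in fused] for _ in range(n_classes)]
scores = [sum(w * x for w, x in zip(row, fused)) for row in W]
probs = softmax(scores)

predicted_gesture = max(range(n_classes), key=lambda k: probs[k])
```

The point of the sketch is the data-flow shape: the two modalities are reduced to compatible feature vectors before a single recognition network sees them, which is how the architecture sidesteps the dataset-incompatibility problem the abstract mentions.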

Bibliographic Details
Main Authors: Wang, Ming, Yan, Zheng, Wang, Ting, Cai, Pingqiang, Gao, Siyu, Zeng, Yi, Wan, Changjin, Wang, Hong, Pan, Liang, Yu, Jiancan, Pan, Shaowu, He, Ke, Lu, Jie, Chen, Xiaodong
Other Authors: School of Materials Science and Engineering
Format: Article
Language: English
Published: 2021
Subjects: Engineering::Materials; Convolutional Neural Networks; Network architecture
Online Access: https://hdl.handle.net/10356/148021
Institution: Nanyang Technological University
Record ID: sg-ntu-dr.10356-148021
Record Format: dspace
Affiliations: School of Materials Science and Engineering; Innovative Centre for Flexible Devices
Citation: Wang, M., Yan, Z., Wang, T., Cai, P., Gao, S., Zeng, Y., Wan, C., Wang, H., Pan, L., Yu, J., Pan, S., He, K., Lu, J. & Chen, X. (2020). Gesture recognition using a bioinspired learning architecture that integrates visual data with somatosensory data from stretchable sensors. Nature Electronics, 3, 563-570. https://dx.doi.org/10.1038/s41928-020-0422-z
Journal: Nature Electronics, volume 3, pages 563-570 (2020)
DOI: 10.1038/s41928-020-0422-z
ISSN: 2520-1131
Handle: https://hdl.handle.net/10356/148021
Version: Accepted version (application/pdf)
Deposited: 2021-04-26
Funding: Ministry of Education (MOE); National Research Foundation (NRF). The project was partially supported by the National Research Foundation (NRF), Prime Minister's Office, Singapore, under its NRF Investigatorship (NRF2016NRF-NRF1001-21), Singapore Ministry of Education (MOE2015-T2-2-60), Advanced Manufacturing and Engineering (AME) Programmatic Grant (No. A19A1b0045), and the Australian Research Council (ARC) under Discovery Grant DP200100700.
Acknowledgements: The authors thank all the volunteers for collecting data and also thank Dr. Ai Lin Chun for critically reading and editing the manuscript.
Rights: © 2020 Macmillan Publishers Limited, part of Springer Nature. All rights reserved. This paper was published in Nature Electronics and is made available with permission of Macmillan Publishers Limited, part of Springer Nature.
Building: NTU Library
Continent: Asia
Country: Singapore
Content Provider: NTU Library
Collection: DR-NTU