Configuration-adaptive wireless visual sensing system with deep reinforcement learning
Visual sensing has been increasingly employed in various industrial applications including manufacturing process monitoring and worker safety monitoring. This paper presents the design and implementation of a wireless camera system, namely, EFCam, which uses low-power wireless communications and edge-fog computing to achieve cordless and energy-efficient visual sensing. The camera performs image pre-processing and offloads the data to a resourceful fog node for advanced processing using deep models. EFCam admits dynamic configurations of several parameters that form a configuration space. It aims to adapt the configuration to maintain the desired visual sensing performance of the deep model at the fog node with minimum energy consumption of the camera in image capture, pre-processing, and data communications, under dynamic variations of the monitored process, the application requirement, and wireless channel conditions. However, the adaptation is challenging due to the complex relationships among the involved factors. To address the complexity, we apply deep reinforcement learning to learn the optimal adaptation policy when a fog node supports one or more wireless cameras. Extensive evaluation based on trace-driven simulations and experiments shows that EFCam complies with the accuracy and latency requirements with lower energy consumption for a real industrial product object tracking application, compared with five baseline approaches incorporating hysteresis-based and event-triggered adaptation.
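The abstract describes learning a configuration-adaptation policy that meets accuracy and latency requirements at minimum camera energy. The paper's own state space, configuration parameters, and cost model are not given in this record; as a rough illustration only, the following sketch uses a hypothetical configuration space (resolution scale and JPEG quality), a synthetic environment model, and tabular Q-learning in place of the deep RL used in the paper:

```python
import random

# Hypothetical configuration space: (resolution scale, JPEG quality).
# These parameters and all numeric models below are illustrative
# assumptions, not values from the paper.
CONFIGS = [(r, q) for r in (0.25, 0.5, 1.0) for q in (50, 75, 95)]

def step(config):
    """Synthetic environment: returns (accuracy, latency_s, energy_mJ)."""
    r, q = config
    accuracy = 0.5 + 0.3 * r + 0.002 * q   # richer images -> higher accuracy
    latency = 0.05 + 0.1 * r * q / 95      # more data -> more airtime
    energy = 10 + 40 * r + 0.2 * q         # capture + transmit cost
    return accuracy, latency, energy

def reward(accuracy, latency, energy, acc_req=0.8, lat_req=0.15):
    """Penalize requirement violations heavily; otherwise reward energy savings."""
    if accuracy < acc_req or latency > lat_req:
        return -100.0
    return -energy

# Tabular Q-learning over a single stateless decision, epsilon-greedy.
# (The paper uses deep RL; a table stands in for the Q-network here.)
Q = {c: 0.0 for c in CONFIGS}
alpha, epsilon = 0.1, 0.2
random.seed(0)
for _ in range(5000):
    c = random.choice(CONFIGS) if random.random() < epsilon else max(Q, key=Q.get)
    acc, lat, e = step(c)
    Q[c] += alpha * (reward(acc, lat, e) - Q[c])

best = max(Q, key=Q.get)  # learned policy: lowest-energy config meeting both requirements
```

Under this toy model the learned configuration satisfies both requirements while avoiding the highest-energy settings; the paper's contribution is making the analogous trade-off under dynamic process, requirement, and channel variations.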
Main Authors: | Zhou, Siyuan; Le, Duc Van; Tan, Rui; Yang, Joy Qiping; Ho, Daren |
---|---|
Other Authors: | School of Computer Science and Engineering; HP-NTU Digital Manufacturing Corporate Lab |
Format: | Article |
Language: | English |
Published: | 2023 |
Subjects: | Engineering::Computer science and engineering; Wireless Visual Sensing; Fog Computing |
Online Access: | https://hdl.handle.net/10356/171724 |
Institution: | Nanyang Technological University |
id |
sg-ntu-dr.10356-171724 |
---|---|
record_format |
dspace |
spelling |
sg-ntu-dr.10356-171724. Configuration-adaptive wireless visual sensing system with deep reinforcement learning. Zhou, Siyuan; Le, Duc Van; Tan, Rui; Yang, Joy Qiping; Ho, Daren. School of Computer Science and Engineering; HP-NTU Digital Manufacturing Corporate Lab. Subjects: Engineering::Computer science and engineering; Wireless Visual Sensing; Fog Computing.
Funding: This work was supported in part by the Industry Alignment Fund – Industry Collaboration Projects (IAF-ICP) Funding Initiative, under Grant RIE2020, and in part by cash and in-kind contributions from the industry partner, HP Inc., through the HP-NTU Digital Manufacturing Corporate Lab.
Citation: Zhou, S., Le, D. V., Tan, R., Yang, J. Q. & Ho, D. (2023). Configuration-adaptive wireless visual sensing system with deep reinforcement learning. IEEE Transactions on Mobile Computing, 22(9), 5078-5091. https://dx.doi.org/10.1109/TMC.2022.3175182
ISSN: 1536-1233. DOI: 10.1109/TMC.2022.3175182. Scopus: 2-s2.0-85130469495. Handle: https://hdl.handle.net/10356/171724
© 2022 IEEE. All rights reserved. |
building |
NTU Library |
continent |
Asia |
country |
Singapore |
content_provider |
NTU Library |
collection |
DR-NTU |
topic |
Engineering::Computer science and engineering Wireless Visual Sensing Fog Computing |
description |
Visual sensing has been increasingly employed in various industrial applications including manufacturing process monitoring and worker safety monitoring. This paper presents the design and implementation of a wireless camera system, namely, EFCam, which uses low-power wireless communications and edge-fog computing to achieve cordless and energy-efficient visual sensing. The camera performs image pre-processing and offloads the data to a resourceful fog node for advanced processing using deep models. EFCam admits dynamic configurations of several parameters that form a configuration space. It aims to adapt the configuration to maintain the desired visual sensing performance of the deep model at the fog node with minimum energy consumption of the camera in image capture, pre-processing, and data communications, under dynamic variations of the monitored process, the application requirement, and wireless channel conditions. However, the adaptation is challenging due to the complex relationships among the involved factors. To address the complexity, we apply deep reinforcement learning to learn the optimal adaptation policy when a fog node supports one or more wireless cameras. Extensive evaluation based on trace-driven simulations and experiments shows that EFCam complies with the accuracy and latency requirements with lower energy consumption for a real industrial product object tracking application, compared with five baseline approaches incorporating hysteresis-based and event-triggered adaptation. |
author2 |
School of Computer Science and Engineering |
format |
Article |
author |
Zhou, Siyuan Le, Duc Van Tan, Rui Yang, Joy Qiping Ho, Daren |
author_sort |
Zhou, Siyuan |
title |
Configuration-adaptive wireless visual sensing system with deep reinforcement learning |