Learning-based energy-efficient resource management by heterogeneous RF/VLC for ultra-reliable low-latency industrial IoT networks

Bibliographic Details
Main Authors: Yang, Helin; Alphones, Arokiaswami; Zhong, Wen-De; Chen, Chen; Xie, Xianzhong
Other Authors: School of Electrical and Electronic Engineering
Format: Article
Language: English
Published: 2020
Subjects:
Online Access: https://hdl.handle.net/10356/142892
Institution: Nanyang Technological University
Description
Summary: The smart factory under Industry 4.0 and the industrial Internet of Things (IoT) have attracted much attention from both academia and industry. In wireless industrial networks, industrial IoT and IoT devices have different quality-of-service (QoS) requirements, ranging from ultra-reliable low-latency communications (URLLC) to high transmission data rates. These industrial networks will be highly complex and heterogeneous, and their spectrum and energy resources are severely limited. Hence, this article presents a heterogeneous radio frequency (RF)/visible light communication (VLC) industrial network architecture to guarantee the different QoS requirements, where RF offers wide-area coverage and VLC provides high transmission data rates. A joint uplink and downlink energy-efficient resource management decision-making problem (network selection, subchannel assignment, and power management) is formulated as a Markov decision process. In addition, a new deep post-decision state (PDS)-based experience replay and transfer (PDS-ERT) reinforcement learning algorithm is proposed to learn the optimal policy. Simulation results corroborate the superior performance of the presented heterogeneous network and verify that the proposed PDS-ERT learning algorithm outperforms existing algorithms in meeting the energy-efficiency and QoS requirements.
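
As a rough, illustrative sketch of the kind of decision-making described above (network selection and power management cast as a Markov decision process, learned with experience replay), the toy Python example below runs tabular Q-learning with a replay buffer over an invented two-state, RF/VLC action space. It is not the paper's PDS-ERT algorithm; the states, actions, and reward function are hypothetical placeholders chosen only to show the structure of such an agent.

import random
from collections import defaultdict, deque

# Hypothetical toy setting: a device chooses a network (RF or VLC) and a
# discrete transmit-power level; the reward trades throughput against
# energy use. All values are illustrative placeholders.
NETWORKS = ["RF", "VLC"]
POWER_LEVELS = [0.1, 0.5, 1.0]            # normalized transmit power
ACTIONS = [(n, p) for n in NETWORKS for p in POWER_LEVELS]
STATES = ["low_traffic", "high_traffic"]  # coarse channel/traffic state

ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1     # learning rate, discount, exploration
REPLAY = deque(maxlen=1000)               # experience replay buffer
Q = defaultdict(float)                    # Q[(state, action)] -> value estimate


def toy_reward(state, action):
    """Placeholder reward: VLC yields a higher rate under low traffic,
    RF is more robust under high traffic; power use is penalized."""
    network, power = action
    rate = (2.0 if network == "VLC" else 1.0) * power
    if state == "high_traffic" and network == "VLC":
        rate *= 0.5                        # VLC link more easily blocked
    return rate - 0.8 * power              # energy-efficiency trade-off


def choose_action(state):
    """Epsilon-greedy selection over network/power pairs."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])


def replay_update(batch_size=32):
    """Sample stored transitions and apply one-step Q-learning updates."""
    batch = random.sample(list(REPLAY), min(batch_size, len(REPLAY)))
    for s, a, r, s_next in batch:
        target = r + GAMMA * max(Q[(s_next, b)] for b in ACTIONS)
        Q[(s, a)] += ALPHA * (target - Q[(s, a)])


for episode in range(200):
    state = random.choice(STATES)
    for _ in range(20):
        action = choose_action(state)
        reward = toy_reward(state, action)
        next_state = random.choice(STATES)  # toy environment dynamics
        REPLAY.append((state, action, reward, next_state))
        replay_update()
        state = next_state

best = {s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in STATES}
print("Learned (network, power) per state:", best)

In the paper's setting, the tabular Q-table would presumably be replaced by a deep network, and the plain replay buffer by the post-decision-state and experience-transfer mechanisms that the summary names.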