Deep-reinforcement-learning-based energy-efficient resource management for social and cognitive Internet of Things
Saved in:
Main Authors:
Other Authors:
Format: Article
Language: English
Published: 2020
Subjects:
Online Access: https://hdl.handle.net/10356/142885
Institution: | Nanyang Technological University |
Summary: Internet of things (IoT) has attracted much interest due to its wide applications, such as smart cities, manufacturing, transportation, and healthcare. Social and cognitive IoT is capable of exploiting social networking characteristics to optimize network performance. Considering that IoT devices have different quality of service (QoS) requirements (ranging from ultra-reliable and low-latency communications (URLLC) to a minimum data rate), this paper presents a QoS-driven, social-aware, enhanced device-to-device (D2D) communication network model for social and cognitive IoT that utilizes social orientation information. We cast the optimization problem as a multi-agent reinforcement learning formulation, and we propose a novel coordinated multi-agent deep reinforcement learning based resource management approach to optimize the joint radio block assignment and transmission power control strategy. Meanwhile, prioritized experience replay (PER) and coordinated learning mechanisms are employed to enable communication links to cooperate in a distributed manner, which enhances network performance and access success probability. Simulation results corroborate the superiority of the presented resource management approach, which outperforms existing approaches in terms of energy efficiency and meeting QoS requirements.
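The prioritized experience replay (PER) mechanism mentioned in the summary can be illustrated with a minimal sketch. This is not the authors' implementation; the capacity, the priority exponent `alpha`, and the proportional-sampling scheme are assumptions based on the standard PER formulation, shown only to convey the idea that transitions with larger temporal-difference (TD) error are replayed more often.

```python
import random

class PrioritizedReplayBuffer:
    """Minimal proportional PER sketch: transitions with larger TD error
    are sampled more often, so the agent preferentially replays
    'surprising' experiences."""

    def __init__(self, capacity=10000, alpha=0.6):
        self.capacity = capacity
        self.alpha = alpha          # how strongly priority skews sampling
        self.buffer = []            # stored transitions
        self.priorities = []        # one priority per transition
        self.pos = 0                # ring-buffer write index

    def add(self, transition, td_error=1.0):
        # Priority comes from the TD error, plus a small epsilon so
        # zero-error transitions still get sampled occasionally.
        priority = (abs(td_error) + 1e-6) ** self.alpha
        if len(self.buffer) < self.capacity:
            self.buffer.append(transition)
            self.priorities.append(priority)
        else:
            self.buffer[self.pos] = transition
            self.priorities[self.pos] = priority
        self.pos = (self.pos + 1) % self.capacity

    def sample(self, batch_size):
        # Sample indices with probability proportional to priority.
        total = sum(self.priorities)
        weights = [p / total for p in self.priorities]
        indices = random.choices(range(len(self.buffer)),
                                 weights=weights, k=batch_size)
        return indices, [self.buffer[i] for i in indices]

    def update_priorities(self, indices, td_errors):
        # After a learning step, refresh priorities with new TD errors.
        for i, err in zip(indices, td_errors):
            self.priorities[i] = (abs(err) + 1e-6) ** self.alpha
```

In a coordinated multi-agent setting such as the one described above, each D2D link's agent would maintain (or share) such a buffer, and the sampled batches would drive the joint radio-block and power-control policy updates; a full implementation would also apply importance-sampling corrections, which are omitted here for brevity.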