Machine-learning-based parallel genetic algorithms for multi-objective optimization in ultra-reliable low-latency WSNs


Bibliographic Details
Main Authors: Chang, Yuchao, Yuan, Xiaobing, Niyato, Dusit, Al-Dhahir, Naofal, Li, Baoqing
Other Authors: School of Computer Science and Engineering
Format: Article
Language: English
Published: 2019
Subjects:
Online Access:https://hdl.handle.net/10356/104803
http://hdl.handle.net/10220/48646
Institution: Nanyang Technological University
Description
Summary: Different from conventional wireless sensor networks (WSNs), ultra-reliable and low-latency WSNs (uRLLWSNs), being an important application of 5G networks, must meet more stringent performance requirements. In this paper, we propose a novel algorithm to improve uRLLWSNs’ performance by applying machine learning techniques and genetic algorithms. Using the K-means clustering algorithm to construct a 2-tier network topology, the proposed algorithm designs the fetal dataset, denoted by the population, and develops a clustering method of energy conversion to prevent overloaded cluster heads. A multi-objective optimization model is formulated to simultaneously satisfy multiple optimization objectives including the longest network lifetime and the highest network connectivity and reliability. Under this model, the principal component analysis algorithm is adopted to eliminate the various optimization objectives’ dependencies and rank their importance levels. Considering the NP-hardness of wireless network scheduling, the genetic algorithm is used to identify the optimal chromosome for designing a near-optimal clustering network topology. Moreover, we prove the convergence of the proposed algorithm both locally and globally. Simulation results are presented to demonstrate the viability of the proposed algorithm compared to state-of-the-art algorithms at an acceptable computational complexity.
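The pipeline the abstract describes (K-means for the 2-tier topology, PCA to weight the correlated objectives, then a genetic search over cluster-head assignments) can be sketched roughly as below. This is a minimal numpy-only illustration, not the paper's exact formulation: the node field, the two objective proxies (head-to-member distance for lifetime, head-to-head distance for connectivity), and all GA parameters are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Tier construction: K-means over hypothetical sensor positions ---
nodes = rng.uniform(0, 100, size=(50, 2))   # 50 sensors on a 100x100 field (assumed)
k = 5                                       # number of clusters (assumed)
centers = nodes[rng.choice(len(nodes), k, replace=False)]
for _ in range(20):                         # Lloyd iterations
    labels = np.argmin(np.linalg.norm(nodes[:, None] - centers[None], axis=2), axis=1)
    centers = np.array([nodes[labels == c].mean(axis=0) if (labels == c).any()
                        else centers[c] for c in range(k)])
members = [np.flatnonzero(labels == c) for c in range(k)]

# --- Two illustrative objectives for a cluster-head choice (one head per cluster) ---
def objectives(heads):
    # lifetime proxy: negated worst head-to-member distance (energy drain)
    d_member = max(np.linalg.norm(nodes[members[c]] - nodes[heads[c]], axis=1).max()
                   for c in range(k))
    # connectivity proxy: negated mean head-to-head distance
    d_heads = np.linalg.norm(nodes[heads][:, None] - nodes[heads][None], axis=2).mean()
    return np.array([-d_member, -d_heads])

# --- PCA over sampled candidates: leading component ranks/weights the objectives ---
sample = np.array([objectives([int(rng.choice(m)) for m in members])
                   for _ in range(100)])
eigvals, eigvecs = np.linalg.eigh(np.cov(sample.T))
weights = np.abs(eigvecs[:, -1])            # importance weights from leading component
weights /= weights.sum()

def fitness(heads):
    return float(objectives(heads) @ weights)   # scalarized multi-objective score

# --- Tiny GA: a chromosome is one head index per cluster ---
pop = [[int(rng.choice(m)) for m in members] for _ in range(30)]
for _ in range(40):
    pop.sort(key=fitness, reverse=True)
    elite, children = pop[:10], []
    for _ in range(20):
        a, b = rng.choice(10, 2, replace=False)
        child = [elite[a][c] if rng.random() < 0.5 else elite[b][c] for c in range(k)]
        if rng.random() < 0.2:              # mutation: reassign one cluster's head
            c = int(rng.integers(k))
            child[c] = int(rng.choice(members[c]))
        children.append(child)
    pop = elite + children
best = max(pop, key=fitness)                # near-optimal head assignment
```

Scalarizing via the leading principal component is one simple way to read the abstract's "eliminate the objectives' dependencies and rank their importance levels"; the paper's actual model and convergence proof are more involved.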