Improved performance in distributed estimation by convex combination of DNSAF and DNLMS algorithms

Bibliographic Details
Main Authors: Ahmad Pouradabi, Amir Rastegarnia, Azam Khalili, Ali Farzamnia
Format: Proceedings
Language: English
Published: IEEE Xplore 2022
Online Access:https://eprints.ums.edu.my/id/eprint/41739/3/ABSTRACT%20%286%29.pdf
https://eprints.ums.edu.my/id/eprint/41739/2/FULL%20TEXT.pdf
https://eprints.ums.edu.my/id/eprint/41739/
https://ieeexplore.ieee.org/document/9936845
Institution: Universiti Malaysia Sabah
Description
Summary: In diffusion estimation over distributed networks, two characteristic parameters are crucial: the speed of convergence and the steady-state error. The diffusion normalized least mean square (DNLMS) algorithm has low misadjustment error, but it converges slowly. Conversely, the diffusion normalized subband adaptive filter (DNSAF) algorithm converges faster than DNLMS, but its steady-state error is higher. In this paper, the overall performance is improved by combining these algorithms: the convex combination of DNLMS and DNSAF achieves a fast convergence rate together with a low steady-state error. In addition, the introduced algorithm tracks changes in the unknown system more effectively than the conventional algorithms. Simulation results are presented to demonstrate the performance of the proposed method.
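
The combination mechanism described in the abstract can be illustrated with a small sketch. The code below is not the authors' implementation: it runs a single node rather than a diffusion network, and it replaces the DNSAF/DNLMS pair with two NLMS component filters using different step sizes (an assumption made to keep the example self-contained). The convex mixing weight lambda = sigmoid(a) and its gradient update on the combined error follow the standard convex-combination scheme for adaptive filters; all parameter values are illustrative.

```python
import numpy as np

def convex_combination_demo(num_iters=5000, M=16, mu_fast=1.0, mu_slow=0.05,
                            mu_a=100.0, eps=1e-6, noise_std=0.01, seed=0):
    """Single-node sketch: convex combination of a fast and a slow NLMS filter.

    The paper combines DNSAF (fast convergence) and DNLMS (low misadjustment)
    across a diffusion network; here both components are plain NLMS filters
    with different step sizes, which is only a stand-in for those algorithms.
    """
    rng = np.random.default_rng(seed)
    w_true = rng.standard_normal(M)          # unknown system to identify
    w_fast = np.zeros(M)                     # fast-converging component
    w_slow = np.zeros(M)                     # low-steady-state-error component
    a = 0.0                                  # mixing parameter, lam = sigmoid(a)

    x_buf = np.zeros(M)                      # tapped-delay-line input buffer
    mse = []
    for n in range(num_iters):
        x_buf = np.roll(x_buf, 1)
        x_buf[0] = rng.standard_normal()
        d = w_true @ x_buf + noise_std * rng.standard_normal()

        y_fast, y_slow = w_fast @ x_buf, w_slow @ x_buf
        lam = 1.0 / (1.0 + np.exp(-a))       # convex mixing weight in (0, 1)
        y = lam * y_fast + (1.0 - lam) * y_slow
        e = d - y                            # error of the combined filter

        # Each component adapts independently with its own error (NLMS update).
        norm = eps + x_buf @ x_buf
        w_fast += mu_fast * (d - y_fast) / norm * x_buf
        w_slow += mu_slow * (d - y_slow) / norm * x_buf

        # Gradient update of the mixing parameter on the combined error.
        a += mu_a * e * (y_fast - y_slow) * lam * (1.0 - lam)
        a = np.clip(a, -4.0, 4.0)            # keep lam away from hard 0 or 1

        mse.append(e ** 2)
    return np.array(mse)
```

Early in adaptation the fast component dominates (lambda near 1), so the combined filter converges quickly; once the slow component catches up, lambda drifts toward 0 and the combination inherits its lower steady-state error, which is the behavior the abstract attributes to the DNSAF/DNLMS combination.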