Sparse Sequential Generalization of K-means for dictionary training on noisy signals



Bibliographic Details
Main Authors: Sahoo, Sujit Kumar, Makur, Anamitra
Other Authors: School of Electrical and Electronic Engineering
Format: Article
Language: English
Published: 2017
Subjects:
Online Access: https://hdl.handle.net/10356/82295
http://hdl.handle.net/10220/43516
Institution: Nanyang Technological University
Language: English
Description
Summary: Noise incursion is an inherent problem in dictionary training on noisy samples. Enforcing a structural constraint on the dictionary is therefore useful for stable dictionary training. Recently, a sparse dictionary with a predefined sparsity has been proposed as such a structural constraint. However, a fixed sparsity can be too rigid to adapt to the training samples. To address this issue, this article proposes a better solution through a sparse Sequential Generalization of K-means (SGK). The appeal of sparse-SGK is that it does not enforce a predefined, rigid structure on the dictionary; instead, a flexible sparse structure emerges automatically from the training samples, depending on the amount of noise. In addition, a variation of sparse-SGK using an orthogonal base dictionary is proposed for quicker training. The advantages of sparse-SGK are demonstrated via 3-D image denoising. The experimental results confirm that sparse-SGK gives better denoising performance and requires less training time.
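
The record does not include the algorithm itself, but the idea described in the summary can be illustrated with a short sketch. The following is a minimal, illustrative Python/NumPy sketch of a sparse-SGK-style training loop, assuming an orthonormal base dictionary Phi (e.g., a DCT basis, matching the quicker orthogonal-base variant mentioned above), a basic orthogonal matching pursuit for the sparse-coding stage, and a simple error-threshold rule that lets each atom's sparsity over Phi adapt to the noise level. All names and parameters (omp, sparse_sgk, noise_level, target_sparsity) are illustrative and are not taken from the paper.

import numpy as np

def omp(D, y, n_nonzero):
    # Basic orthogonal matching pursuit: greedily pick atoms, refit by least squares.
    residual = y.copy()
    support = []
    x = np.zeros(D.shape[1])
    coeffs = np.zeros(0)
    for _ in range(n_nonzero):
        k = int(np.argmax(np.abs(D.T @ residual)))
        if k in support:
            break
        support.append(k)
        coeffs, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coeffs
    x[support] = coeffs
    return x

def sparse_sgk(Y, Phi, n_atoms, target_sparsity, noise_level, n_iter=10):
    # Learn D = Phi @ A with sparse columns in A; each column keeps only as many
    # coefficients as the noise level requires, rather than a fixed number.
    n, m = Phi.shape
    rng = np.random.default_rng(0)
    A = rng.standard_normal((m, n_atoms))
    A /= np.linalg.norm(Phi @ A, axis=0)
    for _ in range(n_iter):
        # Sparse-coding stage over the current dictionary.
        D = Phi @ A
        X = np.column_stack([omp(D, y, target_sparsity) for y in Y.T])
        # Sequential (SGK-style) dictionary update, one atom at a time.
        for k in range(n_atoms):
            users = np.flatnonzero(X[k, :])
            if users.size == 0:
                continue
            D = Phi @ A  # refresh after previous atom updates
            # Error the k-th atom must explain for the samples that use it.
            E_k = Y[:, users] - D @ X[:, users] + np.outer(D[:, k], X[k, users])
            # SGK-style atom update: least-squares fit of the atom to that error.
            d_k = E_k @ X[k, users] / (X[k, users] @ X[k, users])
            # Re-express d_k sparsely over the orthonormal base Phi, adding
            # coefficients only until the residual drops to the noise level,
            # so the atom's sparsity adapts to the data instead of being fixed.
            a_k = np.zeros(m)
            r = d_k.copy()
            while np.linalg.norm(r) > noise_level and np.count_nonzero(a_k) < m:
                j = int(np.argmax(np.abs(Phi.T @ r)))
                c = Phi[:, j] @ r  # exact coefficient because Phi is orthonormal
                if abs(c) < 1e-12:
                    break
                a_k[j] += c
                r = d_k - Phi @ a_k
            norm = np.linalg.norm(Phi @ a_k)
            if norm > 0:
                A[:, k] = a_k / norm
    return A

# Tiny demo on synthetic data (a stand-in for noisy training patches).
rng = np.random.default_rng(1)
Phi = np.linalg.qr(rng.standard_normal((16, 16)))[0]   # orthonormal base dictionary
Y = rng.standard_normal((16, 200))                     # synthetic noisy samples
A = sparse_sgk(Y, Phi, n_atoms=32, target_sparsity=3, noise_level=0.5)
print("trained dictionary shape:", (Phi @ A).shape)

In this sketch, the atom update is the plain least-squares step that distinguishes SGK from K-SVD (which would instead take a rank-1 SVD of E_k and also rewrite the coefficients), and the error-threshold stopping rule is what lets each atom's sparsity over Phi vary with the noise rather than being fixed in advance.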