Sparse Sequential Generalization of K-means for dictionary training on noisy signals
Format: Article
Language: English
Published: 2017
Online Access: https://hdl.handle.net/10356/82295 ; http://hdl.handle.net/10220/43516
Institution: Nanyang Technological University
Summary: Noise incursion is an inherent problem when training a dictionary on noisy samples. Enforcing a structural constraint on the dictionary therefore helps to stabilize the training. Recently, a sparse dictionary with a predefined sparsity level has been proposed as such a structural constraint. However, a fixed sparsity can be too rigid to adapt to the training samples. To address this issue, this article proposes a better solution through a sparse Sequential Generalization of K-means (sparse-SGK). The appeal of sparse-SGK is that it does not enforce a predefined rigid structure on the dictionary; instead, a flexible sparse structure emerges automatically from the training samples, depending on the amount of noise. In addition, a variation of sparse-SGK using an orthogonal base dictionary is proposed for quicker training. The advantages of sparse-SGK are demonstrated via 3-D image denoising. The experimental results confirm that sparse-SGK gives better denoising performance and requires less training time.
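To make the idea in the summary concrete, below is a minimal, illustrative sketch of an SGK-style atom update under a sparse dictionary structure D = B·A (base dictionary B, sparse matrix A). It is not taken from the article: the function names (`omp_err`, `sparse_sgk_atom_update`), the error-threshold OMP stopping rule, and the exact update formulas are assumptions chosen only to mirror the abstract's description of a dictionary sparsity that adapts to the noise level rather than being fixed in advance.

```python
import numpy as np

def omp_err(B, y, err_tol, max_atoms=None):
    """Orthogonal Matching Pursuit with an error-based stopping rule.

    Columns of B are selected greedily until the residual norm drops
    below err_tol (or max_atoms is reached), so the number of nonzeros
    adapts to the target signal instead of being fixed beforehand.
    """
    n_atoms = B.shape[1]
    if max_atoms is None:
        max_atoms = n_atoms
    residual = y.copy()
    support = []
    coef = np.zeros(n_atoms)
    while np.linalg.norm(residual) > err_tol and len(support) < max_atoms:
        idx = int(np.argmax(np.abs(B.T @ residual)))
        if idx in support:
            break
        support.append(idx)
        sub = B[:, support]
        sol, *_ = np.linalg.lstsq(sub, y, rcond=None)
        residual = y - sub @ sol
    if support:
        coef[support] = sol
    return coef

def sparse_sgk_atom_update(B, A, X, Y, k, err_tol):
    """One atom update in the spirit of sparse-SGK (illustrative sketch).

    The dictionary is D = B @ A. Following the SGK idea, the new atom is
    the least-squares fit to the residual of the samples that currently
    use atom k; it is then coded over the base dictionary B with an
    error-driven OMP, so its sparsity adapts to the amount of noise.
    """
    omega = np.nonzero(X[k, :])[0]          # samples that use atom k
    if omega.size == 0:
        return A[:, k]                      # unused atom: leave unchanged
    D = B @ A
    # Residual of those samples with atom k's contribution removed
    E = Y[:, omega] - D @ X[:, omega] + np.outer(D[:, k], X[k, omega])
    # SGK-style least-squares atom estimate: d = E g / (g' g)
    g = X[k, omega]
    d = E @ g / (g @ g)
    # Represent the atom sparsely over the base dictionary
    a = omp_err(B, d, err_tol)
    norm = np.linalg.norm(B @ a)
    return a / norm if norm > 0 else A[:, k]
```

In this sketch the error threshold plays the role the abstract attributes to the noise level: a noisier training set leaves a larger residual, and the OMP stopping rule then admits fewer or more nonzeros per atom accordingly, so no sparsity level has to be fixed up front.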