Sparse Sequential Generalization of K-means for dictionary training on noisy signals
Noise incursion is an inherent problem in dictionary training on noisy samples. Therefore, enforcing a structural constraint on the dictionary is useful for stable dictionary training. Recently, a sparse dictionary with predefined sparsity has been proposed as a structural constraint. However, a fixed sparsity can become too rigid to adapt to the training samples. To address this issue, this article proposes a better solution through sparse Sequential Generalization of K-means (SGK). The beauty of sparse-SGK is that it does not enforce a predefined rigid structure on the dictionary. Instead, a flexible sparse structure automatically emerges from the training samples depending on the amount of noise. In addition, a variation of sparse-SGK using an orthogonal base dictionary is proposed for quicker training. The advantages of sparse-SGK are demonstrated via 3-D image denoising. The experimental results confirm that sparse-SGK has better denoising performance and requires less training time.
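The abstract only outlines the idea of a dictionary whose atoms are sparse over a base dictionary, with the sparse structure adapting to the noise level. The sketch below is a rough illustration of that notion, not the authors' algorithm: the helper `sparsify_atoms`, the random orthonormal stand-in base `B`, and the noise-proportional stopping threshold are all illustrative assumptions, not taken from the paper.

```python
# Minimal conceptual sketch, NOT the authors' implementation: it only illustrates
# a dictionary whose atoms are sparse over an orthonormal base dictionary, with
# the per-atom sparsity allowed to vary with the noise level. The function name,
# the random orthonormal base, and the threshold are illustrative assumptions.
import numpy as np

def sparsify_atoms(D, B, noise_sigma):
    """Approximate each atom (column) of D as a sparse combination of the columns
    of an orthonormal base dictionary B, keeping coefficients largest-first until
    the residual drops below a noise-dependent tolerance."""
    n, K = D.shape
    A = np.zeros((B.shape[1], K))              # sparse coefficient matrix
    tol = np.sqrt(n) * noise_sigma             # assumed per-atom error target
    for k in range(K):
        c = B.T @ D[:, k]                      # exact coefficients (B is orthonormal)
        kept = np.zeros_like(c)
        for idx in np.argsort(-np.abs(c)):     # add largest coefficients first
            kept[idx] = c[idx]
            if np.linalg.norm(D[:, k] - B @ kept) <= tol:
                break                          # sparsity of atom k adapts to the noise
        A[:, k] = kept
    return B @ A, A                            # structured dictionary and its sparse code

# Toy usage: atoms that are genuinely sparse over B, perturbed by noise.
rng = np.random.default_rng(0)
n, K, sigma = 64, 100, 0.05
B = np.linalg.qr(rng.standard_normal((n, n)))[0]   # stand-in orthonormal base
A_true = np.zeros((n, K))
for k in range(K):
    support = rng.choice(n, size=5, replace=False)
    A_true[support, k] = rng.standard_normal(5)
D = B @ A_true + sigma * rng.standard_normal((n, K))
D /= np.linalg.norm(D, axis=0)
D_structured, A = sparsify_atoms(D, B, sigma)
print("average nonzeros per atom:", np.count_nonzero(A) / K)
```

With a noisier (looser) threshold each atom keeps fewer coefficients, which is the kind of data-driven sparsity the abstract contrasts with a predefined, fixed sparsity.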
| Main Authors: | Sahoo, Sujit Kumar; Makur, Anamitra |
| --- | --- |
| Other Authors: | School of Electrical and Electronic Engineering |
| Format: | Article |
| Language: | English |
| Published: | 2017 |
| Subjects: | Denoising; Sparse representation |
| Online Access: | https://hdl.handle.net/10356/82295 http://hdl.handle.net/10220/43516 |
| Institution: | Nanyang Technological University |
| Language: | English |
| id | sg-ntu-dr.10356-82295 |
| --- | --- |
| record_format | dspace |
| spelling | sg-ntu-dr.10356-82295 2020-03-07T14:02:38Z Sparse Sequential Generalization of K-means for dictionary training on noisy signals Sahoo, Sujit Kumar Makur, Anamitra School of Electrical and Electronic Engineering Denoising Sparse representation Noise incursion is an inherent problem in dictionary training on noisy samples. Therefore, enforcing a structural constraint on the dictionary is useful for stable dictionary training. Recently, a sparse dictionary with predefined sparsity has been proposed as a structural constraint. However, a fixed sparsity can become too rigid to adapt to the training samples. To address this issue, this article proposes a better solution through sparse Sequential Generalization of K-means (SGK). The beauty of sparse-SGK is that it does not enforce a predefined rigid structure on the dictionary. Instead, a flexible sparse structure automatically emerges from the training samples depending on the amount of noise. In addition, a variation of sparse-SGK using an orthogonal base dictionary is proposed for quicker training. The advantages of sparse-SGK are demonstrated via 3-D image denoising. The experimental results confirm that sparse-SGK has better denoising performance and requires less training time. MOE (Min. of Education, S’pore) Accepted version 2017-08-02T01:52:38Z 2019-12-06T14:52:43Z 2017-08-02T01:52:38Z 2019-12-06T14:52:43Z 2016 Journal Article Sahoo, S. K., & Makur, A. (2016). Sparse Sequential Generalization of K-means for dictionary training on noisy signals. Signal Processing, 129, 62-66. 0165-1684 https://hdl.handle.net/10356/82295 http://hdl.handle.net/10220/43516 10.1016/j.sigpro.2016.05.036 en Signal Processing © 2016 Elsevier. This is the author created version of a work that has been peer reviewed and accepted for publication by Signal Processing, Elsevier. It incorporates referee’s comments but changes resulting from the publishing process, such as copyediting, structural formatting, may not be reflected in this document. The published version is available at: [http://dx.doi.org/10.1016/j.sigpro.2016.05.036]. 15 p. application/pdf |
| institution | Nanyang Technological University |
| building | NTU Library |
| country | Singapore |
| collection | DR-NTU |
| language | English |
| topic | Denoising Sparse representation |
| spellingShingle | Denoising Sparse representation Sahoo, Sujit Kumar Makur, Anamitra Sparse Sequential Generalization of K-means for dictionary training on noisy signals |
| description | Noise incursion is an inherent problem in dictionary training on noisy samples. Therefore, enforcing a structural constraint on the dictionary is useful for stable dictionary training. Recently, a sparse dictionary with predefined sparsity has been proposed as a structural constraint. However, a fixed sparsity can become too rigid to adapt to the training samples. To address this issue, this article proposes a better solution through sparse Sequential Generalization of K-means (SGK). The beauty of sparse-SGK is that it does not enforce a predefined rigid structure on the dictionary. Instead, a flexible sparse structure automatically emerges from the training samples depending on the amount of noise. In addition, a variation of sparse-SGK using an orthogonal base dictionary is proposed for quicker training. The advantages of sparse-SGK are demonstrated via 3-D image denoising. The experimental results confirm that sparse-SGK has better denoising performance and requires less training time. |
| author2 | School of Electrical and Electronic Engineering |
| author_facet | School of Electrical and Electronic Engineering Sahoo, Sujit Kumar Makur, Anamitra |
| format | Article |
| author | Sahoo, Sujit Kumar Makur, Anamitra |
| author_sort | Sahoo, Sujit Kumar |
| title | Sparse Sequential Generalization of K-means for dictionary training on noisy signals |
| title_short | Sparse Sequential Generalization of K-means for dictionary training on noisy signals |
| title_full | Sparse Sequential Generalization of K-means for dictionary training on noisy signals |
| title_fullStr | Sparse Sequential Generalization of K-means for dictionary training on noisy signals |
| title_full_unstemmed | Sparse Sequential Generalization of K-means for dictionary training on noisy signals |
| title_sort | sparse sequential generalization of k-means for dictionary training on noisy signals |
| publishDate | 2017 |
| url | https://hdl.handle.net/10356/82295 http://hdl.handle.net/10220/43516 |
| _version_ | 1681036940743278592 |