EK-NNclus: A clustering procedure based on the evidential K-nearest neighbor rule
Main Authors:
Format: Journal
Published: 2018
Subjects:
Online Access: https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=84941598843&origin=inward
               http://cmuir.cmu.ac.th/jspui/handle/6653943832/54234
Institution: Chiang Mai University
Summary: © 2015 Elsevier B.V. All rights reserved. We propose a new clustering algorithm based on the evidential K-nearest neighbor (EK-NN) rule. Starting from an initial partition, the algorithm, called EK-NNclus, iteratively reassigns objects to clusters using the EK-NN rule, until a stable partition is obtained. After convergence, the cluster membership of each object is described by a Dempster-Shafer mass function assigning a mass to each cluster and to the whole set of clusters. The mass assigned to the set of clusters can be used to identify outliers. The method can be implemented in a competitive Hopfield neural network, whose energy function is related to the plausibility of the partition. The procedure can thus be seen as searching for the most plausible partition of the data. The EK-NNclus algorithm can be set up to depend on two parameters, the number K of neighbors and a scale parameter, both of which can be fixed using simple heuristics. The number of clusters does not need to be determined in advance. Numerical experiments with a variety of datasets show that the method generally performs better than density-based and model-based procedures for finding a partition with an unknown number of clusters.
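The summary describes an iterative reassignment loop: each object is reassigned, via the EK-NN rule, to the cluster best supported by its K nearest neighbors, until the partition stops changing. The Python sketch below illustrates that loop under assumptions not stated in this record: a Gaussian-decay similarity `0.9 * exp(-gamma * d**2)`, a median-distance heuristic for the scale parameter `gamma`, and an initial partition with one cluster per object. The function and parameter names (`ek_nnclus`, `gamma`, `n_init_clusters`) are illustrative, not taken from the paper, and the mass-function and Hopfield-network aspects mentioned in the abstract are omitted.

```python
# Rough sketch of the EK-NNclus reassignment loop described in the summary.
# The similarity function, scale heuristic, and initialization are assumptions.
import numpy as np
from scipy.spatial.distance import cdist


def ek_nnclus(X, K=10, gamma=None, n_init_clusters=None, max_iter=100, seed=0):
    """Iteratively reassign objects to clusters using an EK-NN-style rule."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    K = min(K, n - 1)

    D = cdist(X, X)                          # pairwise Euclidean distances
    np.fill_diagonal(D, np.inf)              # an object is not its own neighbor
    knn = np.argsort(D, axis=1)[:, :K]       # indices of the K nearest neighbors

    if gamma is None:                        # simple heuristic for the scale parameter
        gamma = 1.0 / np.median(D[np.isfinite(D)]) ** 2
    alpha = 0.9 * np.exp(-gamma * D ** 2)    # similarity of each pair (assumed form)
    v = -np.log(1.0 - alpha)                 # evidence weight carried by each neighbor

    if n_init_clusters is None:
        labels = np.arange(n)                # start with one cluster per object
    else:
        labels = rng.integers(0, n_init_clusters, size=n)

    for _ in range(max_iter):
        changed = False
        for i in rng.permutation(n):         # visit objects in random order
            neigh = knn[i]
            clusters = np.unique(labels[neigh])
            # accumulate the evidence each candidate cluster receives from the K neighbors
            scores = np.array([v[i, neigh[labels[neigh] == c]].sum()
                               for c in clusters])
            new_label = clusters[np.argmax(scores)]
            if new_label != labels[i]:
                labels[i] = new_label
                changed = True
        if not changed:                      # stable partition reached
            break

    # relabel the surviving clusters as 0..m-1, dropping empty ones
    _, labels = np.unique(labels, return_inverse=True)
    return labels
```

Calling `ek_nnclus(X, K=20)` on an (n, d) data array returns a label vector whose number of distinct values is determined by the procedure itself, consistent with the summary's statement that the number of clusters need not be fixed in advance.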