CenKNN: A scalable and effective text classifier


Bibliographic Details
Main Authors: PANG, Guansong, JIN, Huidong, JIANG, Shengyi
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2014
Subjects:
KNN
Online Access:https://ink.library.smu.edu.sg/sis_research/7027
https://ink.library.smu.edu.sg/context/sis_research/article/8030/viewcontent/Pang2015_Article_CenKNNAScalableAndEffectiveTex.pdf
Institution: Singapore Management University
Description
Summary: A big challenge in text classification is to perform classification on a large-scale and high-dimensional text corpus in the presence of imbalanced class distributions and a large number of irrelevant or noisy term features. A number of techniques have been proposed to handle this challenge with varying degrees of success. In this paper, by combining the strengths of two widely used text classification techniques, K-Nearest-Neighbor (KNN) and centroid-based (Centroid) classifiers, we propose a scalable and effective flat classifier, called CenKNN, to cope with this challenge. CenKNN projects high-dimensional (often hundreds of thousands of features) documents into a low-dimensional (normally a few dozen) space spanned by class centroids, and then uses the k-d tree structure to find the K nearest neighbors efficiently. Due to the strong representation power of class centroids, CenKNN overcomes two issues related to existing KNN text classifiers, i.e., sensitivity to imbalanced class distributions and to irrelevant or noisy term features. By working on projected low-dimensional data, CenKNN substantially reduces the expensive computation time in KNN. CenKNN also works better than Centroid since it uses all the class centroids to define similarity and works well on complex data, i.e., non-linearly separable data and data with local patterns within each class. A series of experiments on both English and Chinese, benchmark and synthetic corpora demonstrates that although CenKNN works on a significantly lower-dimensional space, it performs substantially better than KNN and its five variants, as well as existing scalable classifiers, including Centroid and Rocchio. CenKNN is also empirically preferable to another well-known classifier, support vector machines, on highly imbalanced corpora with a small number of classes.
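The summary describes a two-step pipeline: represent each document by its similarity to every class centroid (reducing dimensionality from the vocabulary size to the number of classes), then run KNN on that low-dimensional representation via a k-d tree. The following is a minimal sketch of that idea, not the authors' exact method: it assumes tf-idf row vectors, cosine similarity to centroids as the projection, and SciPy's cKDTree for neighbor search; the function names are illustrative only.

```python
import numpy as np
from collections import Counter
from scipy.spatial import cKDTree  # assumed dependency for k-d tree search

def class_centroids(X, y):
    # Mean vector of each class's documents (rows of X are tf-idf vectors).
    classes = np.unique(y)
    return classes, np.vstack([X[y == c].mean(axis=0) for c in classes])

def project(X, centroids):
    # Represent each document by its cosine similarity to every class
    # centroid, so dimensionality drops from |vocabulary| to |classes|.
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    Cn = centroids / np.linalg.norm(centroids, axis=1, keepdims=True)
    return Xn @ Cn.T

def cenknn_fit_predict(X_train, y_train, X_test, k=5):
    # Step 1: project both training and test documents onto centroid space.
    _, cents = class_centroids(X_train, y_train)
    low_train = project(X_train, cents)
    low_test = project(X_test, cents)
    # Step 2: k-d tree KNN in the low-dimensional space.
    tree = cKDTree(low_train)
    _, idx = tree.query(low_test, k=k)
    idx = np.atleast_2d(idx)
    # Majority vote among the K nearest training documents.
    return np.array([Counter(y_train[row]).most_common(1)[0][0]
                     for row in idx])
```

Because the k-d tree operates on a space whose dimensionality equals the number of classes (a few dozen) rather than the vocabulary size, neighbor queries stay fast even on large corpora, which is the scalability argument made in the abstract.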