Reconciliation of statistical and spatial sparsity for robust visual classification

Bibliographic Details
Main Authors: Cheng, Hao, Yap, Kim-Hui, Wen, Bihan
Other Authors: School of Electrical and Electronic Engineering
Format: Article
Language: English
Published: 2023
Subjects:
Online Access: https://hdl.handle.net/10356/172865
Institution: Nanyang Technological University
Description
Summary: Recent image classification algorithms, by learning deep features from large-scale datasets, have achieved significantly better results compared to classic feature-based approaches. However, various challenges remain in practical image classification, such as classifying noisy image or image-set queries and training deep classification models on limited-scale datasets. Instead of applying generic deep features, model-based approaches can be more effective and data-efficient for robust image and image-set classification tasks, as various image priors are exploited to model the inter- and intra-set data variations while preventing over-fitting. In this work, we propose a novel Joint Statistical and Spatial Sparse representation scheme, dubbed J3S, to model image or image-set data for classification. J3S utilizes joint sparse representation to reconcile both the local image structures and the global Gaussian distribution mapped onto the Riemannian manifold. The learned J3S models are used for robust image and image-set classification tasks. Experiments show that the proposed J3S-based image classification scheme outperforms popular and state-of-the-art competing methods on the FMD, UIUC, ETH-80 and YTC databases.
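
The abstract describes J3S only at a high level. As a rough, non-authoritative illustration of the kind of scheme it outlines (a spatial view based on local image features and a statistical view based on a Gaussian/covariance descriptor embedded via the log-Euclidean map on the SPD manifold, coupled through a shared sparsity pattern and classified by class-wise residuals), the Python sketch below may help. The feature choices, the ISTA solver, and all function names and parameters are assumptions for illustration, not the paper's actual J3S formulation.

    # Illustrative sketch only: NOT the authors' J3S model.
    # Two views of a query (spatial feature y_spa, statistical feature y_sta)
    # are coded over paired dictionaries with a row-sparse (l2,1) penalty,
    # so the two views share one support; classification picks the class
    # whose atoms give the smallest combined reconstruction residual.
    import numpy as np
    from scipy.linalg import logm

    def log_euclidean_embedding(X, eps=1e-6):
        """Map a set of local feature vectors X (d x n) to the vectorized
        matrix logarithm of its covariance, i.e. a log-Euclidean flattening
        of a point on the SPD manifold (assumed statistical descriptor)."""
        C = np.cov(X) + eps * np.eye(X.shape[0])
        L = logm(C).real
        return L[np.triu_indices_from(L)]

    def joint_sparse_code(D_spa, D_sta, y_spa, y_sta, lam=0.1, n_iter=300):
        """ISTA for
           min_A 0.5||y_spa - D_spa A[:,0]||^2 + 0.5||y_sta - D_sta A[:,1]||^2
                 + lam * sum_k ||A[k,:]||_2,
        where the l2,1 row penalty couples the spatial and statistical codes."""
        K = D_spa.shape[1]
        A = np.zeros((K, 2))
        # Conservative step size from the larger Lipschitz constant of the two fits.
        L = max(np.linalg.norm(D_spa, 2) ** 2, np.linalg.norm(D_sta, 2) ** 2)
        for _ in range(n_iter):
            g = np.stack([D_spa.T @ (D_spa @ A[:, 0] - y_spa),
                          D_sta.T @ (D_sta @ A[:, 1] - y_sta)], axis=1)
            Z = A - g / L
            # Row-wise soft thresholding (proximal operator of the l2,1 norm).
            norms = np.linalg.norm(Z, axis=1, keepdims=True)
            shrink = np.maximum(1.0 - (lam / L) / np.maximum(norms, 1e-12), 0.0)
            A = shrink * Z
        return A

    def classify(D_spa, D_sta, labels, y_spa, y_sta, lam=0.1):
        """Assign the label whose dictionary atoms yield the smallest
        combined residual over both views (SRC-style decision rule)."""
        A = joint_sparse_code(D_spa, D_sta, y_spa, y_sta, lam)
        best, best_r = None, np.inf
        for c in np.unique(labels):
            mask = (labels == c)
            r = (np.linalg.norm(y_spa - D_spa[:, mask] @ A[mask, 0]) ** 2
                 + np.linalg.norm(y_sta - D_sta[:, mask] @ A[mask, 1]) ** 2)
            if r < best_r:
                best, best_r = c, r
        return best

In this sketch the dictionaries D_spa and D_sta would be built column-wise from training images or image sets (one spatial and one statistical descriptor per atom, with labels recording each atom's class); the shared row support is what stands in for the "reconciliation" of the two sparsity types described in the abstract.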