Estimating latent relative labeling importances for multi-label learning

In multi-label learning, each instance is associated with multiple labels simultaneously. Most existing approaches treat each label in a crisp manner, i.e., a class label is either relevant or irrelevant to the instance, so the latent relative importance of each relevant label is ignored. In this paper, we propose a novel multi-label learning approach that estimates the latent labeling importances while simultaneously training the inductive model. Specifically, we present a biconvex formulation with both instance and label graph regularization, and solve it in an alternating manner. On the one hand, the inductive model is trained by minimizing the least squares loss of fitting the latent relative labeling importances. On the other hand, the latent relative labeling importances are estimated from the model outputs via a specially constrained label propagation procedure. Through the mutual adaptation of the inductive model training and the constrained label propagation, an effective multi-label learning model is built by optimally estimating the latent relative labeling importances. Extensive experimental results clearly show the effectiveness of the proposed approach.
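
The abstract describes an alternating scheme: fit an inductive model to the current importance estimates by regularized least squares, then re-estimate the importances by propagating the model outputs over a similarity graph under the constraint that only the given relevant labels receive importance mass. The NumPy sketch below is only a rough, hypothetical reconstruction of that idea for illustration; the graph construction, the propagation rule, and all names (train_alternating, alpha, lam, k) are assumptions made here, not the authors' formulation.

    # Hypothetical sketch of the alternating scheme outlined in the abstract.
    # NOT the paper's exact objective or update rules; all parameter names
    # and design choices here are illustrative assumptions.
    import numpy as np

    def train_alternating(X, Y, n_iters=10, lam=1.0, alpha=0.5, k=5):
        """X: (n, d) features; Y: (n, q) binary relevance matrix (1 = relevant).

        Alternates between
          (1) fitting a linear model W to the current importance estimates U
              by regularized least squares, and
          (2) re-estimating U by propagating the model outputs over an
              instance-similarity graph, constrained so that only labels
              marked relevant in Y receive non-zero importance.
        """
        n, d = X.shape

        # Instance graph: symmetric k-nearest-neighbour Gaussian affinities.
        dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
        sigma = np.median(dist) + 1e-12
        S = np.exp(-(dist ** 2) / (2 * sigma ** 2))
        np.fill_diagonal(S, 0.0)
        idx = np.argsort(-S, axis=1)[:, :k]
        A = np.zeros_like(S)
        rows = np.repeat(np.arange(n), k)
        A[rows, idx.ravel()] = S[rows, idx.ravel()]
        A = np.maximum(A, A.T)
        P = A / (A.sum(axis=1, keepdims=True) + 1e-12)   # row-stochastic

        # Initial importances: uniform over the relevant labels of each instance.
        U = Y / (Y.sum(axis=1, keepdims=True) + 1e-12)

        for _ in range(n_iters):
            # Step 1: ridge regression of the latent importances on the features.
            W = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ U)
            F = X @ W                                     # model outputs

            # Step 2: one constrained propagation step mixing graph smoothing
            # with the model outputs, then masking out irrelevant labels and
            # renormalizing each row to sum to one.
            U = alpha * (P @ U) + (1 - alpha) * F
            U = np.clip(U, 0.0, None) * Y
            U = U / (U.sum(axis=1, keepdims=True) + 1e-12)

        return W, U

Under these assumptions, a new instance x would be scored at test time by x @ W, with thresholding or top-k selection over the resulting label scores yielding the predicted label set.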

Bibliographic Details
Main Authors: He, Shuo; Feng, Lei; Li, Li
Other Authors: School of Computer Science and Engineering
Format: Conference or Workshop Item
Language: English
Published: 2020
Subjects: Engineering::Computer science and engineering; Multi-label Learning; Relevant Labels
Online Access: https://hdl.handle.net/10356/143866
Institution: Nanyang Technological University
Record ID: sg-ntu-dr.10356-143866
Conference: 2018 IEEE International Conference on Data Mining (ICDM)
Version: Accepted version
Citation: He, S., Feng, L., & Li, L. (2018). Estimating latent relative labeling importances for multi-label learning. Proceedings of 2018 IEEE International Conference on Data Mining (ICDM), 1013-1018. doi:10.1109/ICDM.2018.00127
ISBN: 978-1-5386-9160-1
DOI: 10.1109/ICDM.2018.00127
Pages: 1013-1018
Deposited: 2020-09-28
Collection: DR-NTU (NTU Library, Singapore)
Rights: © 2018 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. The published version is available at: https://doi.org/10.1109/ICDM.2018.00127.