Robust factorization machine: A doubly capped norms minimization
Factorization Machine (FM) is a general supervised learning framework for many AI applications due to its powerful capability of feature engineering. Despite being extensively studied, existing FM methods share several limitations. First, most adopt the squared loss in the modeling process, which can be very sensitive when the training data contain noise and outliers. Second, some recent FM variants exploit the low-rank structure of the feature-interaction matrix by relaxing the low-rank minimization problem to a trace norm minimization, which cannot always achieve a tight approximation to the original problem. To address these issues, this paper proposes a new scheme of Robust Factorization Machine (RFM) based on a doubly capped norms minimization approach, which employs a capped squared trace norm to achieve a tighter approximation of the rank minimization and a capped ℓ1-norm loss to enhance the robustness of the empirical loss minimization on noisy data. We develop an efficient algorithm for RFM with a rigorous convergence proof. Experiments on public real-world datasets show that our method significantly outperforms state-of-the-art FM methods.
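The abstract names the two ingredients but not the formulation. As a rough illustration only, a "doubly capped" FM objective of the kind described typically combines the standard second-order FM predictor with a capped ℓ1 loss on the residuals and a capped squared trace norm on the feature-interaction matrix; the thresholds ε1, ε2 and the weight λ below are placeholder notation, not symbols or values taken from the paper (see the linked PDF for the exact objective and algorithm).

```latex
% A sketch under assumed notation (not taken from the paper):
% second-order FM predictor with a full feature-interaction matrix Z,
% a capped ell_1 loss on the residuals, and a capped squared trace norm on Z.
\hat{y}(\mathbf{x}) = w_0 + \mathbf{w}^{\top}\mathbf{x} + \sum_{i<j} Z_{ij}\, x_i x_j

\min_{w_0,\,\mathbf{w},\,Z}\;
   \sum_{n=1}^{N} \min\!\bigl(\lvert y_n - \hat{y}(\mathbf{x}_n)\rvert,\ \varepsilon_1\bigr)
   \;+\; \lambda \sum_{j} \min\!\bigl(\sigma_j^{2}(Z),\ \varepsilon_2\bigr)
```

Here σ_j(Z) denotes the j-th singular value of Z. Capping the loss means residuals larger than ε1 all incur the same penalty, so outliers cannot dominate the fit; capping the squared singular values means large singular values stop being penalized, which approximates rank(Z) more tightly than the ordinary (uncapped) trace norm.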
Main Authors: | LIU, Chenghao; ZHANG, Teng; LI, Jundong; YIN, Jianwen; ZHAO, Peilin; SUN, Jianling; HOI, Steven C. H. |
---|---|
Format: | text |
Language: | English |
Published: | Institutional Knowledge at Singapore Management University, 2019 |
Subjects: | Factorization machines; Feature engineerings; Feature interactions; Loss minimization; Modeling process; Rank minimizations; Real-world datasets; State of the art; Artificial Intelligence and Robotics; Databases and Information Systems; Software Engineering |
Online Access: | https://ink.library.smu.edu.sg/sis_research/4389 https://ink.library.smu.edu.sg/context/sis_research/article/5392/viewcontent/SDM19_RFM.pdf |
Institution: | Singapore Management University |
id | sg-smu-ink.sis_research-5392 |
---|---|
record_format | dspace |
spelling | sg-smu-ink.sis_research-5392 2020-03-24T06:01:40Z Robust factorization machine: A doubly capped norms minimization LIU, Chenghao ZHANG, Teng LI, Jundong YIN, Jianwen ZHAO, Peilin SUN, Jianling HOI, Steven C. H. Factorization Machine (FM) is a general supervised learning framework for many AI applications due to its powerful capability of feature engineering. Despite being extensively studied, existing FM methods have several limitations in common. First of all, most existing FM methods often adopt the squared loss in the modeling process, which can be very sensitive when the data for learning contains noises and outliers. Second, some recent FM variants often explore the low-rank structure of the feature interactions matrix by relaxing the low-rank minimization problem as a trace norm minimization, which cannot always achieve a tight approximation to the original one. To address the aforementioned issues, this paper proposes a new scheme of Robust Factorization Machine (RFM) by exploring a doubly capped norms minimization approach, which employs both a capped squared trace norm in achieving a tighter approximation of the rank minimization and a capped ℓ1-norm loss to enhance the robustness of the empirical loss minimization from noisy data. We develop an efficient algorithm with a rigorous convergence proof of RFM. Experiments on public real-world datasets show that our method outperforms the state-of-the-art FM methods significantly. 2019-05-01T07:00:00Z text application/pdf https://ink.library.smu.edu.sg/sis_research/4389 info:doi/10.1137/1.9781611975673.83 https://ink.library.smu.edu.sg/context/sis_research/article/5392/viewcontent/SDM19_RFM.pdf http://creativecommons.org/licenses/by-nc-nd/4.0/ Research Collection School Of Computing and Information Systems eng Institutional Knowledge at Singapore Management University Factorization machines Feature engineerings Feature interactions Loss minimization Modeling process Rank minimizations Real-world datasets State of the art Artificial Intelligence and Robotics Databases and Information Systems Software Engineering |
institution | Singapore Management University |
building | SMU Libraries |
continent | Asia |
country | Singapore |
content_provider | SMU Libraries |
collection | InK@SMU |
language | English |
topic | Factorization machines; Feature engineerings; Feature interactions; Loss minimization; Modeling process; Rank minimizations; Real-world datasets; State of the art; Artificial Intelligence and Robotics; Databases and Information Systems; Software Engineering |
description | Factorization Machine (FM) is a general supervised learning framework for many AI applications due to its powerful capability of feature engineering. Despite being extensively studied, existing FM methods have several limitations in common. First of all, most existing FM methods often adopt the squared loss in the modeling process, which can be very sensitive when the data for learning contains noises and outliers. Second, some recent FM variants often explore the low-rank structure of the feature interactions matrix by relaxing the low-rank minimization problem as a trace norm minimization, which cannot always achieve a tight approximation to the original one. To address the aforementioned issues, this paper proposes a new scheme of Robust Factorization Machine (RFM) by exploring a doubly capped norms minimization approach, which employs both a capped squared trace norm in achieving a tighter approximation of the rank minimization and a capped ℓ1-norm loss to enhance the robustness of the empirical loss minimization from noisy data. We develop an efficient algorithm with a rigorous convergence proof of RFM. Experiments on public real-world datasets show that our method outperforms the state-of-the-art FM methods significantly. |
format | text |
author | LIU, Chenghao; ZHANG, Teng; LI, Jundong; YIN, Jianwen; ZHAO, Peilin; SUN, Jianling; HOI, Steven C. H. |
title | Robust factorization machine: A doubly capped norms minimization |
publisher | Institutional Knowledge at Singapore Management University |
publishDate | 2019 |
url | https://ink.library.smu.edu.sg/sis_research/4389 https://ink.library.smu.edu.sg/context/sis_research/article/5392/viewcontent/SDM19_RFM.pdf |