Multi-representation Variational Autoencoder via iterative latent attention and implicit differentiation

Variational Autoencoder (VAE) offers non-linear probabilistic modeling of users' preferences. While it has achieved remarkable performance in collaborative filtering, it typically samples a single vector to represent a user's preferences, which may be insufficient to capture the user's diverse interests...

Full description

Saved in:
Bibliographic Details
Main Authors: TRAN, Nhu Thuat, LAUW, Hady Wirawan
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2023
Subjects:
Online Access: https://ink.library.smu.edu.sg/sis_research/8350
https://ink.library.smu.edu.sg/context/sis_research/article/9353/viewcontent/cikm23.pdf
Institution: Singapore Management University
Language: English
id sg-smu-ink.sis_research-9353
record_format dspace
spelling sg-smu-ink.sis_research-9353 2023-12-13T03:32:24Z Multi-representation Variational Autoencoder via iterative latent attention and implicit differentiation TRAN, Nhu Thuat LAUW, Hady Wirawan Variational Autoencoder (VAE) offers non-linear probabilistic modeling of users' preferences. While it has achieved remarkable performance in collaborative filtering, it typically samples a single vector to represent a user's preferences, which may be insufficient to capture the user's diverse interests. Existing solutions extend VAE to model users' multiple interests by resorting to a variant of the self-attentive method, i.e., employing prototypes to group items into clusters, each capturing one topic of a user's interests. Despite showing improvements, the current design could be more effective, since the prototypes are randomly initialized and shared across users, resulting in uninformative and non-personalized clusters. To fill this gap, firstly, we introduce iterative latent attention for personalized item grouping into the VAE framework to infer users' multiple interests. Secondly, we propose to incorporate implicit differentiation to improve the training of our iterative refinement model. Thirdly, we study self-attention to refine cluster prototypes for item grouping, which is largely ignored by existing works. Extensive experiments on three real-world datasets demonstrate the stronger performance of our method over baselines. 2023-10-01T07:00:00Z text application/pdf https://ink.library.smu.edu.sg/sis_research/8350 info:doi/10.1145/3583780.3614980 https://ink.library.smu.edu.sg/context/sis_research/article/9353/viewcontent/cikm23.pdf http://creativecommons.org/licenses/by-nc-nd/4.0/ Research Collection School Of Computing and Information Systems eng Institutional Knowledge at Singapore Management University Information systems Information retrieval Recommender systems Variational Autoencoder Applied Statistics Artificial Intelligence and Robotics Theory and Algorithms
institution Singapore Management University
building SMU Libraries
continent Asia
country Singapore
content_provider SMU Libraries
collection InK@SMU
language English
topic Information systems
Information retrieval
Recommender systems
Variational Autoencoder
Applied Statistics
Artificial Intelligence and Robotics
Theory and Algorithms
spellingShingle Information systems
Information retrieval
Recommender systems
Variational Autoencoder
Applied Statistics
Artificial Intelligence and Robotics
Theory and Algorithms
TRAN, Nhu Thuat
LAUW, Hady Wirawan
Multi-representation Variational Autoencoder via iterative latent attention and implicit differentiation
description Variational Autoencoder (VAE) offers non-linear probabilistic modeling of users' preferences. While it has achieved remarkable performance in collaborative filtering, it typically samples a single vector to represent a user's preferences, which may be insufficient to capture the user's diverse interests. Existing solutions extend VAE to model users' multiple interests by resorting to a variant of the self-attentive method, i.e., employing prototypes to group items into clusters, each capturing one topic of a user's interests. Despite showing improvements, the current design could be more effective, since the prototypes are randomly initialized and shared across users, resulting in uninformative and non-personalized clusters. To fill this gap, firstly, we introduce iterative latent attention for personalized item grouping into the VAE framework to infer users' multiple interests. Secondly, we propose to incorporate implicit differentiation to improve the training of our iterative refinement model. Thirdly, we study self-attention to refine cluster prototypes for item grouping, which is largely ignored by existing works. Extensive experiments on three real-world datasets demonstrate the stronger performance of our method over baselines.
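To make the two ideas named in the description concrete, here is a minimal PyTorch sketch, not the authors' implementation: (1) iterative latent attention that refines a set of interest prototypes over a user's item embeddings, so the resulting clusters are conditioned on the user's own history rather than fixed and shared; and (2) a common stand-in for implicit differentiation, in which the refinement loop is run as a fixed-point solver without gradient tracking and gradients flow only through one final step. All module choices below (GRU update, LayerNorm, three iterations, dimensions) are illustrative assumptions, not details taken from the paper.

```python
# Sketch only: iterative latent attention with a one-step backward pass,
# a first-order approximation of implicit differentiation at the fixed point.
import torch
import torch.nn as nn


class IterativeLatentAttention(nn.Module):
    def __init__(self, dim: int, num_interests: int, num_iters: int = 3):
        super().__init__()
        self.num_iters = num_iters
        # Shared starting point; personalization comes from refining it on the
        # user's own interacted items.
        self.init_prototypes = nn.Parameter(torch.randn(num_interests, dim) * 0.02)
        self.to_q = nn.Linear(dim, dim, bias=False)
        self.to_k = nn.Linear(dim, dim, bias=False)
        self.to_v = nn.Linear(dim, dim, bias=False)
        self.update = nn.GRUCell(dim, dim)
        self.norm = nn.LayerNorm(dim)
        self.scale = dim ** -0.5

    def step(self, prototypes, items, mask):
        """One refinement step: attend over the user's items, update prototypes."""
        b, k, d = prototypes.shape
        q = self.to_q(self.norm(prototypes))              # (B, K, D)
        key, val = self.to_k(items), self.to_v(items)     # (B, N, D)
        attn = torch.einsum("bkd,bnd->bkn", q, key) * self.scale
        # Normalize over interests so items "compete" for clusters, then mask
        # padded items and take a weighted mean per cluster.
        attn = attn.softmax(dim=1) * mask.unsqueeze(1).float()
        attn = attn / attn.sum(dim=-1, keepdim=True).clamp(min=1e-8)
        updates = torch.einsum("bkn,bnd->bkd", attn, val)  # (B, K, D)
        new = self.update(updates.reshape(-1, d), prototypes.reshape(-1, d))
        return new.view(b, k, d)

    def forward(self, items, mask):
        """items: (B, N, D) embeddings of interacted items; mask: (B, N) bool."""
        prototypes = self.init_prototypes.expand(items.size(0), -1, -1)
        # Treat the loop as a fixed-point solver: no gradients through the iterations.
        with torch.no_grad():
            for _ in range(self.num_iters):
                prototypes = self.step(prototypes, items, mask)
        # Back-propagate through one final step only.
        return self.step(prototypes.detach(), items, mask)  # (B, K, D)
```

In this simplified scheme the shared initialization receives no gradient, and full implicit differentiation would instead solve a linear system at the fixed point; the paper's exact treatment, as well as how the K interest vectors parameterize the multi-representation VAE posterior, should be taken from the publication linked above rather than this sketch.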
format text
author TRAN, Nhu Thuat
LAUW, Hady Wirawan
author_facet TRAN, Nhu Thuat
LAUW, Hady Wirawan
author_sort TRAN, Nhu Thuat
title Multi-representation Variational Autoencoder via iterative latent attention and implicit differentiation
title_short Multi-representation Variational Autoencoder via iterative latent attention and implicit differentiation
title_full Multi-representation Variational Autoencoder via iterative latent attention and implicit differentiation
title_fullStr Multi-representation Variational Autoencoder via iterative latent attention and implicit differentiation
title_full_unstemmed Multi-representation Variational Autoencoder via iterative latent attention and implicit differentiation
title_sort multi-representation variational autoencoder via iterative latent attention and implicit differentiation
publisher Institutional Knowledge at Singapore Management University
publishDate 2023
url https://ink.library.smu.edu.sg/sis_research/8350
https://ink.library.smu.edu.sg/context/sis_research/article/9353/viewcontent/cikm23.pdf
_version_ 1787136839137624064