Personalized reviews generation for explainable recommendations

Bibliographic Details
Main Author: Li, Ling
Other Authors: Alex Chichung Kot
Format: Thesis-Master by Research
Language: English
Published: Nanyang Technological University 2023
Online Access:https://hdl.handle.net/10356/169445
Institution: Nanyang Technological University
Description
Summary: In recent years, the recommendation community has paid increasing attention to the interpretability of recommendations. Because recommendation systems behave as black boxes, users who passively receive recommendation results usually do not understand why those results were produced, which directly affects their satisfaction with the system. Existing works usually suffer from ground-truth information leakage because they require an aspect word from the ground truth to steer the generation process. To remedy this problem, we propose a BERT-guided generator for explainable recommendations named ExBERT, which can generate reliable reviews from only user/item IDs and their review histories. A self-attention encoder is adopted to explore the user and item review histories. We also adapt the BERT next-sentence-prediction (NSP) task in our decoder as a contrastive sentence-level classifier, which distinguishes between the full-sentence meanings of positive and negative samples. Moreover, because the sentences generated by most existing works are too general (e.g., "The product is great") rather than containing fine-grained words, we propose a Diffusion Model-based Review Generation approach towards EXplainable Recommendation named Diffusion-EXR. Diffusion-EXR corrupts the sequence of review word embeddings by incrementally introducing varying levels of Gaussian noise and learns to reconstruct the original word representations in the reverse process. The nature of the denoising diffusion probabilistic model (DDPM) enables our lightweight Transformer backbone to perform well on the recommendation review generation task. Extensive experiments show that ExBERT and Diffusion-EXR are effective and significantly outperform state-of-the-art baselines on two real-world explainable recommendation benchmark datasets (i.e., Amazon-Clothing Shoes Jewellery and TripAdvisor).
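
The abstract describes Diffusion-EXR's forward process as gradually adding Gaussian noise to the sequence of review word embeddings, with a lightweight Transformer trained to recover the original representations in the reverse process. The following is a minimal sketch of a standard DDPM-style forward corruption step of that kind; the dimensions, linear noise schedule, and function names are illustrative assumptions and are not taken from the thesis.

```python
import torch

# Hypothetical dimensions; the thesis does not specify these values.
SEQ_LEN, EMB_DIM, NUM_STEPS = 20, 64, 1000

# Linear noise schedule (a common DDPM choice; the actual schedule is an assumption).
betas = torch.linspace(1e-4, 0.02, NUM_STEPS)
alphas_cumprod = torch.cumprod(1.0 - betas, dim=0)

def forward_noise(x0: torch.Tensor, t: torch.Tensor):
    """Corrupt clean word embeddings x0 at diffusion step t with Gaussian noise.

    x0: (batch, SEQ_LEN, EMB_DIM) review word embeddings
    t:  (batch,) integer diffusion steps in [0, NUM_STEPS)
    Returns the noised embeddings x_t and the injected noise, so a denoiser
    (e.g., a lightweight Transformer) can be trained to reconstruct x0.
    """
    noise = torch.randn_like(x0)
    a_bar = alphas_cumprod[t].view(-1, 1, 1)              # \bar{alpha}_t per sample
    x_t = a_bar.sqrt() * x0 + (1.0 - a_bar).sqrt() * noise
    return x_t, noise

# Usage sketch: one training step would noise the embeddings, run the denoiser
# on x_t (conditioned on user/item IDs), and minimise a reconstruction loss
# against x0 or the injected noise.
x0 = torch.randn(8, SEQ_LEN, EMB_DIM)                     # placeholder embeddings
t = torch.randint(0, NUM_STEPS, (8,))
x_t, noise = forward_noise(x0, t)
```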