Tensor factorization for low-rank tensor completion

Recently, a tensor nuclear norm (TNN) based method [1] was proposed to solve the tensor completion problem, and it has achieved state-of-the-art performance on image and video inpainting tasks. However, it requires computing the tensor singular value decomposition (t-SVD), which is computationally expensive and thus cannot efficiently handle large-scale tensor data. Motivated by TNN, we propose a novel low-rank tensor factorization method for efficiently solving the 3-way tensor completion problem. Our method preserves the low-rank structure of a tensor by factorizing it into the product of two tensors of smaller sizes. In the optimization process, our method only needs to update two smaller tensors, which can be done more efficiently than computing the t-SVD. Furthermore, we prove that the proposed alternating minimization algorithm converges to a Karush-Kuhn-Tucker (KKT) point. Experimental results on synthetic data recovery and on image and video inpainting tasks clearly demonstrate the superior performance and efficiency of our method over state-of-the-art approaches, including TNN [1] and matricization methods [2]–[5].
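The core idea — preserving low-rank structure by factorizing into two smaller factors and updating them alternately — can be illustrated on a simplified 2-way (matrix) instance. This is a minimal sketch of generic alternating least squares with missing entries, not the authors' t-product-based TCTF algorithm; the function name, regularizer `lam`, and iteration count are illustrative assumptions.

```python
import numpy as np

def complete_lowrank(M, mask, rank, iters=100, lam=1e-3):
    """Recover M ~ A @ B from observed entries (mask == True) by
    alternating ridge-regularized least-squares updates of A and B.
    Illustrative sketch only, not the TCTF algorithm from the paper."""
    m, n = M.shape
    rng = np.random.default_rng(0)
    A = rng.standard_normal((m, rank))
    B = rng.standard_normal((rank, n))
    I = np.eye(rank)
    for _ in range(iters):
        # Impute unobserved entries with the current estimate,
        # then solve each factor in closed form (ridge least squares).
        X = np.where(mask, M, A @ B)
        A = X @ B.T @ np.linalg.inv(B @ B.T + lam * I)
        B = np.linalg.inv(A.T @ A + lam * I) @ A.T @ X
    return A @ B

# Usage: recover a rank-2 matrix from ~60% observed entries.
rng = np.random.default_rng(1)
M = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 40))
mask = rng.random(M.shape) < 0.6
X = complete_lowrank(M, mask, rank=2)
err = np.linalg.norm(X - M) / np.linalg.norm(M)
```

As in the paper's method, each iteration only updates two small factors (here r x n and m x r matrices), which is what makes factorization approaches cheaper than repeatedly computing a full SVD or t-SVD.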

Bibliographic Details
Main Authors: ZHOU, Pan, LU, Canyi, LIN, Zhouchen, ZHANG, Chao
Format: text
Language:English
Published: Institutional Knowledge at Singapore Management University 2017
Subjects: Tensor Factorization; Tensor Completion; Low-rank Factorization; Databases and Information Systems
Online Access:https://ink.library.smu.edu.sg/sis_research/9057
https://ink.library.smu.edu.sg/context/sis_research/article/10060/viewcontent/2017_TIP_TCTF.pdf
Institution: Singapore Management University
id sg-smu-ink.sis_research-10060
record_format dspace
doi 10.1109/TIP.2017.2762595
license http://creativecommons.org/licenses/by-nc-nd/4.0/
building SMU Libraries
continent Asia
country Singapore
collection Research Collection School Of Computing and Information Systems (InK@SMU)
publishDate 2017-10-01
_version_ 1814047719532003328