Matrix completion from any given set of observations

In the matrix completion problem, the aim is to recover an unknown real matrix from a subset of its entries. This problem comes up in many application areas, and has received a great deal of attention in the context of the Netflix Prize. A central approach to this problem is to output a matrix of lowest possible complexity (e.g. rank or trace norm) that agrees with the partially specified matrix. The performance of this approach under the assumption that the revealed entries are sampled randomly has received considerable attention (e.g. [1, 2, 3, 4, 5, 6, 7, 8]). In practice, the set of revealed entries is often not chosen at random, and these results do not apply; we are then left with no guarantees on the performance of the algorithm we are using. We present a means to obtain performance guarantees with respect to any set of initial observations. The first step remains the same: find a matrix of lowest possible complexity that agrees with the partially specified matrix. We give a new way to interpret the output of this algorithm: we next find a probability distribution over the non-revealed entries with respect to which a bound on the generalization error can be proven. The more complex the set of revealed entries, according to a certain measure, the better the bound on the generalization error.
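The first step described in the abstract (return a minimum-complexity matrix that agrees with the revealed entries) can be illustrated with a small trace-norm minimization. The sketch below is not taken from the paper; it assumes NumPy and cvxpy, and the function name and toy data are illustrative only.

# A minimal sketch of the first step in the abstract: among all matrices that
# agree with the revealed entries, return one of lowest complexity, with
# complexity measured here by the trace norm (nuclear norm).
# NOTE: cvxpy, the function name, and the toy data are assumptions made for
# illustration, not details from the paper.
import numpy as np
import cvxpy as cp

def trace_norm_completion(partial, mask):
    """Return a matrix of minimum trace norm agreeing with the revealed entries.

    partial : observed matrix (values outside `mask` are ignored)
    mask    : boolean array, True where the entry was revealed
    """
    X = cp.Variable(partial.shape)
    # Force agreement with every revealed entry.
    constraints = [X[i, j] == partial[i, j] for i, j in np.argwhere(mask)]
    cp.Problem(cp.Minimize(cp.normNuc(X)), constraints).solve()
    return X.value

# Toy usage: a rank-1 target observed on an arbitrary (non-random) set of entries.
M = np.outer([1.0, 2.0, 3.0], [1.0, -1.0, 2.0])
mask = np.array([[True, True, False],
                 [True, False, True],
                 [False, True, True]])
completed = trace_norm_completion(M, mask)

Note that nothing in this sketch requires the revealed set to be random, which is exactly the setting the paper addresses: the algorithm is unchanged, and the paper's contribution is the guarantee attached to its output.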


Bibliographic Details
Main Authors: Lee, Troy, Shraibman, Adi
Other Authors: School of Physical and Mathematical Sciences
Format: Conference or Workshop Item
Language: English
Published: 2014
Subjects: DRNTU::Science::Mathematics
Online Access:https://hdl.handle.net/10356/106459
http://hdl.handle.net/10220/24021
http://nips.cc/Conferences/2013/Program/event.php?ID=3932
Institution: Nanyang Technological University
Language: English
id sg-ntu-dr.10356-106459
record_format dspace
spelling sg-ntu-dr.10356-106459 2023-02-28T19:17:32Z
Matrix completion from any given set of observations
Lee, Troy; Shraibman, Adi
School of Physical and Mathematical Sciences
Annual Conference on Neural Information Processing Systems, NIPS (27th:2013)
DRNTU::Science::Mathematics
In the matrix completion problem, the aim is to recover an unknown real matrix from a subset of its entries. This problem comes up in many application areas, and has received a great deal of attention in the context of the Netflix Prize. A central approach to this problem is to output a matrix of lowest possible complexity (e.g. rank or trace norm) that agrees with the partially specified matrix. The performance of this approach under the assumption that the revealed entries are sampled randomly has received considerable attention (e.g. [1, 2, 3, 4, 5, 6, 7, 8]). In practice, the set of revealed entries is often not chosen at random, and these results do not apply; we are then left with no guarantees on the performance of the algorithm we are using. We present a means to obtain performance guarantees with respect to any set of initial observations. The first step remains the same: find a matrix of lowest possible complexity that agrees with the partially specified matrix. We give a new way to interpret the output of this algorithm: we next find a probability distribution over the non-revealed entries with respect to which a bound on the generalization error can be proven. The more complex the set of revealed entries, according to a certain measure, the better the bound on the generalization error.
Published version
2014-10-13T08:18:24Z 2019-12-06T22:12:15Z 2014-10-13T08:18:24Z 2019-12-06T22:12:15Z 2013 2013
Conference Paper
Lee, T., & Shraibman, A. (2013). Matrix completion from any given set of observations. Advances in Neural Information Processing Systems, 1-7.
https://hdl.handle.net/10356/106459
http://hdl.handle.net/10220/24021
http://nips.cc/Conferences/2013/Program/event.php?ID=3932
en
© 2013 Massachusetts Institute of Technology Press. This paper was published in Advances in Neural Information Processing Systems and is made available as an electronic reprint (preprint) with permission of Massachusetts Institute of Technology Press. The paper can be found at the following official URL: http://nips.cc/Conferences/2013/Program/event.php?ID=3932. One print or electronic copy may be made for personal use only. Systematic or multiple reproduction, distribution to multiple locations via electronic or other means, duplication of any material in this paper for a fee or for commercial purposes, or modification of the content of the paper is prohibited and is subject to penalties under law.
7 p. application/pdf
institution Nanyang Technological University
building NTU Library
continent Asia
country Singapore
content_provider NTU Library
collection DR-NTU
language English
topic DRNTU::Science::Mathematics
description In the matrix completion problem, the aim is to recover an unknown real matrix from a subset of its entries. This problem comes up in many application areas, and has received a great deal of attention in the context of the Netflix Prize. A central approach to this problem is to output a matrix of lowest possible complexity (e.g. rank or trace norm) that agrees with the partially specified matrix. The performance of this approach under the assumption that the revealed entries are sampled randomly has received considerable attention (e.g. [1, 2, 3, 4, 5, 6, 7, 8]). In practice, the set of revealed entries is often not chosen at random, and these results do not apply; we are then left with no guarantees on the performance of the algorithm we are using. We present a means to obtain performance guarantees with respect to any set of initial observations. The first step remains the same: find a matrix of lowest possible complexity that agrees with the partially specified matrix. We give a new way to interpret the output of this algorithm: we next find a probability distribution over the non-revealed entries with respect to which a bound on the generalization error can be proven. The more complex the set of revealed entries, according to a certain measure, the better the bound on the generalization error.
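The guarantee in this abstract is stated with respect to a probability distribution over the non-revealed entries. As a rough illustration of what such a weighted generalization error looks like, the sketch below uses squared loss for concreteness and a uniform placeholder distribution; the paper constructs a specific distribution from the revealed set, which is not reproduced here.

# A rough illustration (not from the paper) of generalization error measured
# against a probability distribution over the non-revealed entries.
# Squared loss and the uniform placeholder distribution are assumptions made
# here for concreteness.
import numpy as np

def weighted_generalization_error(truth, completed, mask, p):
    """Squared error on non-revealed entries, weighted by the distribution p.

    truth, completed : real matrices of the same shape
    mask             : boolean array, True where the entry was revealed
    p                : nonnegative weights supported on the non-revealed
                       entries and summing to 1
    """
    hidden = ~mask
    assert np.allclose(p[mask], 0.0) and np.isclose(p.sum(), 1.0)
    return float(np.sum(p * hidden * (truth - completed) ** 2))

def uniform_on_hidden(mask):
    """Placeholder distribution: uniform over the non-revealed entries."""
    hidden = (~mask).astype(float)
    return hidden / hidden.sum()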
author2 School of Physical and Mathematical Sciences
format Conference or Workshop Item
author Lee, Troy
Shraibman, Adi
author_sort Lee, Troy
title Matrix completion from any given set of observations
publishDate 2014
url https://hdl.handle.net/10356/106459
http://hdl.handle.net/10220/24021
http://nips.cc/Conferences/2013/Program/event.php?ID=3932
_version_ 1759855284268826624