A study on matrix factorization and its applications

Bibliographic Details
Main Author: Tang, Adrian Wen Kai
Format: Final Year Project / Dissertation / Thesis
Published: 2021
Subjects:
Online Access:http://eprints.utar.edu.my/5079/1/1705957_FYP.pdf
http://eprints.utar.edu.my/5079/
Institution: Universiti Tunku Abdul Rahman
Description
Summary: Matrix factorization methods factorise a matrix into a product of two or more matrices, and each factorization has its own properties. Matrix factorization is widely used in image processing and recommendation systems, both of which compute their results from high-dimensional matrices. This is where matrix factorization is used to reduce the dimension of the data set, which in turn reduces the computational cost. In this project, we focus on Singular Value Decomposition (SVD) and Non-negative Matrix Factorization (NMF) applied to Latent Semantic Indexing (LSI). To carry out the project, we first read related research papers intensively to build up knowledge of SVD and NMF. We study their computational steps, properties and applications to real-world problems. The computational steps are important because they provide the basic knowledge needed to implement the methods in Python. Python also offers libraries that compute the approximated matrix with some parameter tuning. The application we focus on is the LSI algorithm. LSI is a search algorithm that returns a set of documents related to the keywords the user searches for, and it requires substantial computational power for matrix multiplication. To address this, we used the SVD and NMF methods to reduce the matrix dimension and thus the computational cost. SVD performed better than NMF because SVD provides a principled way to choose the reduced dimension, whereas NMF does not. In future work, we can look for methods that improve on the current results.
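The pipeline the abstract describes — build a term-document matrix, reduce its rank with SVD or NMF, then match queries in the reduced space — can be sketched in Python with scikit-learn. This is an illustrative sketch, not the code from the thesis: the toy corpus, the rank `n_components=2`, and the library calls are assumptions for demonstration only.

```python
# Illustrative LSI sketch (not the thesis code): reduce a term-document
# matrix with truncated SVD and with NMF, then rank documents against a
# query in the SVD latent space. Corpus and rank k=2 are hypothetical.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD, NMF

docs = [
    "matrix factorization reduces dimension",
    "singular value decomposition of a matrix",
    "recommendation systems use matrix factorization",
    "image processing with non negative matrix factorization",
]

# Term-document weighting (rows = documents, columns = terms)
vec = TfidfVectorizer()
X = vec.fit_transform(docs)

# Rank-k approximation via truncated SVD (the classical LSI step)
svd = TruncatedSVD(n_components=2, random_state=0)
X_k = svd.fit_transform(X)            # documents in the 2-dim latent space

# NMF gives an alternative low-rank factorization X ≈ W @ H
# with all factors non-negative (tf-idf input is non-negative)
nmf = NMF(n_components=2, init="nndsvd", random_state=0, max_iter=500)
W = nmf.fit_transform(X)              # document loadings
H = nmf.components_                   # topic-term weights

# Project a query into the SVD space and rank documents by cosine similarity
q = svd.transform(vec.transform(["matrix factorization"]))
sims = (X_k @ q.T).ravel() / (
    np.linalg.norm(X_k, axis=1) * np.linalg.norm(q) + 1e-12
)
ranking = np.argsort(-sims)           # most relevant documents first
```

In this reduced space, the query is compared against 2-dimensional document vectors instead of full term vectors, which is the computational saving the abstract refers to.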