Iterative matrix completion and topic modeling using matrix and tensor factorizations

Date

2021

Authors

Kassab, Lara, author
Adams, Henry, advisor
Fosdick, Bailey, committee member
Kirby, Michael, committee member
Peterson, Chris, committee member

Abstract

With the ever-increasing access to data, one of the greatest remaining challenges is how to make sense of this abundance of information. In this dissertation, we propose three techniques that take into account underlying structure in large-scale data to produce better or more interpretable results for machine learning tasks.

One challenge in analyzing large-scale datasets is missing data, which can be difficult to handle without efficient methods. We propose adjusting an iteratively reweighted least squares algorithm for low-rank matrix completion to take into account sparsity-based structure in the missing entries. We also propose an iterative gradient-projection-based implementation of the algorithm, and present numerical experiments comparing its performance to that of standard algorithms.

Another challenge arises when performing (semi-)supervised learning tasks on high-dimensional data. We propose variants of semi-supervised nonnegative matrix factorization models and motivate these models as maximum likelihood estimators. The proposed models simultaneously provide a topic model and a model for classification. We derive training methods using multiplicative updates for each new model, and demonstrate their application to document classification (e.g., the 20 Newsgroups dataset).

Lastly, although many datasets can be represented as matrices, data also often arise as high-dimensional arrays, known as higher-order tensors. We show that nonnegative CANDECOMP/PARAFAC tensor decomposition successfully detects short-lasting topics in temporal text datasets, including news headlines and COVID-19-related tweets, that other popular methods such as Latent Dirichlet Allocation and Nonnegative Matrix Factorization fail to fully detect.
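To give a sense of the gradient-projection flavor of iteratively reweighted least squares for low-rank matrix completion, here is a minimal NumPy sketch of a plain, unstructured IRLS-style iteration with a smoothed nuclear-norm surrogate. The function name, step size, and smoothing parameter are illustrative assumptions, and the sparsity-structured weighting of the missing entries proposed in the dissertation is not reproduced here.

```python
import numpy as np

def irls_matrix_completion(M, mask, gamma=1e-2, step=0.5, n_iter=200):
    """Illustrative gradient-projection IRLS sketch for low-rank matrix
    completion (not the dissertation's structure-aware variant).

    M    : (m, n) array with observed entries filled in (zeros elsewhere)
    mask : (m, n) boolean array, True where an entry is observed
    """
    X = M.astype(float).copy()
    for _ in range(n_iter):
        # Reweighting matrix W = (X^T X + gamma I)^(-1/2) via eigendecomposition.
        evals, evecs = np.linalg.eigh(X.T @ X)
        W = evecs @ np.diag(1.0 / np.sqrt(evals + gamma)) @ evecs.T
        # Gradient step on the smoothed surrogate: the gradient is X W.
        X = X - step * (X @ W)
        # Projection: re-impose the observed entries.
        X[mask] = M[mask]
    return X
```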
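The semi-supervised nonnegative matrix factorization models couple a topic factorization of the data with a factorization of the (partially observed) labels. The following NumPy sketch shows one common masked-label formulation trained with multiplicative updates, as an assumed illustration of the general setup rather than the exact models and discrepancy measures proposed in the dissertation.

```python
import numpy as np

def ssnmf_multiplicative(X, Y, L, rank, lam=1.0, n_iter=200, eps=1e-10, seed=0):
    """Sketch of semi-supervised NMF with multiplicative updates.

    X : (m, n) nonnegative data matrix (e.g., term-document counts)
    Y : (c, n) one-hot label matrix; unlabeled columns may be zero
    L : (c, n) mask, 1 where a document's label is observed, 0 otherwise
    Minimizes ||X - W H||_F^2 + lam * ||L * (Y - B H)||_F^2 with W, H, B >= 0.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    c = Y.shape[0]
    W = rng.random((m, rank))
    H = rng.random((rank, n))
    B = rng.random((c, rank))
    for _ in range(n_iter):
        # Topic (dictionary) matrix.
        W *= (X @ H.T) / (W @ H @ H.T + eps)
        # Classification matrix, fit only on labeled columns.
        B *= ((L * Y) @ H.T) / ((L * (B @ H)) @ H.T + eps)
        # Shared representation, balancing reconstruction and label terms.
        num = W.T @ X + lam * B.T @ (L * Y)
        den = W.T @ W @ H + lam * B.T @ (L * (B @ H)) + eps
        H *= num / den
    return W, H, B
```

Class predictions for unlabeled documents can then be read off as the argmax over the rows of B @ H.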
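For temporal topic detection, a nonnegative CANDECOMP/PARAFAC decomposition of a three-way tensor (for example, terms × sources × time windows) produces one factor matrix per mode, and the time-mode factor indicates when each topic is active. The sketch below is a generic multiplicative-update implementation in NumPy; the tensor layout, rank, and update scheme are illustrative assumptions, not the dissertation's exact pipeline.

```python
import numpy as np

def khatri_rao(U, V):
    """Column-wise Khatri-Rao product: (p, R) x (q, R) -> (p*q, R)."""
    p, R = U.shape
    q, _ = V.shape
    return np.einsum('ir,jr->ijr', U, V).reshape(p * q, R)

def nonneg_cp(T, rank, n_iter=300, eps=1e-10, seed=0):
    """Sketch of nonnegative CP decomposition of a 3-way tensor T
    via multiplicative updates on each mode's unfolding."""
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A = rng.random((I, rank))
    B = rng.random((J, rank))
    C = rng.random((K, rank))
    # Mode unfoldings consistent with NumPy's C-ordering.
    T0 = T.reshape(I, J * K)
    T1 = T.transpose(1, 0, 2).reshape(J, I * K)
    T2 = T.transpose(2, 0, 1).reshape(K, I * J)
    for _ in range(n_iter):
        M = khatri_rao(B, C)
        A *= (T0 @ M) / (A @ (M.T @ M) + eps)
        M = khatri_rao(A, C)
        B *= (T1 @ M) / (B @ (M.T @ M) + eps)
        M = khatri_rao(A, B)
        C *= (T2 @ M) / (C @ (M.T @ M) + eps)
    return A, B, C
```

Short-lasting topics show up as components whose time-mode factor column is concentrated on a few consecutive windows.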

Subject

datasets
interpretable results
machine learning tasks
algorithm
matrices
higher-order tensors
