
Iterative matrix completion and topic modeling using matrix and tensor factorizations

dc.contributor.author: Kassab, Lara, author
dc.contributor.author: Adams, Henry, advisor
dc.contributor.author: Fosdick, Bailey, committee member
dc.contributor.author: Kirby, Michael, committee member
dc.contributor.author: Peterson, Chris, committee member
dc.date.accessioned: 2022-01-07T11:30:21Z
dc.date.available: 2022-01-07T11:30:21Z
dc.date.issued: 2021
dc.description.abstract: With the ever-increasing access to data, one of the greatest remaining challenges is how to make sense of this abundance of information. In this dissertation, we propose three techniques that take into account underlying structure in large-scale data to produce better or more interpretable results for machine learning tasks.

One challenge in analyzing large-scale datasets is missing data, which can be difficult to handle without efficient methods. We propose adjusting an iteratively reweighted least squares algorithm for low-rank matrix completion to take into account sparsity-based structure in the missing entries. We also propose an iterative gradient-projection-based implementation of the algorithm, and present numerical experiments comparing its performance to that of standard algorithms.

Another challenge arises when performing (semi-)supervised learning on high-dimensional data. We propose variants of semi-supervised nonnegative matrix factorization models and motivate them as maximum likelihood estimators. The proposed models simultaneously provide a topic model and a classification model. We derive training methods using multiplicative updates for each new model, and demonstrate their application to document classification (e.g., the 20 Newsgroups dataset).

Lastly, although many datasets can be represented as matrices, data also often arise as high-dimensional arrays, known as higher-order tensors. We show that nonnegative CANDECOMP/PARAFAC tensor decomposition successfully detects short-lasting topics in temporal text datasets, including news headlines and COVID-19 related tweets, that other popular methods such as Latent Dirichlet Allocation and Nonnegative Matrix Factorization fail to fully detect.
dc.format.medium: born digital
dc.format.medium: doctoral dissertations
dc.identifier: Kassab_colostate_0053A_16853.pdf
dc.identifier.uri: https://hdl.handle.net/10217/234258
dc.language: English
dc.language.iso: eng
dc.publisher: Colorado State University. Libraries
dc.relation.ispartof: 2020-
dc.rights: Copyright and other restrictions may apply. User is responsible for compliance with all applicable laws. For information about copyright law, please see https://libguides.colostate.edu/copyright.
dc.subject: datasets
dc.subject: interpretable results
dc.subject: machine learning tasks
dc.subject: algorithm
dc.subject: matrices
dc.subject: higher-order tensors
dc.title: Iterative matrix completion and topic modeling using matrix and tensor factorizations
dc.type: Text
dcterms.rights.dpla: This Item is protected by copyright and/or related rights (https://rightsstatements.org/vocab/InC/1.0/). You are free to use this Item in any way that is permitted by the copyright and related rights legislation that applies to your use. For other uses you need to obtain permission from the rights-holder(s).
thesis.degree.discipline: Mathematics
thesis.degree.grantor: Colorado State University
thesis.degree.level: Doctoral
thesis.degree.name: Doctor of Philosophy (Ph.D.)

Files

Original bundle
Name: Kassab_colostate_0053A_16853.pdf
Size: 3.6 MB
Format: Adobe Portable Document Format
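
Illustrative code sketches

The abstract's first contribution adjusts an iteratively reweighted least squares (IRLS) algorithm for low-rank matrix completion and gives a gradient-projection-based implementation. Below is a minimal NumPy sketch of a generic smoothed IRLS / gradient-projection loop for matrix completion; the weighting, the smoothing schedule, the step size, and all parameter values are illustrative assumptions, not the dissertation's structure-aware method.

import numpy as np

def irls_complete(M_obs, mask, gamma0=1e-1, step=0.9, p=1.0, n_iters=200):
    """Smoothed IRLS-style gradient projection for low-rank matrix completion (illustrative).

    M_obs : (m, n) array holding the observed entries (values outside `mask` are ignored)
    mask  : (m, n) boolean array, True where an entry is observed
    """
    X = np.where(mask, M_obs, 0.0)               # start from the observed data, zeros elsewhere
    gamma = gamma0                               # smoothing parameter for the Schatten-p surrogate
    for _ in range(n_iters):
        # Reweighting matrix W = (X^T X + gamma I)^(p/2 - 1), via an eigendecomposition
        evals, evecs = np.linalg.eigh(X.T @ X + gamma * np.eye(X.shape[1]))
        W = (evecs * evals ** (p / 2.0 - 1.0)) @ evecs.T
        # Gradient step on the smoothed surrogate; the gamma-dependent scaling keeps the step stable
        X = X - step * gamma ** (1.0 - p / 2.0) * (X @ W)
        # Projection onto the data-consistency constraint: observed entries are reset to their values
        X = np.where(mask, M_obs, X)
        gamma *= 0.95                            # gradually tighten the smoothing (illustrative schedule)
    return X

Each iteration takes a gradient step on a smoothed low-rank surrogate and then projects back onto the set of matrices that agree with the observed entries, which is the general pattern behind gradient-projection IRLS schemes for matrix completion.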
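
The second contribution concerns semi-supervised nonnegative matrix factorization (SSNMF) models trained with multiplicative updates, yielding a topic model and a classifier at once. The following is a minimal sketch of Frobenius-norm multiplicative updates for a generic SSNMF of the form X ≈ AS, Y ≈ BS with a label mask L; the variable names, rank, and weight lam are illustrative, and the dissertation's specific model variants and maximum-likelihood motivation are not reproduced here.

import numpy as np

def ssnmf_multiplicative(X, Y, L, rank=10, lam=1.0, n_iters=500, eps=1e-10, seed=0):
    """Multiplicative updates for a Frobenius-norm semi-supervised NMF (illustrative).

    X : (terms x documents) nonnegative data matrix (e.g., TF-IDF weights)
    Y : (classes x documents) one-hot label matrix (arbitrary on unlabeled columns)
    L : (classes x documents) binary mask, 1 on labeled columns, 0 elsewhere
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    c = Y.shape[0]
    A = rng.random((m, rank))    # topic (dictionary) factor
    B = rng.random((c, rank))    # label / classification factor
    S = rng.random((rank, n))    # shared document representation
    for _ in range(n_iters):
        A *= (X @ S.T) / (A @ S @ S.T + eps)
        B *= ((L * Y) @ S.T) / ((L * (B @ S)) @ S.T + eps)
        numer = A.T @ X + lam * (B.T @ (L * Y))
        denom = A.T @ (A @ S) + lam * (B.T @ (L * (B @ S))) + eps
        S *= numer / denom
    return A, B, S

After training, the columns of A can be read as topics, and an unlabeled document j can be classified by taking the largest entry of (B @ S)[:, j].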
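
The third contribution applies nonnegative CANDECOMP/PARAFAC (CP) tensor decomposition to temporal text data to detect short-lasting topics. Below is a minimal NumPy sketch of nonnegative CP for a third-order tensor via standard multiplicative updates; the assumed tensor layout (time x documents x terms) and the rank are illustrative choices, not the dissertation's experimental setup.

import numpy as np

def khatri_rao(A, B):
    """Column-wise Khatri-Rao product of A (I x R) and B (J x R), giving an (I*J x R) matrix."""
    return (A[:, None, :] * B[None, :, :]).reshape(-1, A.shape[1])

def unfold(T, mode):
    """Mode unfolding of a 3-way tensor into a matrix (remaining modes kept in original order)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def nncp_multiplicative(T, rank, n_iters=300, eps=1e-10, seed=0):
    """Nonnegative CP decomposition of a 3-way tensor via multiplicative updates (illustrative).
    Returns one nonnegative factor matrix per mode, each with `rank` columns (one per topic)."""
    rng = np.random.default_rng(seed)
    factors = [rng.random((dim, rank)) for dim in T.shape]
    for _ in range(n_iters):
        for mode in range(3):
            A1, A2 = [f for m, f in enumerate(factors) if m != mode]
            Z = khatri_rao(A1, A2)                 # row ordering matches unfold() above
            numer = unfold(T, mode) @ Z
            denom = factors[mode] @ (Z.T @ Z) + eps
            factors[mode] *= numer / denom
    return factors

# E.g., for a hypothetical (time x documents x terms) count tensor T_counts:
# time_f, doc_f, term_f = nncp_multiplicative(T_counts, rank=5)

With such a layout, each CP component can be read as a topic: its terms-mode column gives the topic's key words and its time-mode column shows when the topic is active, so a short-lasting topic appears as a narrow spike in the time factor.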