Redundant complexity in deep learning: an efficacy analysis of NeXtVLAD in NLP

dc.contributor.author: Mahdipour Saravani, Sina, author
dc.contributor.author: Ray, Indrakshi, advisor
dc.contributor.author: Banerjee, Ritwik, advisor
dc.contributor.author: Simske, Steven, committee member
dc.date.accessioned: 2022-08-29T10:16:03Z
dc.date.available: 2022-08-29T10:16:03Z
dc.date.issued: 2022
dc.description.abstract: While deep learning is prevalent and successful, owing in part to its expressive power and its limited need for human intervention, it can also encourage naive, overly simplistic use, giving rise to problems in sustainability, reproducibility, and design: larger, more compute-intensive models entail costs in all of these areas. In this thesis, we probe the effect of one neural component, the NeXtVLAD architecture, on predictive accuracy for two downstream natural language processing tasks: context-dependent sarcasm detection and deepfake text detection. We investigate the extent to which this architecture contributes to the reported results and find that it provides no statistically significant benefit; in these pipelines it is ineffective and redundant. This is only one of several directions in efficiency-aware deep learning research, but it is especially important because it introduces an aspect of interpretability aimed at design and efficiency: studying architectures and topologies in order to ablate redundant components, improve sustainability, and gain further insight into the flow of information through deep neural architectures and the role of each component. We hope that our findings, which highlight the lack of benefit from introducing a resource-intensive component, will help future research distill the effective elements of long and complex pipelines, thereby providing a boost to the wider research community.
dc.format.medium: born digital
dc.format.medium: masters theses
dc.identifier: MahdipourSaravani_colostate_0053N_17317.pdf
dc.identifier.uri: https://hdl.handle.net/10217/235603
dc.language: English
dc.language.iso: eng
dc.publisher: Colorado State University. Libraries
dc.relation.ispartof: 2020-
dc.rights: Copyright and other restrictions may apply. User is responsible for compliance with all applicable laws. For information about copyright law, please see https://libguides.colostate.edu/copyright.
dc.subject: NeXtVLAD
dc.subject: redundancy
dc.subject: NLP
dc.subject: deep learning
dc.title: Redundant complexity in deep learning: an efficacy analysis of NeXtVLAD in NLP
dc.type: Text
dcterms.rights.dpla: This Item is protected by copyright and/or related rights (https://rightsstatements.org/vocab/InC/1.0/). You are free to use this Item in any way that is permitted by the copyright and related rights legislation that applies to your use. For other uses you need to obtain permission from the rights-holder(s).
thesis.degree.discipline: Computer Science
thesis.degree.grantor: Colorado State University
thesis.degree.level: Masters
thesis.degree.name: Master of Science (M.S.)
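
The abstract above reports that the NeXtVLAD component yields no statistically significant accuracy gain. As a rough illustration of how such a claim can be checked, the sketch below compares two otherwise identical pipelines, one with and one without the component, using a paired bootstrap test on their test-set predictions. This is a minimal, hypothetical sketch: the function and variable names are illustrative, and the thesis itself may use a different significance test.

```python
# Hypothetical sketch of a paired bootstrap significance test comparing a
# pipeline that includes the NeXtVLAD component (model A) against an
# otherwise identical ablated pipeline (model B). Names are illustrative;
# the thesis does not prescribe this exact procedure.
import numpy as np

def paired_bootstrap(preds_a, preds_b, labels, n_resamples=10_000, seed=0):
    """Approximate p-value for the claim 'model A is more accurate than B'."""
    rng = np.random.default_rng(seed)
    preds_a, preds_b, labels = map(np.asarray, (preds_a, preds_b, labels))
    correct_a = (preds_a == labels).astype(float)
    correct_b = (preds_b == labels).astype(float)
    n = len(labels)
    diffs = []
    for _ in range(n_resamples):
        idx = rng.integers(0, n, size=n)   # resample test items with replacement
        diffs.append(correct_a[idx].mean() - correct_b[idx].mean())
    diffs = np.asarray(diffs)
    # Fraction of resamples in which A does NOT beat B; a large value means
    # the accuracy gain from the extra component is not statistically significant.
    return (diffs <= 0).mean()
```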

Files

Original bundle
Name: MahdipourSaravani_colostate_0053N_17317.pdf
Size: 356.61 KB
Format: Adobe Portable Document Format