Kentopp, Shane, author
Conner, Bradley T., advisor
Prince, Mark A., committee member
Henry, Kimberly L., committee member
Anderson, Charles W., committee member
2021-09-06
2021
https://hdl.handle.net/10217/233832

Binge drinking and non-suicidal self-injury are significant health-risk behaviors that are often initiated during adolescence and contribute to a host of negative outcomes later in life. Selective prevention strategies are targeted toward the individuals most at risk of developing these behaviors. Traditionally, selective interventions are tailored based on risk factors identified by human experts. Machine learning algorithms, such as deep neural networks, may improve the effectiveness of selective interventions by accounting for complex interactions among large numbers of predictor variables. However, their use in psychological research is limited by their tendency to overfit and their need for large volumes of training data. Deep transfer learning can overcome this limitation by leveraging samples of convenience to facilitate training deep neural networks in small, clinically relevant samples. The author trained deep neural networks on data from a sample of adolescent psychiatric inpatients to retrospectively classify individuals according to their history of alcohol misuse and non-suicidal self-injury. Next, the performance of these models was compared with that of deep neural networks that were pretrained on a convenience sample of college undergraduates and fine-tuned on the sample of psychiatric patients. Deep transfer learning did not improve classification accuracy but buffered against overfitting. The deep neural networks that were not pretrained maintained maximum classification accuracy for only a small number of training epochs before performance deteriorated due to overfitting the training data. Conversely, the pretrained networks maintained their maximum classification accuracy across many training epochs, and performance was not hindered by overfitting. This suggests that convenience samples can be used to reduce the risk of overfitting when training complex deep neural networks on small clinical samples. In the future, this process may be employed to build powerful predictive models that inform selective prevention programs and contribute to reducing the prevalence of health risk behaviors among vulnerable adolescent populations.

born digital
doctoral dissertations
eng
Copyright and other restrictions may apply. User is responsible for compliance with all applicable laws. For information about copyright law, please see https://libguides.colostate.edu/copyright.
deep learning; self-injury; binge drinking; transfer learning; deep neural networks
Deep transfer learning for prediction of health risk behaviors in adolescent psychiatric patients
Text
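
The abstract describes a pretrain-then-fine-tune workflow: a deep neural network is first trained on a large convenience sample and its weights are then fine-tuned on a small clinical sample. The following is a minimal sketch of that general workflow, not the dissertation's actual architecture, data, or hyperparameters; the sample sizes, layer widths, learning rates, and synthetic data are all illustrative assumptions.

```python
# Illustrative sketch of deep transfer learning with a pretrain/fine-tune split.
# All data and settings below are assumptions for demonstration only.
import numpy as np
import tensorflow as tf
from tensorflow import keras

rng = np.random.default_rng(0)
n_features = 50  # assumed number of predictor variables

# Stand-ins for the two samples: a large convenience sample (e.g., college
# undergraduates) and a small clinical sample (psychiatric inpatients).
X_convenience = rng.normal(size=(2000, n_features)).astype("float32")
y_convenience = rng.integers(0, 2, size=2000).astype("float32")
X_clinical = rng.normal(size=(150, n_features)).astype("float32")
y_clinical = rng.integers(0, 2, size=150).astype("float32")

def build_network() -> keras.Model:
    """A small feed-forward binary classifier; the architecture is illustrative."""
    return keras.Sequential([
        keras.Input(shape=(n_features,)),
        keras.layers.Dense(64, activation="relu"),
        keras.layers.Dense(32, activation="relu"),
        keras.layers.Dense(1, activation="sigmoid"),
    ])

# Step 1: pretrain on the convenience sample.
model = build_network()
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X_convenience, y_convenience, epochs=20, batch_size=64, verbose=0)

# Step 2: fine-tune the pretrained weights on the small clinical sample,
# using a lower learning rate to limit overfitting to the few training cases.
model.compile(optimizer=keras.optimizers.Adam(learning_rate=1e-4),
              loss="binary_crossentropy", metrics=["accuracy"])
history = model.fit(X_clinical, y_clinical, validation_split=0.2,
                    epochs=50, batch_size=16, verbose=0)
print("final validation accuracy:", history.history["val_accuracy"][-1])
```

In this kind of setup, the comparison reported in the abstract corresponds to training the same architecture from random initialization directly on the clinical sample and tracking validation performance across epochs, versus starting fine-tuning from the pretrained weights.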