Author: Ernst, Matthew Frederick
Advisor: Whitley, Darrell
Committee members: Anderson, Chuck; Buchanan, Norm
Date accessioned: 2022-05-30
Date available: 2022-05-30
Date issued: 2022
Handle: https://hdl.handle.net/10217/235155
Abstract: The rectified linear unit (ReLU) activation function has been a staple of deep learning for improving the performance of deep neural network architectures. However, ReLU comes with performance trade-offs, most notably the dead ReLU problem caused by vanishing gradients. In this thesis, we introduce "late residual connections," a type of residual neural network in which each hidden layer is connected directly to the output layer of the network. These residual connections improve convergence by allowing more gradient to flow back to the hidden layers of the network.
Format: born digital
Collection: masters theses
Language: eng
Rights: Copyright and other restrictions may apply. User is responsible for compliance with all applicable laws. For information about copyright law, please see https://libguides.colostate.edu/copyright.
Subjects: residual connections; dead ReLU
Title: Late residual neural networks: an approach to combat the dead ReLU problem
Type: Text
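
The abstract describes the architecture only at a high level. The following is a minimal sketch of how late residual connections might be wired up, assuming a plain fully connected network where each hidden layer's activation is also projected directly to the output and the projections are summed; the framework (PyTorch), the layer widths, the summation, and the class name LateResidualMLP are illustrative assumptions, not details taken from the thesis.

# Minimal sketch of "late residual connections": each hidden layer also feeds
# the output layer directly, so gradients reach early layers without passing
# through every later ReLU. Framework, widths, and names are assumptions,
# not taken from the thesis.
import torch
import torch.nn as nn


class LateResidualMLP(nn.Module):
    def __init__(self, in_dim, hidden_dims, out_dim):
        super().__init__()
        self.hidden = nn.ModuleList()
        prev = in_dim
        for h in hidden_dims:
            self.hidden.append(nn.Linear(prev, h))
            prev = h
        # One projection from each hidden layer straight to the output layer.
        self.to_output = nn.ModuleList(
            [nn.Linear(h, out_dim) for h in hidden_dims]
        )

    def forward(self, x):
        out = 0.0
        for layer, proj in zip(self.hidden, self.to_output):
            x = torch.relu(layer(x))
            # Late residual connection: this hidden activation contributes to
            # the output directly, giving it a short gradient path.
            out = out + proj(x)
        return out


if __name__ == "__main__":
    model = LateResidualMLP(in_dim=4, hidden_dims=[32, 32, 32], out_dim=3)
    y = model(torch.randn(8, 4))
    print(y.shape)  # torch.Size([8, 3])

Because every hidden layer has its own direct path to the output, a unit whose downstream ReLUs have gone dead still receives gradient through its projection, which is the intuition behind the improved convergence described in the abstract.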