Title: Prosodic influence in face emotion perception: evidence from behavioral measures and functional near-infrared spectroscopy
Author: Becker, Katherine M.
Advisor: Rojas, Donald C.
Committee members: Davies, Patricia; Troup, Lucy J.
Date issued: 2017
Date accessioned: 2017-09-14
Date available: 2019-09-14
URI: https://hdl.handle.net/10217/183912

Abstract: The perception of another person's emotional state is formed by the intersection of simultaneously presented affective vocal and facial information. These two channels are highly effective in communicating emotion, as either can do so independently. However, it is unclear how these modalities interact and influence perception when they are integrated. The current study sought to disentangle the roles of each modality by manipulating both the vocal and facial components of emotion perception. Voice stimuli consisted of nonverbal affective vocalizations produced in a happy, angry, or neutral prosody. Face images were drawn from morphed continua composed of two end-point images: one happy and one angry face. These stimuli were presented both independently and together to fully dissociate the unimodal and bimodal aspects of affect perception. They were combined in a single hybrid block-design paradigm, which was used in both a behavioral experiment and a functional near-infrared spectroscopy experiment. The results indicated that prosody does affect the perception of affective faces, an effect evident in both the behavioral and functional imaging data. Moreover, these data suggest that prosody is differentially represented in the brain in a valence-specific way. Together, these findings provide strong support for the crucial role of prosody in affect perception.

Description: born digital
Type: Text; masters theses
Language: eng
Rights: Copyright and other restrictions may apply. User is responsible for compliance with all applicable laws. For information about copyright law, please see https://libguides.colostate.edu/copyright.