Theses and Dissertations
Browsing Theses and Dissertations by Subject "affective computing"
Now showing 1 - 2 of 2
Item - Open Access
An empathic avatar in task-driven human-computer interaction (Colorado State University. Libraries, 2020)
Wang, Heting, author; Beveridge, Ross, advisor; Ortega, Francisco, advisor; Sharp, Julia, committee member; Peterson, Christopher, committee member

In Human-Computer Interaction, it is difficult to give machines emotional intelligence that resembles human affect, such as the ability to empathize. This thesis presents our work on an emotionally expressive avatar named Diana that recognizes human affects and shows empathy through dynamic facial expressions. Diana's behaviors and facial expressions were modeled on human-human multimodal interactions to evoke human-like perceptions in users. Specifically, we designed her empathic facial expressions as linear combinations of action units in the Facial Action Coding System [1], using action units previously found to improve recognition accuracy and judgments of human likeness. Our work studies the role of affect between a human and Diana working together in a blocks world.

We first conducted an elicitation study to extract naturally occurring gestures from naive human pairs. Each pair collaborated remotely on a wooden-block building task through video communication, and the video footage of their interactions formed a dataset named EGGNOG [2]. We provide descriptive and statistical analyses of the affective metrics of human signalers and builders in EGGNOG. The metrics include valence (positive or negative experience) and the intensities of seven basic emotions, including joy, fear, disgust, anger, surprise, and contempt. We found: 1) overall, signalers had a broader range of valence and showed more varied emotions than builders; 2) the intensity of joy was greater in signalers than in builders, indicating that signalers were happier; 3) for individuals, a person was happier acting as a signaler in a task than as a builder. Additionally, valence was associated more with a person's role in a task than with personality traits. The other emotions were all weak, and no significant difference was found between signalers and builders.

To adapt to the user's affect in the subsequent human-avatar interaction, we modeled Diana's empathic behaviors on the findings from EGGNOG and on Appraisal theory [3]. We created a Demo mode of Diana whose affective states, i.e., facial expressions simulating empathy, transitioned dynamically among five finite states (neutral, joy, sympathy, concentration, and confusion) in response to the user's affects and gestures. We also created a Mimicry mode of Diana that mimicked the user's instantaneous facial expressions. Human subject studies involving three modes of the avatar (Demo, Mimicry, and Emotionless) were conducted with 21 participants. Differences in ratings on a 5-point Likert-scale perception questionnaire and on a NASA TLX perceived-workload survey were both statistically insignificant. However, a descriptive analysis indicated that users spent more time engaging with the empathic Diana than with the Mimicry or Emotionless Diana, and that both the Demo and Mimicry modes were preferred over the Emotionless Diana. Some participants described Diana's facial expressions as natural and friendly, while three other participants reported feeling uncomfortable and mentioned the Uncanny Valley effect.

Results indicated that our approach of adding affect to Diana was perceived differently by different people and received both positive and negative feedback. Our work provides another implementable direction for human-centered user interfaces with complex affective states. However, there was no evidence that participants preferred the empathic facial expressions over the mimicked facial expressions. In the future, Diana's empathic facial expressions may be refined by modeling more human-like action unit movements with the help of deep learning networks, which may improve user perception in subjective reports.
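As a rough illustration of the mechanism described in this abstract (expressions built as linear combinations of FACS action units, and an empathic state that moves among neutral, joy, sympathy, concentration, and confusion), the sketch below shows one way such a controller could look in Python. The AU weights, the transition rules, and all names (EXPRESSION_AUS, UserSignal, next_state, blend_aus) are hypothetical and are not taken from the thesis.

```python
# Hypothetical sketch: an avatar expression is a linear (weighted) combination
# of FACS action-unit intensities, and the avatar's affective state transitions
# among five finite states based on the user's signals. Weights and rules are
# illustrative assumptions, not values from the thesis.

from dataclasses import dataclass

# Keys are FACS action-unit numbers (e.g., AU6 = cheek raiser, AU12 = lip
# corner puller); the weights are made up for illustration.
EXPRESSION_AUS = {
    "neutral":       {},
    "joy":           {6: 0.8, 12: 0.9},
    "sympathy":      {1: 0.6, 15: 0.4},
    "concentration": {4: 0.5, 7: 0.3},
    "confusion":     {1: 0.3, 2: 0.3, 4: 0.7},
}

@dataclass
class UserSignal:
    valence: float         # estimated user valence in [-1, 1]
    task_progress: bool    # did the last gesture advance the blocks task?
    gesture_unclear: bool  # could the gesture not be recognized?

def next_state(current: str, signal: UserSignal) -> str:
    """Hypothetical transition rules among the five affective states."""
    if signal.gesture_unclear:
        return "confusion"
    if signal.valence < -0.3:
        return "sympathy"
    if signal.task_progress and signal.valence > 0.3:
        return "joy"
    if signal.task_progress:
        return "concentration"
    return "neutral"

def blend_aus(state: str, intensity: float = 1.0) -> dict:
    """Scale the state's action-unit weights into a pose for the face rig."""
    return {au: w * intensity for au, w in EXPRESSION_AUS[state].items()}

# Example: the user makes an unrecognized gesture while mildly frustrated.
state = next_state("neutral", UserSignal(valence=-0.2, task_progress=False,
                                         gesture_unclear=True))
print(state, blend_aus(state, intensity=0.7))
```

In this reading, "empathy" amounts to estimating the user's valence and gesture clarity, picking the next affective state, and rendering it as scaled action-unit intensities on the avatar's face.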
Item - Open Access
Automatically detecting task unrelated thoughts during conversations using keystroke analysis (Colorado State University. Libraries, 2022)
Kuvar, Vishal Kiran, author; Blanchard, Nathaniel, advisor; Mills, Caitlin, advisor; Ben-Hur, Asa, committee member; Zhou, Wen, committee member

Task-unrelated thought (TUT), commonly known as daydreaming or zoning out, is a mental state in which a person's attention moves away from the task at hand to self-generated thoughts. This state is extremely common, yet little is known about it during dyadic interactions. We built a model that detects when a person experiences TUTs while talking to another person through a chat platform by analyzing their keystroke patterns. The model differentiated task-unrelated thoughts from task-related thoughts with a kappa of 0.343. This is a strong indicator that typing behavior is linked to mental states, in our case task-unrelated thoughts.
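To make the keystroke-analysis idea concrete, here is a generic sketch of such a pipeline: summarize each typing window into timing and correction features, train a classifier on probe-labeled windows, and score agreement with Cohen's kappa. The feature set, the random-forest classifier, and the synthetic data are assumptions for illustration; the abstract does not describe the thesis's actual model, and only the use of kappa (0.343 in the thesis) comes from it.

```python
# Hypothetical sketch of keystroke-based detection of task-unrelated thought
# (TUT). Features, classifier, and data are illustrative assumptions; the
# thesis's actual model is not described in the abstract above.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import cohen_kappa_score
from sklearn.model_selection import train_test_split

def keystroke_features(timestamps, is_backspace):
    """Summarize one typing window: pauses, rate, and correction behavior."""
    gaps = np.diff(timestamps)
    return [
        gaps.mean(),                                         # mean inter-key interval
        gaps.std(),                                          # burstiness of typing
        gaps.max(),                                          # longest pause
        len(timestamps) / (timestamps[-1] - timestamps[0]),  # keys per second
        float(np.mean(is_backspace)),                        # proportion of corrections
    ]

def make_toy_window(rng):
    """Generate one synthetic typing window (timestamps in seconds)."""
    n_keys = rng.integers(10, 40)
    timestamps = np.cumsum(rng.exponential(0.25, size=n_keys))
    is_backspace = rng.random(n_keys) < 0.1
    return keystroke_features(timestamps, is_backspace)

# Toy dataset: one feature vector per typing window, labeled by thought probes
# (1 = task-unrelated thought, 0 = task-related thought).
rng = np.random.default_rng(0)
X = np.array([make_toy_window(rng) for _ in range(200)])
y = rng.integers(0, 2, size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Cohen's kappa measures agreement with the probe labels beyond chance;
# the thesis reports a kappa of 0.343 for its model.
print("kappa:", cohen_kappa_score(y_test, clf.predict(X_test)))
```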