Browsing by Author "Draper, Bruce A., committee member"
Item Open Access
Attentional biases and time course of emotion processing in depression (Colorado State University. Libraries, 2014)
Bastidas, Stephanie P., author; Troup, Lucy J., advisor; Henry, Kimberly L., committee member; Draper, Bruce A., committee member
Depressive mood is associated with differential patterns in emotion processing, but it is unclear which stages of processing differ in depressed individuals. The current study explored the nature of biases in early vs. late components of attention. Experiment 1 examined, behaviorally, attentional biases in orienting to and disengaging from positive and negative emotional stimuli. Depressed participants showed greater overall biases than controls in the dot-discrimination task but not in the dot-detection task. Positive and negative affect were associated with a greater orienting bias and a reduced disengaging bias for happy faces in the detection task, and with a smaller bias for happy faces and a greater bias for sad faces in the discrimination task. Experiment 2 explored differences in the time course of emotion processing, focusing on early P3 component differences during implicit and explicit processing. Results showed a greater P3 for happy than neutral trials over midline frontal electrodes, and the opposite pattern over parietal electrodes, in depressed but not control participants during implicit processing. The P3 was slower in depressed participants than in controls during explicit processing over lateral sites. Midline electrodes showed a slower P3 for happy than neutral trials during implicit processing and for sad than neutral trials during explicit processing, independent of group. The results suggest the presence of attentional biases in depressed individuals independent of emotion. These biases may be better reflected during intentional than incidental emotion processing. Further study is needed to fully understand the relationship of emotion processing to different degrees of depressive symptoms, to different emotions, and to other modalities of intention in emotion processing.
Item Open Access
Discovering and harnessing structures in solving application satisfiability instances (Colorado State University. Libraries, 2018)
Chen, Wenxiang, author; Whitley, L. Darrell, advisor; Draper, Bruce A., committee member; Böhm, A. P. Wim, committee member; Chong, Edwin K. P., committee member
Boolean satisfiability (SAT) is the first problem proven to be NP-complete. It has become a fundamental problem in computational complexity theory, and many real-world problems can be encoded as SAT instances. Two major search paradigms have been proposed for SAT solving: Systematic Search (SS) and Stochastic Local Search (SLS). SLS solvers have been shown to be very effective on uniform random instances; they are consistently the top entries in the random tracks at SAT competitions. However, SS solvers dominate the hard combinatorial and industrial tracks, with SLS entries at the very bottom of the ranking. In this work, we classify both hard combinatorial instances and industrial instances as application instances. As application instances are the more interesting from a practical perspective, it is critical to analyze their structure and to improve SLS performance on them. We focus on two structural properties of SAT instances: variable interaction topology and subproblem constrainedness. Decomposability concerns how well the variable interaction of an application instance can be decomposed. We first show that many application instances are indeed highly decomposable. The decomposability of a SAT instance has been exploited extensively, and with success, by SS solvers. SLS solvers, by contrast, direct the variables to flip using only the objective function, and are completely oblivious to the decomposability that application instances inherit from the original problem domain. We propose a new method to decompose variable interactions within SLS solvers, leveraging the numerous local optima they visit. Our empirical study suggests that the proposed method can vastly simplify SAT instances, decomposing them into thousands of connected components. Furthermore, we demonstrate the utility of this decomposition for improving SLS solvers. We propose a new framework called PXSAT, based on the recombination operator Partition Crossover (PX). Given q components, PX finds the best of the 2^q possible candidate solutions in linear time. Empirical results on an extensive set of application instances show that PXSAT yields statistically significantly better results. We improve two of the best local search solvers, AdaptG2WSAT and Sparrow. PXSAT combined with AdaptG2WSAT also outperforms CCLS, winner of several recent MAX-SAT competitions. The other structural property we study is subproblem constrainedness. We observe that, on some classes of application SAT instances, the original problem can be partitioned into several subproblems, each of which is highly constrained. While subproblem constrainedness has been exploited in SS solvers before, we propose to exploit it in SLS solvers using two alternative representations that can be obtained efficiently from the canonical CNF representation. Our empirical results show that these alternative representations enable a simple SLS solver to outperform several sophisticated and highly optimized SLS solvers on the SAT encoding of the semiprime factoring problem.
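To make the Partition Crossover step concrete, the following is a minimal Python sketch of how PX can recombine two local-search assignments: the variables on which the parents disagree are grouped into connected components via the clauses that link them, each clause is charged to the single component it touches, and the better parent is taken independently inside every component, which recovers the best of the 2^q recombinations in one linear pass over the clauses. The clause and assignment encodings and the partition_crossover helper are illustrative assumptions, not the thesis's actual PXSAT implementation.

```python
from collections import defaultdict

def partition_crossover(clauses, p1, p2):
    """Sketch of Partition Crossover (PX) for MAX-SAT.

    clauses: list of clauses, each a list of non-zero ints (DIMACS-style literals).
    p1, p2:  parent assignments, dicts mapping variable -> bool (same key set).
    Returns the best of the 2**q recombinations, where q is the number of
    connected components induced by the differing variables.
    """
    diff = {v for v in p1 if p1[v] != p2[v]}

    # Union-find over differing variables; each clause links the differing
    # variables it contains into one component.
    parent = {v: v for v in diff}
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v
    def union(a, b):
        parent[find(a)] = find(b)

    for clause in clauses:
        vs = [abs(l) for l in clause if abs(l) in diff]
        for a, b in zip(vs, vs[1:]):
            union(a, b)

    def satisfied(clause, assign):
        return any(assign[abs(l)] == (l > 0) for l in clause)

    # Each clause touches differing variables from at most one component, so
    # per-component scores accumulate independently (the key to linear time).
    score = defaultdict(lambda: [0, 0])   # component -> [unsat under p1, unsat under p2]
    for clause in clauses:
        vs = [abs(l) for l in clause if abs(l) in diff]
        if not vs:
            continue
        comp = find(vs[0])
        score[comp][0] += not satisfied(clause, p1)
        score[comp][1] += not satisfied(clause, p2)

    # Greedily take the better parent inside every component.
    child = dict(p1)
    for v in diff:
        unsat1, unsat2 = score[find(v)]
        child[v] = p2[v] if unsat2 < unsat1 else p1[v]
    return child
```

Because the components contribute to the objective independently, choosing the better parent per component is equivalent to enumerating all 2^q offspring, which is what makes the operator attractive inside an SLS loop.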
my friend "Joe" as opposed to the basic level of "face") while objects are processed at the basic level (e.g. car but not the subordinate level of "Nissan Sentra"). However, there has been little research exploring how the processing of expressions may be influenced by category processing. Happy, neutral and sad expressions were presented in isolation for Experiment 1 to facilitate processing of expressions on the basic level (faces are all unfamiliar with the most basic changes being only in expression) while the same expressions were presented alongside cars, houses and butterflies in Experiment 2 to facilitate subordinate processing (basic level: faces vs. objects; subordinate level: happy, neutral and sad expressions and cars, houses and butterflies). Experiment 1 found P1 and N170 modulations by happy, neutral and sad expressions that were not influenced by implicit or explicit processing condition with no such modulations in Experiment 2. Additionally, there were early modulations of ERPs related to expression in both experiments in the 30-80ms range with explicit processing mediating face and object differences found in the 30-80ms range for Experiment 2. The results of the current study support the Haxby, Hoffman, and Gobbini model where expression perception mechanisms can modulate early ERP components reflecting initial face perception and also show that this modulation depends on the presence or absence of comparison object stimuli. When comparison stimuli were not present (Experiment 1), expressions processed as a basic level stimulus category mainly influenced ERPs in the 140-400ms time range reflecting enhanced processing of the specific expression. When comparison object stimuli were present (Experiment 2), expressions processed as a subordinate stimulus category mainly influenced ERPs in the 30-140ms time range reflecting quicker categorization due to the presence of object stimuli rather than processing of the specific emotional expression.Item Open Access Face detection using correlation filters(Colorado State University. Libraries, 2013) Teli, Mohammad Nayeem, author; Beveridge, J. Ross, advisor; Draper, Bruce A., committee member; Howe, Adele, committee member; Givens, Geof H., committee memberCameras are ubiquitous and available all around us. As a result, images and videos are posted online in huge numbers. These images often need to be stored and analyzed. This requires the use of various computer vision applications that includes detection of human faces in these images and videos. The emphasis on face detection is evident from the applications found in everyday point and shoot cameras for a better focus, on social networking sites for tagging friends and family and for security situations which subsequently require face recognition or verification. This thesis focuses on detecting human faces in still images and video frames using correlation filters. These correlation filters are trained using a recent technique called Minimum Output Sum of Squared Error (MOSSE) developed by Bolme et al. Since correlation filters identify only a peak location, it only helps in localizing a single target point. In this thesis, I develop techniques to use this localization for detection of human faces of different scales and poses in uncontrolled background, location and lighting conditions. The goal of this research is to extend correlation filters for face detection and identify the scenarios where its potential is the most. 
The specific contributions of this work are the development of a novel face detector using correlation filters and the identification of the strengths and weaknesses of this approach. This approach is applied to an easy dataset and a hard dataset to emphasize the efficacy of correlations filters for face detection. This technique shows 95.6% accuracy in finding the exact location of the faces in images with controlled background and lighting. Although, the results on a hard dataset were not better than the OpenCV Viola and Jones face detector, it showed much better results, 81.5% detection rate compared to 69.43% detection rate by the Viola and Jones face detector, when tested on a customized dataset that was controlled for location change between training and test datasets. This result signifies the strength of a correlation based face detector in a specific scenario with uniform setting, such as a building entrance or an airport security gate.
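As a rough illustration of the building block this thesis starts from, the following Python/NumPy sketch shows MOSSE filter training and peak-based localization in the spirit of Bolme et al.: the filter is fit in the Fourier domain against Gaussian-peaked target outputs, and at detection time the response map's peak location and peak-to-sidelobe ratio (PSR) indicate where, and how confidently, a face is centered. The function names, preprocessing constants, and the simple PSR computation are assumptions made for illustration; the thesis itself extends this single-peak localization to multiple scales and poses.

```python
import numpy as np

def preprocess(patch):
    """Log transform, zero-mean/unit-variance normalization, cosine window."""
    p = np.log(patch.astype(float) + 1.0)
    p = (p - p.mean()) / (p.std() + 1e-5)
    win = np.outer(np.hanning(p.shape[0]), np.hanning(p.shape[1]))
    return p * win

def train_mosse(face_patches, sigma=2.0, eps=1e-5):
    """Train a MOSSE correlation filter from equally sized, face-centered patches.

    Returns the filter in the Fourier domain, ready to multiply with the FFT
    of a preprocessed window to produce a correlation response.
    """
    h, w = face_patches[0].shape
    # Desired output: a 2-D Gaussian peaked at the patch center.
    ys, xs = np.mgrid[0:h, 0:w]
    g = np.exp(-((xs - w // 2) ** 2 + (ys - h // 2) ** 2) / (2.0 * sigma ** 2))
    G = np.fft.fft2(g)

    num = np.zeros((h, w), dtype=complex)
    den = np.zeros((h, w), dtype=complex)
    for patch in face_patches:
        F = np.fft.fft2(preprocess(patch))
        num += G * np.conj(F)     # accumulate correlation with the target output
        den += F * np.conj(F)     # accumulate input energy spectrum
    return num / (den + eps)

def detect_peak(H, window):
    """Correlate a candidate window with the filter; return (row, col, psr).

    A high peak-to-sidelobe ratio suggests a face centered at (row, col).
    """
    response = np.real(np.fft.ifft2(H * np.fft.fft2(preprocess(window))))
    r, c = np.unravel_index(np.argmax(response), response.shape)
    peak = response[r, c]
    # Crude sidelobe statistics: everything except the peak pixel itself.
    side = np.delete(response.flatten(), r * response.shape[1] + c)
    psr = (peak - side.mean()) / (side.std() + 1e-5)
    return r, c, psr
```

In a detector built this way, the PSR threshold and the sliding of windows across scales are the design choices that turn a single-peak localizer into the multi-face, multi-scale detection the abstract describes.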