Browsing by Author "Plabst, Lucas, author"
Now showing 1 - 2 of 2
Item Open Access
Exploring unimodal notification interaction and display methods in augmented reality (Colorado State University. Libraries, 2023-10-09)
Plabst, Lucas, author; Raikwar, Aditya, author; Oberdörfer, Sebastian, author; Ortega, Francisco, author; Niebling, Florian, author; ACM, publisher
As we develop computing platforms for augmented reality (AR) head-mounted display (HMD) technologies in social or workplace environments, understanding how users interact with notifications in immersive environments has become crucial. We researched the effectiveness of, and user preferences for, different interaction modalities for notifications, along with two types of notification display methods. In our study, participants were immersed in a simulated cooking environment using an AR HMD, where they had to fulfill customer orders. During the cooking process, participants received notifications related to customer orders and ingredient updates. They were given three interaction modes for those notifications: voice commands, eye gaze and dwell, and hand gestures. To manage multiple notifications at once, we also researched two different notification list displays, one attached to the user's hand and one anchored in the world. Results indicate that participants preferred using their hands to interact with notifications and having the list of notifications attached to their hands. Voice and gaze interaction were perceived as having lower usability than touch.

Item Open Access
Segmentation and immersive visualization of brain lesions using deep learning and virtual reality (Colorado State University. Libraries, 2025-01-19)
Kelley, Brendan, author; Plabst, Lucas, author; Plabst, Lena, author; ACM, publisher
Magnetic resonance imaging (MRI) scans are commonly used for diagnosing potential neurological disorders; however, preparation and interpretation of MRI scans require professional oversight. Additionally, MRIs are typically viewed as single cross sections of the affected regions, which do not always capture the full picture of brain lesions and can be difficult to understand due to 2D imaging's inherent abstraction of our 3D world. To address these challenges, we propose an immersive visualization pipeline that combines deep learning image segmentation, using a VGG-16 model trained on fluid-attenuated inversion recovery (FLAIR) MRI scans, with virtual reality (VR) immersive analytics. Our visualization pipeline begins with our VGG-16 model predicting which regions of the brain are potentially affected by a disease. This output, along with the original scan, is then volumetrically rendered. These renders can be viewed in VR using a head-mounted display (HMD). Within the HMD, users can move through the volumetric renderings to view the affected regions and use planes to view cross sections of the MRI scans. Our work provides a potential pipeline and tool for diagnosis and care.
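The segment-then-render flow described in this abstract can be sketched in a few lines. This is a minimal illustration, not the authors' code: the VGG-16 inference step is stubbed with a simple intensity threshold, and the function names (`segment_lesions`, `build_render_volume`) are hypothetical.

```python
import numpy as np

def segment_lesions(flair_volume, threshold=0.8):
    """Stand-in for the VGG-16 prediction step: returns a binary
    lesion mask with the same shape as the input FLAIR volume.
    (The real pipeline runs the scan through a trained network;
    here we threshold normalized voxel intensity instead.)"""
    v = flair_volume.astype(np.float32)
    v = (v - v.min()) / (np.ptp(v) + 1e-8)   # normalize to [0, 1]
    return (v > threshold).astype(np.uint8)

def build_render_volume(flair_volume, mask, highlight=2.0):
    """Combine the original scan and the predicted mask into one
    volume for volumetric rendering: voxels flagged as lesion are
    intensity-boosted so they stand out in the HMD view."""
    out = flair_volume.astype(np.float32).copy()
    out[mask == 1] *= highlight
    return out

# Toy 3-slice "scan": one bright voxel plays the role of a lesion.
scan = np.zeros((3, 4, 4), dtype=np.float32)
scan[1, 2, 2] = 10.0
mask = segment_lesions(scan)
render = build_render_volume(scan, mask)
```

A real renderer (e.g. a ray-marcher in the VR engine) would consume `render` as a 3D texture; the cross-section planes mentioned in the abstract correspond to slicing this same array along arbitrary axes.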