Exploring unimodal notification interaction and display methods in augmented reality

dc.contributor.author: Plabst, Lucas, author
dc.contributor.author: Raikwar, Aditya, author
dc.contributor.author: Oberdörfer, Sebastian, author
dc.contributor.author: Ortega, Francisco, author
dc.contributor.author: Niebling, Florian, author
dc.contributor.author: ACM, publisher
dc.date.accessioned: 2024-11-11T19:34:34Z
dc.date.available: 2024-11-11T19:34:34Z
dc.date.issued: 2023-10-09
dc.description.abstract: As we develop computing platforms for augmented reality (AR) head-mounted displays (HMDs) for social and workplace environments, understanding how users interact with notifications in immersive environments has become crucial. We researched the effectiveness of, and user preferences for, different interaction modalities for notifications, along with two types of notification display methods. In our study, participants were immersed in a simulated cooking environment using an AR-HMD, where they had to fulfill customer orders. During the cooking process, participants received notifications related to customer orders and ingredient updates. They were given three interaction modes for those notifications: voice commands, eye gaze and dwell, and hand gestures. To manage multiple notifications at once, we also studied two notification list displays, one attached to the user's hand and one anchored in the world. Results indicate that participants preferred using their hands to interact with notifications and having the list of notifications attached to their hands. Voice and gaze interaction were perceived as having lower usability than touch.
dc.format.medium: born digital
dc.format.medium: articles
dc.identifier.bibliographicCitation: Lucas Plabst, Aditya Raikwar, Sebastian Oberdörfer, Francisco Ortega, and Florian Niebling. 2023. Exploring Unimodal Notification Interaction and Display Methods in Augmented Reality. In 29th ACM Symposium on Virtual Reality Software and Technology (VRST 2023), October 09–11, 2023, Christchurch, New Zealand. ACM, New York, NY, USA, 11 pages. https://doi.org/10.1145/3611659.3615683
dc.identifier.doi: https://doi.org/10.1145/3611659.3615683
dc.identifier.uri: https://hdl.handle.net/10217/239537
dc.language: English
dc.language.iso: eng
dc.publisher: Colorado State University. Libraries
dc.relation.ispartof: Publications
dc.relation.ispartof: ACM DL Digital Library
dc.rights: © Lucas Plabst, et al. ACM 2023. This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in VRST 2023, https://dx.doi.org/10.1145/3611659.3615683.
dc.subject: augmented reality
dc.subject: interaction
dc.subject: eye gaze
dc.subject: voice commands
dc.subject: notifications
dc.subject: display methods
dc.title: Exploring unimodal notification interaction and display methods in augmented reality
dc.type: Text

Files

Original bundle
Name: FACF_ACMOA_3611659.3615683.pdf
Size: 3.46 MB
Format: Adobe Portable Document Format