Exploring unimodal notification interaction and display methods in augmented reality
dc.contributor.author | Plabst, Lucas, author | |
dc.contributor.author | Raikwar, Aditya, author | |
dc.contributor.author | Oberdörfer, Sebastian, author | |
dc.contributor.author | Ortega, Francisco, author | |
dc.contributor.author | Niebling, Florian, author | |
dc.contributor.author | ACM, publisher | |
dc.date.accessioned | 2024-11-11T19:34:34Z | |
dc.date.available | 2024-11-11T19:34:34Z | |
dc.date.issued | 2023-10-09 | |
dc.description.abstract | As we develop computing platforms for augmented reality (AR) head-mounted display (HMD) technologies in social and workplace environments, understanding how users interact with notifications in immersive environments has become crucial. We studied the effectiveness of, and user preferences for, different interaction modalities for notifications, along with two types of notification display methods. In our study, participants were immersed in a simulated cooking environment using an AR-HMD, where they had to fulfill customer orders. During the cooking process, participants received notifications related to customer orders and ingredient updates. They were given three interaction modes for these notifications: voice commands, eye gaze with dwell, and hand gestures. To manage multiple notifications at once, we also compared two notification list displays, one attached to the user's hand and one anchored in the world. Results indicate that participants preferred using their hands to interact with notifications and having the list of notifications attached to their hands. Voice and gaze interaction were perceived as having lower usability than touch. | |
dc.format.medium | born digital | |
dc.format.medium | articles | |
dc.identifier.bibliographicCitation | Lucas Plabst, Aditya Raikwar, Sebastian Oberdörfer, Francisco Ortega, and Florian Niebling. 2023. Exploring Unimodal Notification Interaction and Display Methods in Augmented Reality. In 29th ACM Symposium on Virtual Reality Software and Technology (VRST 2023), October 09–11, 2023, Christchurch, New Zealand. ACM, New York, NY, USA, 11 pages. https://doi.org/10.1145/3611659.3615683 | |
dc.identifier.doi | https://doi.org/10.1145/3611659.3615683 | |
dc.identifier.uri | https://hdl.handle.net/10217/239537 | |
dc.language | English | |
dc.language.iso | eng | |
dc.publisher | Colorado State University. Libraries | |
dc.relation.ispartof | Publications | |
dc.relation.ispartof | ACM DL Digital Library | |
dc.rights | © Lucas Plabst, et al., ACM 2023. This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in VRST 2023, https://dx.doi.org/10.1145/3611659.3615683. | |
dc.subject | augmented reality | |
dc.subject | interaction | |
dc.subject | eye gaze | |
dc.subject | voice commands | |
dc.subject | notifications | |
dc.subject | display methods | |
dc.title | Exploring unimodal notification interaction and display methods in augmented reality | |
dc.type | Text |
Files
Original bundle
- Name: FACF_ACMOA_3611659.3615683.pdf
- Size: 3.46 MB
- Format: Adobe Portable Document Format