Lights, headset, tablet, action: exploring the use of hybrid user interfaces for immersive situated analytics

dc.contributor.author: Zhou, Xiaoyan, author
dc.contributor.author: Lee, Benjamin, author
dc.contributor.author: Ortega, Francisco R., author
dc.contributor.author: Batmaz, Anil Ufuk, author
dc.contributor.author: Yang, Yalong, author
dc.contributor.author: ACM, publisher
dc.date.accessioned: 2025-03-13T18:31:28Z
dc.date.available: 2025-03-13T18:31:28Z
dc.date.issued: 2024-10-24
dc.description.abstract: While augmented reality (AR) headsets provide entirely new ways of seeing and interacting with data, traditional computing devices can play a symbiotic role when used in conjunction with AR as a hybrid user interface. A promising use case for this setup is situated analytics. AR can provide embedded views that are integrated with their physical referents, and a separate device such as a tablet can provide a familiar situated overview of the entire dataset being examined. While prior work has explored similar setups, we sought to understand how people perceive and make use of visualizations presented both as embedded visualizations (in AR) and as situated visualizations (on a tablet) to achieve their own goals. To this end, we conducted an exploratory study using a scenario and task familiar to most: adjusting light levels in a smart home based on personal preference and energy usage. In a prototype that simulates AR in virtual reality, embedded visualizations are positioned next to lights distributed across an apartment, and situated visualizations are provided on a handheld tablet. We observed and interviewed 19 participants using the prototype. Participants were easily able to perform the task, though the extent to which they used the visualizations during the task varied, with some making decisions based on the data and others only on their own preferences. Our findings also suggest two distinct roles that situated and embedded visualizations can play, and how this clear separation might improve user satisfaction and minimize attention-switching overheads in this hybrid user interface setup. We conclude by discussing the importance of considering the user's needs, goals, and the physical environment when designing and evaluating effective situated analytics applications.
dc.format.medium: born digital
dc.format.medium: articles
dc.identifier.bibliographicCitation: Xiaoyan Zhou, Benjamin Lee, Francisco R. Ortega, Anil Ufuk Batmaz, and Yalong Yang. 2024. Lights, Headset, Tablet, Action: Exploring the Use of Hybrid User Interfaces for Immersive Situated Analytics. Proc. ACM Hum.-Comput. Interact. 8, ISS, Article 547 (December 2024), 23 pages. https://doi.org/10.1145/3698147
dc.identifier.doi: https://doi.org/10.1145/3698147
dc.identifier.uri: https://hdl.handle.net/10217/240171
dc.language: English
dc.language.iso: eng
dc.publisher: Colorado State University. Libraries
dc.relation.ispartof: Publications
dc.relation.ispartof: ACM DL Digital Library
dc.rights: © Xiaoyan Zhou, et al. ACM 2024. This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in Proceedings of the ACM on Human-Computer Interaction, Volume 8, Issue ISS (December 2024), https://dx.doi.org/10.1145/3698147.
dc.subject: virtual and augmented reality
dc.subject: visualization
dc.subject: situated analytics
dc.subject: hybrid user interfaces
dc.subject: immersive analytics
dc.title: Lights, headset, tablet, action: exploring the use of hybrid user interfaces for immersive situated analytics
dc.type: Text

Files

Original bundle

Name: FACF_ACMOA_3698147.pdf
Size: 5.48 MB
Format: Adobe Portable Document Format