Interaction and navigation in cross-reality analytics
dc.contributor.author | Zhou, Xiaoyan, author
dc.contributor.author | Ortega, Francisco, advisor
dc.contributor.author | Ray, Indrakshi, committee member
dc.contributor.author | Moraes, Marcia, committee member
dc.contributor.author | Batmaz, Anil Ufuk, committee member
dc.contributor.author | Malinin, Laura, committee member
dc.date.accessioned | 2024-09-09T20:52:04Z
dc.date.available | 2025-08-16
dc.date.issued | 2024
dc.description.abstract | Along with the rapid evolution of immersive display technology, augmented reality (AR) and virtual reality (VR) are increasingly being researched to facilitate data analytics, a field known as Immersive Analytics. The ability to interact with data visualizations in the space around users not only builds the foundation of ubiquitous analytics but also assists users in making sense of the data. However, interaction and navigation while making sense of 3D data visualizations in different realities still need to be better understood and explored. For example, what are the differences between users interacting in augmented and virtual reality, and how can we best utilize them during analysis tasks? Moreover, based on existing work and our preliminary studies, interaction efficiency with immersive displays still needs improvement. Therefore, this thesis focuses on understanding interaction and navigation in augmented reality and virtual reality for immersive analytics. First, we explored how users interact with multiple objects in augmented reality using the "Wizard of Oz" study approach. We elicited multimodal interactions involving hand gestures and speech, with text prompts shown on the head-mounted display. We then compared the results with previous work in a single-object scenario, which helped us better understand how users prefer to interact in a more complex AR environment. Second, we built an immersive analytics platform in both AR and VR environments to simulate a realistic scenario and conducted a controlled study to evaluate user performance with the designed analysis tools and 3D data visualization. Based on the results, interaction and navigation patterns were observed and analyzed to better understand user preferences during the sensemaking process. Lastly, drawing on the findings and insights from the prior studies, we developed a hybrid user interface in simulated cross-reality for situated analytics. An exploratory study was conducted in a smart home setting to understand user interaction and navigation in a more familiar scenario with practical tasks. We then performed a thorough qualitative analysis of the feedback and video recordings to reveal user preferences for interaction and visualization in situated analytics in an everyday decision-making scenario. In conclusion, this thesis uncovered user-designed multimodal interaction, including mid-air hand gestures and speech, for AR; users' interaction and navigation strategies in immersive analytics in both AR and VR; and hybrid user interface usage in situated analytics for assisting decision-making. The findings and insights in this thesis provide guidelines and inspiration for future research on interaction and navigation design and on improving the user experience of analytics in mixed-reality environments.
dc.format.medium | born digital
dc.format.medium | doctoral dissertations
dc.identifier | Zhou_colostate_0053A_18416.pdf
dc.identifier.uri | https://hdl.handle.net/10217/239220
dc.language | English
dc.language.iso | eng
dc.publisher | Colorado State University. Libraries
dc.relation.ispartof | 2020-
dc.rights | Copyright and other restrictions may apply. User is responsible for compliance with all applicable laws. For information about copyright law, please see https://libguides.colostate.edu/copyright.
dc.rights.access | Embargo expires: 08/16/2025.
dc.subject | cross reality
dc.subject | immersive analytics
dc.subject | virtual reality
dc.subject | hybrid user interface
dc.subject | augmented reality
dc.subject | situated analytics
dc.title | Interaction and navigation in cross-reality analytics
dc.type | Text
dcterms.embargo.expires | 2025-08-16
dcterms.embargo.terms | 2025-08-16
dcterms.rights.dpla | This Item is protected by copyright and/or related rights (https://rightsstatements.org/vocab/InC/1.0/). You are free to use this Item in any way that is permitted by the copyright and related rights legislation that applies to your use. For other uses you need to obtain permission from the rights-holder(s).
thesis.degree.discipline | Computer Science
thesis.degree.grantor | Colorado State University
thesis.degree.level | Doctoral
thesis.degree.name | Doctor of Philosophy (Ph.D.) |
Files
Original bundle (1 of 1)
- Name: Zhou_colostate_0053A_18416_rev.pdf
- Size: 4.04 MB
- Format: Adobe Portable Document Format