AHCI RESEARCH GROUP
Publications
Papers published in international journals, proceedings of conferences, workshops and books.
2025
Li, J.; Neshaei, S. P.; Müller, L.; Rietsche, R.; Davis, R. L.; Wambsganss, T.
SpatiaLearn: Exploring XR Learning Environments for Reflective Writing Proceedings Article
In: Conf Hum Fact Comput Syst Proc, Association for Computing Machinery, 2025, ISBN: 979-8-4007-1395-8.
Tags: Adaptive Education, Conversational Agents, Conversational Tutoring, Critical thinking, Extended reality (XR), Immersive, Learning Environments, Metacognitive awareness, Reflective writing, Spatial computing
@inproceedings{li_spatialearn_2025,
title = {SpatiaLearn: Exploring XR Learning Environments for Reflective Writing},
author = {J. Li and S. P. Neshaei and L. Müller and R. Rietsche and R. L. Davis and T. Wambsganss},
url = {https://www.scopus.com/inward/record.uri?eid=2-s2.0-105005757843&doi=10.1145%2f3706599.3719742&partnerID=40&md5=6e9ce83d3508cb377e209edd6884c505},
doi = {10.1145/3706599.3719742},
isbn = {979-8-4007-1395-8},
year = {2025},
date = {2025-01-01},
booktitle = {Conf Hum Fact Comput Syst Proc},
publisher = {Association for Computing Machinery},
abstract = {Reflective writing promotes deeper learning by enhancing metacognitive awareness and critical thinking, but learners often struggle with structuring their reflections and maintaining focus. Generative AI and advances in spatial computing offer promising solutions. Extended reality (XR) environments create immersive, distraction-free settings, while conversational agents use dialog-based scaffolding guides to structure learners’ thoughts. However, research on combining dialog-based scaffolding with XR for reflective writing remains limited. To address this, we introduce SpatiaLearn, an adaptive XR tool that enhances reflective writing through conversational guidance in both traditional and immersive environments. A within-subjects study (N = 19) compared participants’ performance in traditional laptop and XR environments. Qualitative analysis shows the spatial interface enhances engagement but raises challenges like unfamiliar interactions and health concerns, requiring task adaptation for XR. This study advances the design of immersive tools for reflective writing, highlighting both the opportunities and challenges of spatial interfaces. © 2025 Copyright held by the owner/author(s).},
keywords = {Adaptive Education, Conversational Agents, Conversational Tutoring, Critical thinking, Extended reality (XR), Immersive, Learning Environments, Metacognitive awareness, Reflective writing, Spatial computing},
pubstate = {published},
tppubtype = {inproceedings}
}
2024
Nebeling, M.; Oki, M.; Gelsomini, M.; Hayes, G. R.; Billinghurst, M.; Suzuki, K.; Graf, R.
Designing Inclusive Future Augmented Realities Proceedings Article
In: Conf Hum Fact Comput Syst Proc, Association for Computing Machinery, 2024, ISBN: 979-8-4007-0331-7.
Tags: Accessible and inclusive design, Augmented Reality, Augmented reality technology, Display technologies, Generative AI, Inclusive design, Interactive computer graphics, Mixed reality, Mixed reality technologies, Rapid prototyping, Rapid-prototyping, Sensing technology, Spatial computing
@inproceedings{nebeling_designing_2024,
title = {Designing Inclusive Future Augmented Realities},
author = {M. Nebeling and M. Oki and M. Gelsomini and G. R. Hayes and M. Billinghurst and K. Suzuki and R. Graf},
url = {https://www.scopus.com/inward/record.uri?eid=2-s2.0-85194176929&doi=10.1145%2f3613905.3636313&partnerID=40&md5=411b65058a4c96149182237aa586fa75},
doi = {10.1145/3613905.3636313},
isbn = {979-8-4007-0331-7},
year = {2024},
date = {2024-01-01},
booktitle = {Conf Hum Fact Comput Syst Proc},
publisher = {Association for Computing Machinery},
abstract = {Augmented and mixed reality technology is rapidly advancing, driven by innovations in display, sensing, and AI technologies. This evolution, particularly in the era of generative AI with large language and text-to-image models such as GPT and Stable Diffusion, has the potential, not only to make it easier to create, but also to adapt and personalize, new content. Our workshop explores the pivotal role of augmented and mixed reality to shape a user's interactions with their physical surroundings. We aim to explore how inclusive future augmented realities can be designed, with increasing support for automation, such that environments can welcome users with different needs, emphasizing accessibility and inclusion through layers of augmentations. Our aim is not only to remove barriers by providing accommodations, but also to create a sense of belonging by directly engaging users. Our workshop consists of three main activities: (1) Through brainstorming and discussion of examples provided by the workshop organizers and participants, we critically review the landscape of accessible and inclusive design and their vital role in augmented and mixed reality experiences. (2) Through rapid prototyping activities including bodystorming and low-fidelity, mixed-media prototypes, participants explore how augmented and mixed reality can transform physical space into a more personal place, enhancing accessibility and inclusion based on novel interface and interaction techniques that are desirable, but not necessarily technically feasible just yet. In the workshop, we plan to focus on physical space to facilitate rapid prototyping without technical constraints, but techniques developed in the workshop are likely applicable to immersive virtual environments as well. (3) Finally, we collaborate to outline a research agenda for designing future augmented realities that promote equal opportunities, benefiting diverse user populations. Our workshop inspires innovation in augmented and mixed reality, reshaping physical environments to be more accessible and inclusive through immersive design. © 2024 Owner/Author.},
keywords = {Accessible and inclusive design, Augmented Reality, Augmented reality technology, Display technologies, Generative AI, Inclusive design, Interactive computer graphics, Mixed reality, Mixed reality technologies, Rapid prototyping, Rapid-prototyping, Sensing technology, Spatial computing},
pubstate = {published},
tppubtype = {inproceedings}
}
Tang, Y.; Situ, J.; Huang, Y.
Beyond User Experience: Technical and Contextual Metrics for Large Language Models in Extended Reality Proceedings Article
In: UbiComp Companion - Companion ACM Int. Jt. Conf. Pervasive Ubiquitous Comput., pp. 640–643, Association for Computing Machinery, Inc, 2024, ISBN: 979-8-4007-1058-2.
Tags: Augmented Reality, Computer simulation languages, Evaluation Metrics, Extended reality, Language Model, Large language model, large language models, Mixed reality, Modeling performance, Natural language processing systems, Physical world, Spatial computing, spatial data, user experience, Users' experiences, Virtual environments, Virtual Reality
@inproceedings{tang_beyond_2024,
title = {Beyond User Experience: Technical and Contextual Metrics for Large Language Models in Extended Reality},
author = {Y. Tang and J. Situ and Y. Huang},
url = {https://www.scopus.com/inward/record.uri?eid=2-s2.0-85206203437&doi=10.1145%2f3675094.3678995&partnerID=40&md5=3fb337872b483a163bfbea038f1baffe},
doi = {10.1145/3675094.3678995},
isbn = {979-8-4007-1058-2},
year = {2024},
date = {2024-01-01},
booktitle = {UbiComp Companion - Companion ACM Int. Jt. Conf. Pervasive Ubiquitous Comput.},
pages = {640–643},
publisher = {Association for Computing Machinery, Inc},
abstract = {Spatial Computing involves interacting with the physical world through spatial data manipulation, closely linked with Extended Reality (XR), which includes Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR). Large Language Models (LLMs) significantly enhance XR applications by improving user interactions through natural language understanding and content generation. Typical evaluations of these applications focus on user experience (UX) metrics, such as task performance, user satisfaction, and psychological assessments, but often neglect the technical performance of the LLMs themselves. This paper identifies significant gaps in current evaluation practices for LLMs within XR environments, attributing them to the novelty of the field, the complexity of spatial contexts, and the multimodal nature of interactions in XR. To address these gaps, the paper proposes specific metrics tailored to evaluate LLM performance in XR contexts, including spatial contextual awareness, coherence, proactivity, multimodal integration, hallucination, and question-answering accuracy. These proposed metrics aim to complement existing UX evaluations, providing a comprehensive assessment framework that captures both the technical and user-centric aspects of LLM performance in XR applications. The conclusion underscores the necessity for a dual-focused approach that combines technical and UX metrics to ensure effective and user-friendly LLM-integrated XR systems. © 2024 Copyright held by the owner/author(s).},
keywords = {Augmented Reality, Computer simulation languages, Evaluation Metrics, Extended reality, Language Model, Large language model, large language models, Mixed reality, Modeling performance, Natural language processing systems, Physical world, Spatial computing, spatial data, user experience, Users' experiences, Virtual environments, Virtual Reality},
pubstate = {published},
tppubtype = {inproceedings}
}
Klein, A.; Arnowitz, E.
AI in mixed reality - Copilot on HoloLens: Spatial computing with large language models Proceedings Article
In: Spencer, S. N. (Ed.): Proc. - SIGGRAPH Real-Time Live!, Association for Computing Machinery, Inc, 2024, ISBN: 979-8-4007-0526-7.
Tags: 3D, AI, AR, Gesture, Gestures, HoloLens, Language Model, LLM, Mixed reality, Real-time, Spatial computing, User experience design, User interfaces, Voice
@inproceedings{klein_ai_2024,
title = {AI in mixed reality - Copilot on HoloLens: Spatial computing with large language models},
author = {A. Klein and E. Arnowitz},
editor = {S. N. Spencer},
url = {https://www.scopus.com/inward/record.uri?eid=2-s2.0-85200657459&doi=10.1145%2f3641520.3665305&partnerID=40&md5=07d385771b8813c1fafa0efb7ae7e9f2},
doi = {10.1145/3641520.3665305},
isbn = {979-8-4007-0526-7},
year = {2024},
date = {2024-01-01},
booktitle = {Proc. - SIGGRAPH Real-Time Live!},
publisher = {Association for Computing Machinery, Inc},
abstract = {Mixed reality together with AI presents a human-first interface that promises to transform operations. Copilot can assist industrial workers in real-time with speech and holograms; generative AI is used to search technical documentation, service records, training content, and other sources. Copilot then summarizes to provide interactive guidance. © 2024 Owner/Author.},
keywords = {3D, AI, AR, Gesture, Gestures, HoloLens, Language Model, LLM, Mixed reality, Real-time, Spatial computing, User experience design, User interfaces, Voice},
pubstate = {published},
tppubtype = {inproceedings}
}