AHCI RESEARCH GROUP
Publications
Papers published in international journals,
conference and workshop proceedings, and books.
OUR RESEARCH
Scientific Publications
How to
You can use the tag cloud to select only the papers dealing with specific research topics.
You can expand the Abstract, Links, and BibTeX record for each paper.
2025
Volkova, S.; Nguyen, D.; Penafiel, L.; Kao, H. -T.; Cohen, M.; Engberson, G.; Cassani, L.; Almutairi, M.; Chiang, C.; Banerjee, N.; Belcher, M.; Ford, T. W.; Yankoski, M. G.; Weninger, T.; Gomez-Zara, D.; Rebensky, S.
VirTLab: Augmented Intelligence for Modeling and Evaluating Human-AI Teaming Through Agent Interactions Proceedings Article
In: Sottilare, R. A.; Schwarz, J. (Eds.): Lecture Notes in Computer Science, vol. 15813, pp. 279–301, Springer Science and Business Media Deutschland GmbH, 2025, ISSN: 0302-9743; ISBN: 978-3-031-92969-4.
Abstract | Links | BibTeX | Tags: Agent based simulation, agent-based simulation, Augmented Reality, Causal analysis, HAT processes and states, Human digital twin, human digital twins, Human-AI team process and state, Human-AI teaming, Intelligent virtual agents, Operational readiness, Personnel training, Team performance, Team process, Virtual teaming, Visual analytics
@inproceedings{volkova_virtlab_2025,
title = {VirTLab: Augmented Intelligence for Modeling and Evaluating Human-AI Teaming Through Agent Interactions},
author = {S. Volkova and D. Nguyen and L. Penafiel and H. -T. Kao and M. Cohen and G. Engberson and L. Cassani and M. Almutairi and C. Chiang and N. Banerjee and M. Belcher and T. W. Ford and M. G. Yankoski and T. Weninger and D. Gomez-Zara and S. Rebensky},
editor = {Sottilare, R. A. and Schwarz, J.},
url = {https://www.scopus.com/inward/record.uri?eid=2-s2.0-105007830752&doi=10.1007%2f978-3-031-92970-0_20&partnerID=40&md5=c578dc95176a617f6de2a1c6f998f73f},
doi = {10.1007/978-3-031-92970-0_20},
issn = {0302-9743},
isbn = {978-3-031-92969-4},
year = {2025},
date = {2025-01-01},
booktitle = {Lecture Notes in Computer Science},
volume = {15813},
pages = {279–301},
publisher = {Springer Science and Business Media Deutschland GmbH},
abstract = {This paper introduces VirTLab (Virtual Teaming Laboratory), a novel augmented intelligence platform designed to simulate and analyze interactions between human-AI teams (HATs) through the use of human digital twins (HDTs) and AI agents. VirTLab enhances operational readiness by systematically analyzing HAT dynamics, fostering trust development, and providing actionable recommendations to improve team performance outcomes. VirTLab combines agents driven by large language models (LLM) interacting in a simulated environment with integrated HAT performance measures obtained using interactive visual analytics. VirTLab integrates four key components: (1) HDTs with configurable profiles, (2) operational AI teammates, (3) a simulation engine that enforces temporal and spatial environment constraints, ensures situational awareness, and coordinates events between HDT and AI agents to deliver high-fidelity simulations, and (4) an evaluation platform that validates simulations against ground truth and enables exploration of how HDTs and AI attributes influence HAT functioning. We demonstrate VirTLab’s capabilities through focused experiments examining how variations in HDT openness, agreeableness, propensity to trust, and AI reliability and transparency influence HAT performance. Our HAT performance evaluation framework incorporates both objective measures such as communication patterns and mission completion, and subjective measures to include perceived trust and team coordination. Results on search and rescue missions reveal that AI teammate reliability significantly impacts communication dynamics and team assistance behaviors, whereas HDT personality traits influence trust development and team coordination – insights that directly inform the design of HAT training programs. VirTLab enables instructional designers to explore interventions in HAT behaviors through controlled experiments and causal analysis, leading to improved HAT performance. Visual analytics support the examination of HAT functioning across different conditions, allowing for real-time assessment and adaptation of scenarios. VirTLab contributes to operational readiness by preparing human operators to work seamlessly with AI counterparts in real-world situations. © The Author(s), under exclusive license to Springer Nature Switzerland AG 2025.},
keywords = {Agent based simulation, agent-based simulation, Augmented Reality, Causal analysis, HAT processes and states, Human digital twin, human digital twins, Human-AI team process and state, Human-AI teaming, Intelligent virtual agents, Operational readiness, Personnel training, Team performance, Team process, Virtual teaming, Visual analytics},
pubstate = {published},
tppubtype = {inproceedings}
}
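The record above does not include code; as a rough, hypothetical sketch of the experiment design described in its abstract, the Python below shows one way the configurable HDT traits (openness, agreeableness, propensity to trust) and AI teammate attributes (reliability, transparency) could be represented and swept over a small grid of conditions. All class and field names are assumptions for illustration, not the VirTLab API, and the simulated episode is stubbed rather than driven by LLM agents.

# Illustrative sketch only; names are assumed, not the VirTLab API.
from dataclasses import dataclass
from itertools import product
import random

@dataclass
class HDTProfile:
    openness: float             # 0.0 .. 1.0
    agreeableness: float        # 0.0 .. 1.0
    propensity_to_trust: float  # 0.0 .. 1.0

@dataclass
class AITeammate:
    reliability: float   # probability an action succeeds as reported
    transparency: float  # how much of its reasoning the agent exposes

@dataclass
class HATOutcome:
    mission_completed: bool
    messages_exchanged: int
    perceived_trust: float

def run_condition(hdt: HDTProfile, ai: AITeammate, seed: int = 0) -> HATOutcome:
    # Stub for one simulated search-and-rescue episode. In the actual platform
    # this would be driven by LLM-based agents in a simulated environment;
    # here it is stubbed so the experiment sweep below runs end to end.
    rng = random.Random(seed)
    trust = 0.5 * hdt.propensity_to_trust + 0.5 * ai.reliability
    return HATOutcome(
        mission_completed=rng.random() < ai.reliability,
        messages_exchanged=rng.randint(10, 40),
        perceived_trust=round(trust, 2),
    )

# Sweep a small grid of conditions, mirroring the focused experiments above.
for trust_prop, reliability in product([0.2, 0.8], [0.5, 0.95]):
    hdt = HDTProfile(openness=0.6, agreeableness=0.7, propensity_to_trust=trust_prop)
    ai = AITeammate(reliability=reliability, transparency=0.8)
    print(trust_prop, reliability, run_condition(hdt, ai))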
Kim, Y.; Aamir, Z.; Singh, M.; Boorboor, S.; Mueller, K.; Kaufman, A. E.
Explainable XR: Understanding User Behaviors of XR Environments Using LLM-Assisted Analytics Framework Journal Article
In: IEEE Transactions on Visualization and Computer Graphics, vol. 31, no. 5, pp. 2756–2766, 2025, ISSN: 1077-2626.
Abstract | Links | BibTeX | Tags: adult, Agnostic, Article, Assistive, Cross Reality, Data Analytics, Data collection, data interpretation, Data recording, Data visualization, Extended reality, human, Language Model, Large language model, large language models, Multi-modal, Multimodal Data Collection, normal human, Personalized assistive technique, Personalized Assistive Techniques, recorder, Spatio-temporal data, therapy, user behavior, User behaviors, Virtual addresses, Virtual environments, Virtual Reality, Visual analytics, Visual languages
@article{kim_explainable_2025,
title = {Explainable XR: Understanding User Behaviors of XR Environments Using LLM-Assisted Analytics Framework},
author = {Y. Kim and Z. Aamir and M. Singh and S. Boorboor and K. Mueller and A. E. Kaufman},
url = {https://www.scopus.com/inward/record.uri?eid=2-s2.0-105003815583&doi=10.1109%2fTVCG.2025.3549537&partnerID=40&md5=1085b698db06656985f80418cb37b773},
doi = {10.1109/TVCG.2025.3549537},
issn = {1077-2626},
year = {2025},
date = {2025-01-01},
journal = {IEEE Transactions on Visualization and Computer Graphics},
volume = {31},
number = {5},
pages = {2756–2766},
abstract = {We present Explainable XR, an end-to-end framework for analyzing user behavior in diverse eXtended Reality (XR) environments by leveraging Large Language Models (LLMs) for data interpretation assistance. Existing XR user analytics frameworks face challenges in handling cross-virtuality - AR, VR, MR - transitions, multi-user collaborative application scenarios, and the complexity of multimodal data. Explainable XR addresses these challenges by providing a virtuality-agnostic solution for the collection, analysis, and visualization of immersive sessions. We propose three main components in our framework: (1) A novel user data recording schema, called User Action Descriptor (UAD), that can capture the users' multimodal actions, along with their intents and the contexts; (2) a platform-agnostic XR session recorder, and (3) a visual analytics interface that offers LLM-assisted insights tailored to the analysts' perspectives, facilitating the exploration and analysis of the recorded XR session data. We demonstrate the versatility of Explainable XR by demonstrating five use-case scenarios, in both individual and collaborative XR applications across virtualities. Our technical evaluation and user studies show that Explainable XR provides a highly usable analytics solution for understanding user actions and delivering multifaceted, actionable insights into user behaviors in immersive environments. © 1995-2012 IEEE.},
keywords = {adult, Agnostic, Article, Assistive, Cross Reality, Data Analytics, Data collection, data interpretation, Data recording, Data visualization, Extended reality, human, Language Model, Large language model, large language models, Multi-modal, Multimodal Data Collection, normal human, Personalized assistive technique, Personalized Assistive Techniques, recorder, Spatio-temporal data, therapy, user behavior, User behaviors, Virtual addresses, Virtual environments, Virtual Reality, Visual analytics, Visual languages},
pubstate = {published},
tppubtype = {article}
}
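As a purely illustrative Python sketch of the idea in the abstract above, the snippet below shows the kind of record a virtuality-agnostic session recorder might emit per user action, loosely following the User Action Descriptor (UAD) concept; the field names are assumptions, not the published schema.

# Illustrative sketch only; field names are assumed, not the published UAD schema.
from dataclasses import dataclass, field, asdict
from typing import Optional
import json

@dataclass
class UserActionDescriptor:
    session_id: str
    user_id: str
    timestamp: float                  # seconds since session start
    virtuality: str                   # "AR", "VR", or "MR"
    action: str                       # e.g. "grab", "teleport", "speak"
    intent: Optional[str] = None      # inferred or annotated user intent
    context: dict = field(default_factory=dict)     # scene / task state
    modalities: dict = field(default_factory=dict)  # gaze, hand pose, audio refs

# A session recorder would append one descriptor per user action; an
# LLM-assisted analytics layer could later summarize the resulting log.
log = [UserActionDescriptor(
    session_id="s01", user_id="u1", timestamp=12.4, virtuality="VR",
    action="grab", intent="collect_sample",
    context={"scene": "lab", "task": "assembly_step_3"},
    modalities={"gaze_target": "beaker", "hand": "right"},
)]
print(json.dumps([asdict(d) for d in log], indent=2))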
2024
Xi, M.; Perera, M.; Matthews, B.; Wang, R.; Weiley, V.; Somarathna, R.; Maqbool, H.; Chen, J.; Engelke, U.; Anderson, S.; Adcock, M.; Thomas, B. H.
Towards Immersive AI Proceedings Article
In: Eck, U.; Sra, M.; Stefanucci, J.; Sugimoto, M.; Tatzgern, M.; Williams, I. (Eds.): Proceedings of the IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), pp. 260–264, Institute of Electrical and Electronics Engineers Inc., 2024, ISBN: 979-833150691-9.
Abstract | Links | BibTeX | Tags: Artificial intelligence, Augmented Reality, Data visualization, Decision making, Heterogenous data, Immersive, Immersive analytic, Immersive analytics, Industrial research, Mixed reality, Neuro-symbolic system, Real-time, Scientific paradigm, Situated imaging, Time-interleaved, Visual analytics, Work-flows
@inproceedings{xi_towards_2024,
title = {Towards Immersive AI},
author = {M. Xi and M. Perera and B. Matthews and R. Wang and V. Weiley and R. Somarathna and H. Maqbool and J. Chen and U. Engelke and S. Anderson and M. Adcock and B. H. Thomas},
editor = {Eck, U. and Sra, M. and Stefanucci, J. and Sugimoto, M. and Tatzgern, M. and Williams, I.},
url = {https://www.scopus.com/inward/record.uri?eid=2-s2.0-85214375967&doi=10.1109%2fISMAR-Adjunct64951.2024.00062&partnerID=40&md5=fd07c97119d71418bb4365582b1d188c},
doi = {10.1109/ISMAR-Adjunct64951.2024.00062},
isbn = {979-833150691-9},
year = {2024},
date = {2024-01-01},
booktitle = {Proceedings of the IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)},
pages = {260–264},
publisher = {Institute of Electrical and Electronics Engineers Inc.},
abstract = {With every shift in scientific paradigms comes not only a new way of seeing the world, but as Kuhn argues, new tools for seeing [13]. Today, generative AI and neuro-symbolic systems show signs of changing how science functions, making it possible to synthesise complex heterogeneous data in real time, interleaved with complex and situated workflows. But the new tools are not yet fully formed. To realise the opportunities and meet the challenges posed by the growth of generative AI for science and other knowledge work requires us to look beyond improvements in algorithms. The decision-making landscape for information workers has drastically changed, and the pressing need for analysts and experts to collaborate with AI in complex, high-tempo data environments has never been more evident. To bring strategic focus to these challenges in ways that will enable social, environmental and economic benefits for all, CSIRO's Data61 (the data and digital specialist arm of the Commonwealth Scientific and Industrial Research Organisation - Australia's national science agency) has established the Immersive AI Research Cluster. The cluster allows more than 30 research scientists and engineers to focus on defining a broad range of scientific disciplines for people to work with and understand the information provided by AI, such as data visualisation, visual analytics, connecting remote people, through immersive technologies like virtual and augmented reality. This workshop paper presents the trending research directions and challenges that emerged from this research cluster, which are closely linked to the scientific domains and illustrated through use cases. © 2024 IEEE.},
keywords = {Artificial intelligence, Augmented Reality, Data visualization, Decision making, Heterogenous data, Immersive, Immersive analytic, Immersive analytics, Industrial research, Mixed reality, Neuro-symbolic system, Real-time, Scientific paradigm, Situated imaging, Time-interleaved, Visual analytics, Work-flows},
pubstate = {published},
tppubtype = {inproceedings}
}