AHCI RESEARCH GROUP
Publications
Papers published in international journals, conference and workshop proceedings, and books.
2025
Masasi De Oliveira, E. A.; Sousa, R. T.; Bastos, A. A.; Martins De Freitas Cintra, L.; Filho, A. R. G.
Immersive Virtual Museums with Spatially-Aware Retrieval-Augmented Generation Proceedings Article
In: Proceedings of the ACM International Conference on Interactive Media Experiences (IMX), pp. 437–440, Association for Computing Machinery, 2025, ISBN: 979-8-4007-1391-0.
Tags: Association reactions, Behavioral Research, Generation systems, Geographics, Human computer interaction, Human engineering, Immersive, Information Retrieval, Interactive computer graphics, Language Model, Large language model, large language models, Museums, Retrieval-Augmented Generation, Search engines, Spatially aware, User interfaces, Virtual environments, Virtual museum, Virtual museum., Virtual Reality, Visual Attention, Visual languages
@inproceedings{masasi_de_oliveira_immersive_2025,
title = {Immersive Virtual Museums with Spatially-Aware Retrieval-Augmented Generation},
author = {E. A. Masasi De Oliveira and R. T. Sousa and A. A. Bastos and L. Martins De Freitas Cintra and A. R. G. Filho},
url = {https://www.scopus.com/inward/record.uri?eid=2-s2.0-105007979183&doi=10.1145%2f3706370.3731643&partnerID=40&md5=db10b41217dd8a0b0705c3fb4a615666},
doi = {10.1145/3706370.3731643},
isbn = {979-8-4007-1391-0},
year = {2025},
date = {2025-01-01},
booktitle = {IMX - Proc. ACM Int. Conf. Interact. Media Experiences},
pages = {437–440},
publisher = {Association for Computing Machinery, Inc},
abstract = {Virtual Reality has significantly expanded possibilities for immersive museum experiences, overcoming traditional constraints such as space, preservation, and geographic limitations. However, existing virtual museum platforms typically lack dynamic, personalized, and contextually accurate interactions. To address this, we propose Spatially-Aware Retrieval-Augmented Generation (SA-RAG), an innovative framework integrating visual attention tracking with Retrieval-Augmented Generation systems and advanced Large Language Models. By capturing users' visual attention in real time, SA-RAG dynamically retrieves contextually relevant data, enhancing the accuracy, personalization, and depth of user interactions within immersive virtual environments. The system's effectiveness is initially demonstrated through our preliminary tests within a realistic VR museum implemented using Unreal Engine. Although promising, comprehensive human evaluations involving broader user groups are planned for future studies to rigorously validate SA-RAG's effectiveness, educational enrichment potential, and accessibility improvements in virtual museums. The framework also presents opportunities for broader applications in immersive educational and storytelling domains. © 2025 Copyright held by the owner/author(s).},
keywords = {Association reactions, Behavioral Research, Generation systems, Geographics, Human computer interaction, Human engineering, Immersive, Information Retrieval, Interactive computer graphics, Language Model, Large language model, large language models, Museums, Retrieval-Augmented Generation, Search engines, Spatially aware, User interfaces, Virtual environments, Virtual museum, Virtual museum., Virtual Reality, Visual Attention, Visual languages},
pubstate = {published},
tppubtype = {inproceedings}
}
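The SA-RAG pipeline described in the abstract above (real-time visual attention tracking, retrieval of contextually relevant exhibit data, LLM-based response generation) can be pictured with a minimal sketch. The Python snippet below is an illustration only, not the authors' implementation: it assumes a simple spatial-proximity retrieval step as a stand-in for the gaze-driven retrieval, and all names (GazeSample, Exhibit, retrieve_context, build_prompt) and the example exhibit data are hypothetical.

# Hypothetical sketch of a spatially-aware RAG loop: a gaze sample selects
# nearby exhibit descriptions, which are assembled into an LLM prompt.
# All class/function names and data below are illustrative, not from the paper.
from dataclasses import dataclass
from math import dist

@dataclass
class GazeSample:
    position: tuple[float, float, float]   # world-space point the visitor is looking at
    dwell_seconds: float                    # how long attention rested there

@dataclass
class Exhibit:
    name: str
    position: tuple[float, float, float]
    description: str                        # text chunk used as retrieval context

def retrieve_context(gaze: GazeSample, exhibits: list[Exhibit],
                     radius: float = 2.0, top_k: int = 3) -> list[Exhibit]:
    """Rank exhibits near the gazed-at point; the closest ones become LLM context."""
    nearby = [e for e in exhibits if dist(e.position, gaze.position) <= radius]
    return sorted(nearby, key=lambda e: dist(e.position, gaze.position))[:top_k]

def build_prompt(question: str, context: list[Exhibit]) -> str:
    """Assemble a retrieval-augmented prompt for whichever LLM backend is used."""
    ctx = "\n".join(f"- {e.name}: {e.description}" for e in context)
    return (f"The visitor is currently looking at these exhibits:\n{ctx}\n\n"
            f"Visitor question: {question}\nAnswer using only the context above.")

# Usage: one gaze sample plus a visitor question yields a grounded prompt.
exhibits = [Exhibit("Bronze mask", (1.0, 1.5, 0.0), "Cast bronze, 5th century BCE."),
            Exhibit("Amphora", (4.0, 1.2, 2.0), "Storage vessel for wine and oil.")]
gaze = GazeSample(position=(1.1, 1.4, 0.2), dwell_seconds=3.5)
print(build_prompt("What am I looking at?", retrieve_context(gaze, exhibits)))

Here proximity ranking merely stands in for whatever attention-weighted retrieval the paper actually uses; the point is the flow from gaze signal to retrieved context to prompt.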
2024
Shen, J.; Yin, M.; Wang, W.; Hua, M.
Dwells in museum: The restorative potential of augmented reality Journal Article
In: Telematics and Informatics Reports, vol. 14, 2024, ISSN: 2772-5030.
Tags: Attention restoration, Augmented Reality, Museums, Restorative environment, Stress reduction
@article{shen_dwells_2024,
title = {Dwells in museum: The restorative potential of augmented reality},
author = {J. Shen and M. Yin and W. Wang and M. Hua},
url = {https://www.scopus.com/inward/record.uri?eid=2-s2.0-85191314446&doi=10.1016%2fj.teler.2024.100136&partnerID=40&md5=18c6c77711dde30094e1afbd67163d06},
doi = {10.1016/j.teler.2024.100136},
issn = {2772-5030},
year = {2024},
date = {2024-01-01},
journal = {Telematics and Informatics Reports},
volume = {14},
abstract = {Augmented Reality (AR) is increasingly recognized as a transformative tool for creating restorative environments within museums. It has the potential to provide psychological benefits for visitors, including attention restoration, stress reduction, and anxiety alleviation. This study explores how AR can foster these benefits within museum spaces. By adopting AR technology, museums can go beyond their traditional roles of knowledge dissemination. The immersive, adaptive, and interactive features of AR can enhance the museum experience, transforming it into an innovative therapeutic space. By combining real exhibits with virtual elements, AR can restore visitors’ psychological energy within museum settings. This integration of digital innovation into restorative contexts surpasses the traditional functions of visual service. Through empirical investigation of multiple dimensions of restorative environments, AR museum experiences offer comprehensive attention restoration. In this study, a survey was conducted with 279 participants to assess the impact of AR museum experiences on visitors’ psychology. The results revealed that such experiences contribute to heightened attention restoration levels, stress reduction, and anxiety relief. With the latest advancements in generative artificial intelligence, AR technology is empowered to integrate within museums. This integration will merge individuals with customized technology, expanding human perceptual experiences and highlighting AR's significant influence within the museum environment. © 2024},
keywords = {Attention restoration, Augmented Reality, Museums, Restorative environment, Stress reduction},
pubstate = {published},
tppubtype = {article}
}