AHCI RESEARCH GROUP
Publications
Papers published in international journals, conference and workshop proceedings, and books.
2024
Stuart, J.; Stephen, A.; Aul, K.; Bumbach, M. D.; Huffman, S.; Russo, B.; Lok, B.
Developing augmented reality filters to display visual cues on diverse skin tones Journal Article
In: Frontiers in Virtual Reality, vol. 5, 2024, ISSN: 2673-4192.
@article{stuart_developing_2024,
title = {Developing augmented reality filters to display visual cues on diverse skin tones},
author = {J. Stuart and A. Stephen and K. Aul and M. D. Bumbach and S. Huffman and B. Russo and B. Lok},
url = {https://www.scopus.com/inward/record.uri?eid=2-s2.0-85198640398&doi=10.3389%2ffrvir.2024.1363193&partnerID=40&md5=33470de917a5e9979f77fd42f25614eb},
doi = {10.3389/frvir.2024.1363193},
issn = {2673-4192},
year = {2024},
date = {2024-01-01},
journal = {Frontiers in Virtual Reality},
volume = {5},
abstract = {Introduction: Variations in skin tone can significantly alter the appearance of symptoms such as rashes or bruises. Unfortunately, previous works utilizing Augmented Reality (AR) in simulating visual symptoms have often failed to consider this critical aspect, potentially leading to inadequate training and education. This study seeks to address this gap by integrating generative artificial intelligence (AI) into the AR filter design process. Methods: We conducted a 2 × 5 within-subjects study with second-year nursing students (N = 117) from the University of Florida. The study manipulated two factors: symptom generation style and skin tone. Symptom generation style was manipulated using a filter based on a real symptom image or a filter based on a computer-generated symptom image. Skin tone variations were created by applying AR filters to computer-generated images of faces with five skin tones ranging from light to dark. To control for factors like lighting or 3D tracking, 101 pre-generated images were created for each condition, representing a range of filter transparency levels (0–100). Participants used visual analog scales on a computer screen to adjust the symptom transparency in the images until they observed image changes and distinct symptom patterns. Participants also rated the realism of each condition and provided feedback on how the symptom style and skin tone impacted their perceptions. Results: Students rated the symptoms displayed by the computer-generated AR filters as marginally more realistic than those displayed by the real image AR filters. However, students identified symptoms earlier with the real-image filters. Additionally, SET-M and Theory of Planned Behavior questions indicate that the activity increased students’ feelings of confidence and self-efficacy. 
Finally, we found that similar to the real world, where symptoms on dark skin tones are identified at later stages of development, students identified symptoms at later stages as skin tone darkened regardless of cue type. Conclusion: This work implemented a novel approach to develop AR filters that display time-based visual cues on diverse skin tones. Additionally, this work provides evidence-based recommendations on how and when generative AI-based AR filters can be effectively used in healthcare education. Copyright © 2024 Stuart, Stephen, Aul, Bumbach, Huffman, Russo and Lok.},
keywords = {Augmented Reality, fidelity, Healthcare, realism, simulation, symptoms, visual cue training},
pubstate = {published},
tppubtype = {article}
}
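The first abstract describes pre-generating 101 images per condition, one for each filter transparency level from 0 to 100, so participants could slide through them until the symptom became visible. A minimal sketch of that transparency step as plain alpha blending (function name, array shapes, and pixel values are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def blend_symptom(base: np.ndarray, symptom: np.ndarray, transparency: int) -> np.ndarray:
    """Alpha-blend a symptom overlay onto a base face image.

    transparency runs 0 (symptom fully hidden) to 100 (fully visible),
    mirroring the 0-100 slider range described in the abstract.
    """
    alpha = transparency / 100.0
    return ((1 - alpha) * base + alpha * symptom).astype(base.dtype)

# Pre-generate one image per transparency level: 101 images per condition.
base = np.full((4, 4, 3), 200, dtype=np.uint8)    # stand-in for a face image
symptom = np.full((4, 4, 3), 80, dtype=np.uint8)  # stand-in for a symptom layer
frames = [blend_symptom(base, symptom, t) for t in range(101)]
```

At transparency 0 the frame is the unmodified base image and at 100 it is the full symptom layer, which matches the endpoints of the scale the participants adjusted.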
Gaudi, T.; Kapralos, B.; Quevedo, A.
Structural and Functional Fidelity of Virtual Humans in Immersive Virtual Learning Environments Proceedings Article
In: IEEE Gaming, Entertainment, and Media Conference (GEM), Institute of Electrical and Electronics Engineers Inc., 2024, ISBN: 979-8-3503-7453-7.
@inproceedings{gaudi_structural_2024,
title = {Structural and Functional Fidelity of Virtual Humans in Immersive Virtual Learning Environments},
author = {T. Gaudi and B. Kapralos and A. Quevedo},
url = {https://www.scopus.com/inward/record.uri?eid=2-s2.0-85199517136&doi=10.1109%2fGEM61861.2024.10585535&partnerID=40&md5=bf271019e077b5e464bcd62b1b28312b},
doi = {10.1109/GEM61861.2024.10585535},
isbn = {979-8-3503-7453-7},
year = {2024},
date = {2024-01-01},
booktitle = {IEEE Gaming, Entertainment, and Media Conference (GEM)},
publisher = {Institute of Electrical and Electronics Engineers Inc.},
abstract = {Central to many immersive virtual learning environments (iVLEs) are virtual humans, or characters that are digital representations, which can serve as virtual instructors to facilitate learning. Current technology is allowing the production of photo-realistic (high fidelity/highly realistic) avatars, whether using traditional approaches relying on 3D modeling, or modern tools leveraging generative AI and virtual character creation tools. However, fidelity (i.e., level of realism) is complex as it can be analyzed from various points of view referring to its structure, function, interactivity, and behavior among others. Given its relevance, fidelity can influence various aspects of iVLEs including engagement and ultimately learning outcomes. In this work-in-progress paper, we propose a study that will examine the effect of structural and functional fidelity of a virtual human assistant on engagement within a virtual simulation designed to teach the cognitive aspects (e.g., the steps of a procedure) of the heart auscultation procedure. © 2024 IEEE.},
keywords = {3D modeling, Computer aided instruction, Digital representations, E-Learning, Engagement, fidelity, Immersive, Immersive virtual learning environment, Serious game, Serious games, Three dimensional computer graphics, Virtual character, virtual human, Virtual humans, Virtual instructors, Virtual learning environments, Virtual Reality, virtual simulation, Virtual simulations},
pubstate = {published},
tppubtype = {inproceedings}
}