AHCI RESEARCH GROUP
Publications
Papers published in international journals,
proceedings of conferences, workshops and books.
OUR RESEARCH
Scientific Publications
How to
You can use the tag cloud to select only the papers dealing with specific research topics.
You can expand the Abstract, Links and BibTeX record for each paper.
2025
Lv, J.; Slowik, A.; Rani, S.; Kim, B. -G.; Chen, C. -M.; Kumari, S.; Li, K.; Lyu, X.; Jiang, H.
Multimodal Metaverse Healthcare: A Collaborative Representation and Adaptive Fusion Approach for Generative Artificial-Intelligence-Driven Diagnosis Journal Article
In: Research, vol. 8, 2025, ISSN: 2096-5168.
Abstract | Links | BibTeX | Tags: Adaptive fusion, Collaborative representations, Diagnosis, Electronic health record, Generative adversarial networks, Health care application, Healthcare environments, Immersive, Learning frameworks, Metaverses, Multi-modal, Multi-modal learning, Performance
@article{lv_multimodal_2025,
title = {Multimodal Metaverse Healthcare: A Collaborative Representation and Adaptive Fusion Approach for Generative Artificial-Intelligence-Driven Diagnosis},
author = {J. Lv and A. Slowik and S. Rani and B. -G. Kim and C. -M. Chen and S. Kumari and K. Li and X. Lyu and H. Jiang},
url = {https://www.scopus.com/inward/record.uri?eid=2-s2.0-86000613924&doi=10.34133%2fresearch.0616&partnerID=40&md5=fdc8ae3b29db905105dada9a5657b54b},
doi = {10.34133/research.0616},
issn = {2096-5168},
year = {2025},
date = {2025-01-01},
journal = {Research},
volume = {8},
abstract = {The metaverse enables immersive virtual healthcare environments, presenting opportunities for enhanced care delivery. A key challenge lies in effectively combining multimodal healthcare data and generative artificial intelligence abilities within metaverse-based healthcare applications, which is a problem that needs to be addressed. This paper proposes a novel multimodal learning framework for metaverse healthcare, MMLMH, based on collaborative intra- and intersample representation and adaptive fusion. Our framework introduces a collaborative representation learning approach that captures shared and modality-specific features across text, audio, and visual health data. By combining modality-specific and shared encoders with carefully formulated intrasample and intersample collaboration mechanisms, MMLMH achieves superior feature representation for complex health assessments. The framework’s adaptive fusion approach, utilizing attention mechanisms and gated neural networks, demonstrates robust performance across varying noise levels and data quality conditions. Experiments on metaverse healthcare datasets demonstrate MMLMH’s superior performance over baseline methods across multiple evaluation metrics. Longitudinal studies and visualization further illustrate MMLMH’s adaptability to evolving virtual environments and balanced performance across diagnostic accuracy, patient–system interaction efficacy, and data integration complexity. The proposed framework has a unique advantage in that a similar level of performance is maintained across various patient populations and virtual avatars, which could lead to greater personalization of healthcare experiences in the metaverse. MMLMH’s successful functioning in such complicated circumstances suggests that it can combine and process information streams from several sources. They can be successfully utilized in next-generation healthcare delivery through virtual reality. © 2025 Jianhui Lv et al.},
keywords = {Adaptive fusion, Collaborative representations, Diagnosis, Electronic health record, Generative adversarial networks, Health care application, Healthcare environments, Immersive, Learning frameworks, Metaverses, Multi-modal, Multi-modal learning, Performance},
pubstate = {published},
tppubtype = {article}
}
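The adaptive fusion step described in the abstract above (attention weights over modality-specific embeddings combined through a learned gate) can be sketched roughly as follows. This is an illustrative reconstruction, not the published MMLMH code: the use of PyTorch, the layer sizes, and the module names are assumptions.

# Illustrative sketch of attention-plus-gating fusion over modality embeddings;
# not the authors' implementation. Dimensions and names are assumed.
import torch
import torch.nn as nn

class GatedAttentionFusion(nn.Module):
    def __init__(self, dim=256, n_modalities=3):
        super().__init__()
        self.score = nn.Linear(dim, 1)                     # attention score per modality
        self.gate = nn.Sequential(nn.Linear(dim * n_modalities, dim), nn.Sigmoid())
        self.proj = nn.Linear(dim, dim)

    def forward(self, modality_embeddings):
        # modality_embeddings: list of (batch, dim) tensors, e.g. [text, audio, visual]
        stacked = torch.stack(modality_embeddings, dim=1)         # (batch, M, dim)
        weights = torch.softmax(self.score(stacked), dim=1)       # (batch, M, 1)
        attended = (weights * stacked).sum(dim=1)                 # attention-weighted sum
        gate = self.gate(torch.cat(modality_embeddings, dim=-1))  # gated neural network
        return gate * self.proj(attended)                         # fused representation

fusion = GatedAttentionFusion(dim=256)
text, audio, visual = (torch.randn(4, 256) for _ in range(3))
fused = fusion([text, audio, visual])                             # shape: (4, 256)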
Logothetis, I.; Diakogiannis, K.; Vidakis, N.
Interactive Learning Through Conversational Avatars and Immersive VR: Enhancing Diabetes Education and Self-Management Proceedings Article
In: Fang, X. (Ed.): Lect. Notes Comput. Sci., pp. 415–429, Springer Science and Business Media Deutschland GmbH, 2025, ISSN: 0302-9743; ISBN: 978-3-031-92577-1.
Abstract | Links | BibTeX | Tags: Artificial intelligence, Chronic disease, Computer aided instruction, Diabetes Education, Diagnosis, E-Learning, Education management, Engineering education, Gamification, Immersive virtual reality, Interactive computer graphics, Interactive learning, Large population, Learning systems, NUI, Self management, Serious game, Serious games, simulation, Virtual Reality
@inproceedings{logothetis_interactive_2025,
title = {Interactive Learning Through Conversational Avatars and Immersive VR: Enhancing Diabetes Education and Self-Management},
author = {I. Logothetis and K. Diakogiannis and N. Vidakis},
editor = {Fang X.},
url = {https://www.scopus.com/inward/record.uri?eid=2-s2.0-105008266480&doi=10.1007%2f978-3-031-92578-8_27&partnerID=40&md5=451274dfa3ef0b3f1b39c7d5a665ee3b},
doi = {10.1007/978-3-031-92578-8_27},
issn = {0302-9743},
isbn = {978-3-031-92577-1},
year = {2025},
date = {2025-01-01},
booktitle = {Lect. Notes Comput. Sci.},
volume = {15816 LNCS},
pages = {415–429},
publisher = {Springer Science and Business Media Deutschland GmbH},
abstract = {Diabetes is a chronic disease affecting a large population of the world. Education and self-management of diabetes are crucial. Technologies such as Virtual Reality (VR) have presented promising results in healthcare education, while studies suggest that Artificial Intelligence (AI) can help in learning by further engaging the learner. This study aims to educate users on the entire routine of managing diabetes. The serious game utilizes VR for realistic interaction with diabetes tools and generative AI through a conversational avatar that acts as an assistant instructor. In this way, it allows users to practice diagnostic and therapeutic interventions in a controlled virtual environment, helping to build their understanding and confidence in diabetes management. To measure the effects of the proposed serious game, presence and perceived agency were measured. Preliminary results indicate that this setup aids in the engagement and immersion of learners, while the avatar can provide helpful information during gameplay. © The Author(s), under exclusive license to Springer Nature Switzerland AG 2025.},
keywords = {Artificial intelligence, Chronic disease, Computer aided instruction, Diabetes Education, Diagnosis, E-Learning, Education management, Engineering education, Gamification, Immersive virtual reality, Interactive computer graphics, Interactive learning, Large population, Learning systems, NUI, Self management, Serious game, Serious games, simulation, Virtual Reality},
pubstate = {published},
tppubtype = {inproceedings}
}
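As a rough illustration of the conversational assistant instructor described in the abstract above, the sketch below shows how a game loop might keep a dialogue history and hand player questions to a generative model. The system prompt, the session structure, and the generate_reply() stub are hypothetical; the paper does not publish its implementation.

# Hypothetical sketch of a conversational avatar acting as an assistant instructor
# in a diabetes self-management serious game; generate_reply() stands in for
# whatever generative-AI service the real system calls.
from dataclasses import dataclass, field

SYSTEM_PROMPT = (
    "You are a virtual instructor inside a diabetes self-management training game. "
    "Guide the learner through glucose checks and insulin routines, answer briefly, "
    "and remind them this is practice, not medical advice."
)

def generate_reply(messages):
    # Placeholder: a real build would send `messages` to a hosted LLM and return its text.
    return "First wash your hands, then insert a fresh test strip into the glucose meter."

@dataclass
class AvatarSession:
    history: list = field(default_factory=lambda: [{"role": "system", "content": SYSTEM_PROMPT}])

    def ask(self, player_utterance: str) -> str:
        self.history.append({"role": "user", "content": player_utterance})
        reply = generate_reply(self.history)      # generative AI call (stubbed here)
        self.history.append({"role": "assistant", "content": reply})
        return reply

session = AvatarSession()
print(session.ask("What should I do before checking my blood sugar?"))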
2024
Takata, T.; Yamada, R.; Oliveira Nzinga Rene, A.; Xu, K.; Fujimoto, M.
Development of a Virtual Patient Model for Kampo Medical Interview: New Approach for Enhancing Empathy and Understanding of Kampo Medicine Pathological Concepts Proceedings Article
In: Jt. Int. Conf. Soft Comput. Intell. Syst. Int. Symp. Adv. Intell. Syst., SCIS ISIS, Institute of Electrical and Electronics Engineers Inc., 2024, ISBN: 979-8-3503-7333-2.
Abstract | Links | BibTeX | Tags: Artificial intelligence, Clinical practices, Clinical training, Complementary and alternative medicines, Covid-19, Diagnosis, Educational approach, Empathy, Kampo medical interview, Medical education, Medical student, Medical students, New approaches, Virtual environments, Virtual patient, Virtual patient models, Virtual patients, Virtual Reality
@inproceedings{takata_development_2024,
title = {Development of a Virtual Patient Model for Kampo Medical Interview: New Approach for Enhancing Empathy and Understanding of Kampo Medicine Pathological Concepts},
author = {T. Takata and R. Yamada and A. Oliveira Nzinga Rene and K. Xu and M. Fujimoto},
url = {https://www.scopus.com/inward/record.uri?eid=2-s2.0-85214666311&doi=10.1109%2fSCISISIS61014.2024.10759962&partnerID=40&md5=2e149e0fe211f586049914e571c6e2fa},
doi = {10.1109/SCISISIS61014.2024.10759962},
isbn = {979-8-3503-7333-2},
year = {2024},
date = {2024-01-01},
booktitle = {Jt. Int. Conf. Soft Comput. Intell. Syst. Int. Symp. Adv. Intell. Syst., SCIS ISIS},
publisher = {Institute of Electrical and Electronics Engineers Inc.},
abstract = {Global interest in complementary and alternative medicine has increased in recent years, with Kampo medicine in Japan gaining greater trust and use. Detailed patient interviews are essential in Kampo medicine, as the physician's empathy is critical to diagnostic precision. Typically, medical students develop empathy and deepen their understanding of Kampo's pathological concepts through clinical practice. However, the COVID-19 pandemic has imposed significant restrictions on clinical training. To address this challenge, we propose a novel educational approach to enhance empathy and understanding of Kampo medicine by developing a virtual patient application. This application leverages generative artificial intelligence to simulate realistic patient interactions, enabling students to practice Kampo medical interviews in a safe, controlled environment. The AI-generated conversations are designed to reflect the emotional nuances of real-life dialogue, with the virtual patients' facial expressions synchronized to these emotions, thus enhancing the realism of the training. The suggested method allows repeated practice at any time and fosters the development of essential diagnostic and empathetic skills. While promising, challenges remain in improving these simulations' accuracy, and further refinements are still under consideration. © 2024 IEEE.},
keywords = {Artificial intelligence, Clinical practices, Clinical training, Complementary and alternative medicines, Covid-19, Diagnosis, Educational approach, Empathy, Kampo medical interview, Medical education, Medical student, Medical students, New approaches, Virtual environments, Virtual patient, Virtual patient models, Virtual patients, Virtual Reality},
pubstate = {published},
tppubtype = {inproceedings}
}
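The abstract above notes that the virtual patient's facial expressions are synchronized to the emotions of the AI-generated dialogue. A minimal sketch of that mapping is given below; the emotion labels and blendshape weights are invented for illustration and are not taken from the paper.

# Illustrative only: map an emotion label attached to a generated patient line
# onto facial blendshape weights for the avatar. Labels and weights are assumptions.
EXPRESSION_PRESETS = {
    "neutral":    {"brow_raise": 0.0, "mouth_smile": 0.1, "eye_openness": 1.0},
    "discomfort": {"brow_raise": 0.3, "mouth_smile": 0.0, "eye_openness": 0.7},
    "anxious":    {"brow_raise": 0.6, "mouth_smile": 0.0, "eye_openness": 0.9},
    "relieved":   {"brow_raise": 0.1, "mouth_smile": 0.5, "eye_openness": 1.0},
}

def apply_patient_line(line: str, emotion: str) -> dict:
    """Return blendshape weights to drive the avatar's face while the line is spoken."""
    weights = EXPRESSION_PRESETS.get(emotion, EXPRESSION_PRESETS["neutral"])
    print(f"[{emotion}] patient: {line}")
    return weights

apply_patient_line("The chill starts in my lower back every evening.", "discomfort")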
Samson, J.; Lameras, P.; Taylor, N.; Kneafsey, R.
Fostering a Co-creation Process for the Development of an Extended Reality Healthcare Education Resource Proceedings Article
In: Auer, M. E.; Tsiatsos, T. (Eds.): Lect. Notes Networks Syst., pp. 205–212, Springer Science and Business Media Deutschland GmbH, 2024, ISSN: 2367-3370; ISBN: 978-3-031-56074-3.
Abstract | Links | BibTeX | Tags: Artificial intelligence, Co-creation, Creation process, Diagnosis, Education computing, Education resource, Extended reality, Health care education, Hospitals, Immersive, Inter professionals, Interprofessional Healthcare Education, Software products, Students, Virtual patients
@inproceedings{samson_fostering_2024,
title = {Fostering a Co-creation Process for the Development of an Extended Reality Healthcare Education Resource},
author = {J. Samson and P. Lameras and N. Taylor and R. Kneafsey},
editor = {Auer M.E. and Tsiatsos T.},
url = {https://www.scopus.com/inward/record.uri?eid=2-s2.0-85189759614&doi=10.1007%2f978-3-031-56075-0_20&partnerID=40&md5=6ae832882a2e224094c1beb81c925333},
doi = {10.1007/978-3-031-56075-0_20},
issn = {2367-3370},
isbn = {978-3-031-56074-3},
year = {2024},
date = {2024-01-01},
booktitle = {Lect. Notes Networks Syst.},
volume = {937 LNNS},
pages = {205–212},
publisher = {Springer Science and Business Media Deutschland GmbH},
abstract = {The aim of this research is to create an immersive healthcare education resource using an extended reality (XR) platform. This platform leverages an existing software product, incorporating virtual patients with conversational capabilities driven by artificial intelligence (AI). The initial stage produced an early prototype focused on assessing an elderly virtual patient experiencing frailty. This scenario encompasses the hospital admission to post-discharge care at home, involving various healthcare professionals such as paramedics, emergency clinicians, diagnostic radiographers, geriatricians, physiotherapists, occupational therapists, nurses, operating department practitioners, dietitians, and social workers. The plan moving forward is to refine and expand this prototype through a co-creation with diverse stakeholders. The refinement process will include the introduction of updated scripts into the standard AI model. Furthermore, these scripts will be tested against a new hybrid model that combines generative AI. Ultimately, this resource will be co-designed to create a learning activity tailored for occupational therapy and physiotherapy students. This activity will undergo testing with a cohort of students, and the outcomes of this research are expected to inform the future development of interprofessional virtual simulated placements (VSPs). These placements will complement traditional clinical learning experiences, offering students an immersive environment to enhance their skills and knowledge in the healthcare field. © The Author(s), under exclusive license to Springer Nature Switzerland AG 2024.},
keywords = {Artificial intelligence, Co-creation, Creation process, Diagnosis, Education computing, Education resource, Extended reality, Health care education, Hospitals, Immersive, Inter professionals, Interprofessional Healthcare Education, Software products, Students, Virtual patients},
pubstate = {published},
tppubtype = {inproceedings}
}
Saddik, A. E.; Ghaboura, S.
The Integration of ChatGPT With the Metaverse for Medical Consultations Journal Article
In: IEEE Consumer Electronics Magazine, vol. 13, no. 3, pp. 6–15, 2024, ISSN: 2162-2248.
Abstract | Links | BibTeX | Tags: Chatbots, Computational Linguistics, Cutting edges, Diagnosis, Health care, Healthcare delivery, Healthcare environments, Human like, Immersive, Language Model, Medical diagnostic imaging, Medical Imaging, Medical services, Metaverses
@article{saddik_integration_2024,
title = {The Integration of ChatGPT With the Metaverse for Medical Consultations},
author = {A. E. Saddik and S. Ghaboura},
url = {https://www.scopus.com/inward/record.uri?eid=2-s2.0-85174844304&doi=10.1109%2fMCE.2023.3324978&partnerID=40&md5=ce0da4988d06258a1bc695e2d4ac4677},
doi = {10.1109/MCE.2023.3324978},
issn = {2162-2248},
year = {2024},
date = {2024-01-01},
journal = {IEEE Consumer Electronics Magazine},
volume = {13},
number = {3},
pages = {6–15},
abstract = {Recent years witnessed a promising synergy between healthcare and the Metaverse leading to the development of virtual healthcare environments. This convergence offers accessible and immersive healthcare experiences and holds the potential for transforming the delivery of medical services and enhancing patient outcomes. However, the reliance on specialist presence in the metaverse for medical support remains a challenge. On the other hand, the newly launched large language model chatbot, the ChatGPT of OpenAI, has emerged as a game-changer, providing human-like responses and facilitating interactive conversations. By integrating this cutting-edge language model with the Metaverse for medical purposes, we can potentially revolutionize healthcare delivery, enhance access to care, and increase patient engagement. This study proposes a new medical Metaverse model utilizing GPT-4 as a content creator, highlighting its potential, addressing challenges and limitations, and exploring various application fields. We conclude by outlining our ongoing efforts to transform this concept into a practical reality. © 2012 IEEE.},
keywords = {Chatbots, Computational Linguistics, Cutting edges, Diagnosis, Health care, Healthcare delivery, Healthcare environments, Human like, Immersive, Language Model, Medical diagnostic imaging, Medical Imaging, Medical services, Metaverses},
pubstate = {published},
tppubtype = {article}
}
Diaz, T. G.; Lee, X. Y.; Zhuge, H.; Vidyaratne, L.; Sin, G.; Watanabe, T.; Farahat, A.; Gupta, C.
AI+AR based Framework for Guided Visual Equipment Diagnosis Proceedings Article
In: Kulkarni, C. S.; Orchard, M. E. (Eds.): Proc. Annu. Conf. Progn. Health Manag. Soc., PHM, Prognostics and Health Management Society, 2024, ISSN: 2325-0178; ISBN: 978-1-936263-05-9.
Abstract | Links | BibTeX | Tags: Augmented Reality, Automated solutions, Customer loyalty, Customer satisfaction, Customers' satisfaction, Diagnosis, Equipment diagnosis, Failure Diagnosis, Failure repairs, High quality, Knowledge graphs, Language Model, Quality of Service, Query languages, Sales, Support services
@inproceedings{diaz_aiar_2024,
title = {AI+AR based Framework for Guided Visual Equipment Diagnosis},
author = {T. G. Diaz and X. Y. Lee and H. Zhuge and L. Vidyaratne and G. Sin and T. Watanabe and A. Farahat and C. Gupta},
editor = {Kulkarni C.S. and Orchard M.E.},
url = {https://www.scopus.com/inward/record.uri?eid=2-s2.0-85210227167&doi=10.36001%2fphmconf.2024.v16i1.3909&partnerID=40&md5=897ac8045a48e2e80aa7522870c2004f},
doi = {10.36001/phmconf.2024.v16i1.3909},
issn = {2325-0178},
isbn = {978-1-936263-05-9},
year = {2024},
date = {2024-01-01},
booktitle = {Proc. Annu. Conf. Progn. Health Manag. Soc., PHM},
volume = {16},
publisher = {Prognostics and Health Management Society},
abstract = {Automated solutions for effective support services, such as failure diagnosis and repair, are crucial to keep customer satisfaction and loyalty. However, providing consistent, high quality, and timely support is a difficult task. In practice, customer support usually requires technicians to perform onsite diagnosis, but service quality is often adversely affected by limited expert technicians, high turnover, and minimal automated tools. To address these challenges, we present a novel solution framework for aiding technicians in performing visual equipment diagnosis. We envision a workflow where the technician reports a failure and prompts the system to automatically generate a diagnostic plan that includes parts, areas of interest, and necessary tasks. The plan is used to guide the technician with augmented reality (AR), while a perception module analyzes and tracks the technician’s actions to recommend next steps. Our framework consists of three components: planning, tracking, and guiding. The planning component automates the creation of a diagnostic plan by querying a knowledge graph (KG). We propose to leverage Large Language Models (LLMs) for the construction of the KG to accelerate the extraction process of parts, tasks, and relations from manuals. The tracking component enhances 3D detections by using perception sensors with a 2D nested object detection model. Finally, the guiding component reduces process complexity for technicians by combining 2D models and AR interactions. To validate the framework, we performed multiple studies to: 1) determine an effective prompt method for the LLM to construct the KG; 2) demonstrate benefits of our 2D nested object model combined with AR model. © 2024 Prognostics and Health Management Society. All rights reserved.},
keywords = {Augmented Reality, Automated solutions, Customer loyalty, Customer satisfaction, Customers' satisfaction, Diagnosis, Equipment diagnosis, Failure Diagnosis, Failure repairs, High quality, Knowledge graphs, Language Model, Quality of Service, Query languages, Sales, Support services},
pubstate = {published},
tppubtype = {inproceedings}
}
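The planning component described in the abstract above builds a knowledge graph by having an LLM extract parts, tasks, and relations from service manuals. The sketch below shows one plausible shape of that extraction step; the prompt wording, the triple format, and the call_llm() stub are assumptions rather than the authors' implementation.

# Hedged sketch: prompt an LLM to pull (part, relation, task) triples out of
# manual text and store them as a small knowledge graph for diagnostic planning.
# call_llm() is a stand-in that returns a canned response so the demo runs offline.
import json
from collections import defaultdict

EXTRACTION_PROMPT = (
    "From the manual excerpt below, return JSON triples "
    '[{"part": ..., "relation": ..., "task": ...}] linking each part to the '
    "inspection or repair task that applies to it.\n\nExcerpt:\n"
)

def call_llm(prompt: str) -> str:
    # Placeholder for the real LLM call used to accelerate KG construction.
    return json.dumps([
        {"part": "feed roller", "relation": "inspected_by", "task": "check for wear"},
        {"part": "feed roller", "relation": "repaired_by", "task": "replace roller kit"},
    ])

def build_graph(manual_excerpt: str) -> dict:
    triples = json.loads(call_llm(EXTRACTION_PROMPT + manual_excerpt))
    graph = defaultdict(list)
    for t in triples:
        graph[t["part"]].append((t["relation"], t["task"]))   # part -> related tasks
    return graph

kg = build_graph("If paper misfeeds occur, inspect the feed roller and replace it if worn.")
print(kg["feed roller"])   # tasks a diagnostic plan would surface for this part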