AHCI RESEARCH GROUP
Publications
Papers published in international journals,
proceedings of conferences, workshops and books.
OUR RESEARCH
Scientific Publications
How to
Here you can find the complete list of our publications.
You can use the tag cloud to select only the papers dealing with specific research topics.
You can expand the Abstract, Links and BibTeX record for each paper.
2025
Stroinski, M.; Kwarciak, K.; Kowalewski, M.; Hemmerling, D.; Frier, W.; Georgiou, O.
Text-to-Haptics: Enhancing Multisensory Storytelling through Emotionally Congruent Midair Haptics Journal Article
In: Advanced Intelligent Systems, vol. 7, no. 4, 2025, ISSN: 2640-4567.
Abstract | Links | BibTeX | Tags: Audiovisual, Augmented Reality, Extended reality, Haptic interfaces, Haptics, Haptics interfaces, HMI, hybrid AI, Hybrid artificial intelligences, Metaverses, Mixed reality, Multisensory, Natural Language Processing, perception, Sentiment Analysis, Sound speech, Special issue and section, Speech enhancement, Virtual environments, Visual elements
@article{stroinski_text-to-haptics_2025,
title = {Text-to-Haptics: Enhancing Multisensory Storytelling through Emotionally Congruent Midair Haptics},
author = {M. Stroinski and K. Kwarciak and M. Kowalewski and D. Hemmerling and W. Frier and O. Georgiou},
url = {https://www.scopus.com/inward/record.uri?eid=2-s2.0-105002269591&doi=10.1002%2faisy.202400758&partnerID=40&md5=a4c8ce7a01c9bc90d9805a81d34df982},
doi = {10.1002/aisy.202400758},
issn = {2640-4567},
year = {2025},
date = {2025-01-01},
journal = {Advanced Intelligent Systems},
volume = {7},
number = {4},
abstract = {In multisensory storytelling, the integration of touch, sound, speech, and visual elements plays a crucial role in enhancing the narrative immersion and audience engagement. In light of this, this article presents a scalable and intelligent hybrid artificial intelligence (AI) method that uses emotional text analysis for deciding when and what midair haptics to display alongside audiovisual content generated by latent stable diffusion methods. Then, a user study involving 40 participants is described, the results of which suggest that the proposed approach enhances the audience level of engagement as they experience a short AI-generated multisensory (audio–visual–haptic) story. © 2024 The Author(s). Advanced Intelligent Systems published by Wiley-VCH GmbH.},
keywords = {Audiovisual, Augmented Reality, Extended reality, Haptic interfaces, Haptics, Haptics interfaces, HMI, hybrid AI, Hybrid artificial intelligences, Metaverses, Mixed reality, Multisensory, Natural Language Processing, perception, Sentiment Analysis, Sound speech, Special issue and section, Speech enhancement, Virtual environments, Visual elements},
pubstate = {published},
tppubtype = {article}
}