AHCI RESEARCH GROUP
Publications
Papers published in international journals,
proceedings of conferences, workshops and books.
2025
Wang, Z.; Aris, A.; Zhang, P.
Mobile-Driven Deep Learning Algorithm for Personalized Clothing Design Using Multi-Feature Attributes Journal Article
In: International Journal of Interactive Mobile Technologies, vol. 19, no. 18, pp. 146–160, 2025, ISSN: 1865-7923 (Publisher: International Federation of Engineering Education Societies (IFEES)).
@article{wang_mobile-driven_2025,
  title = {Mobile-Driven Deep Learning Algorithm for Personalized Clothing Design Using Multi-Feature Attributes},
  author = {Z. Wang and A. Aris and P. Zhang},
  url = {https://www.scopus.com/inward/record.uri?eid=2-s2.0-105017860148&doi=10.3991%2Fijim.v19i18.57239&partnerID=40&md5=de3ca359dd178d8ea59cf8da73a9c486},
  doi = {10.3991/ijim.v19i18.57239},
  issn = {1865-7923},
  year = {2025},
  date = {2025-01-01},
  journal = {International Journal of Interactive Mobile Technologies},
  volume = {19},
  number = {18},
  pages = {146--160},
  abstract = {Personalized fashion recommendation systems face significant challenges in balancing accurate style prediction, real-time mobile performance, and user privacy compliance. This study presents StyleFitNet, a novel mobile-driven deep learning framework that integrates multiple user feature attributes, including body measurements, fabric preferences, and temporal style evolution, to generate personalized clothing designs. The hybrid convolutional neural networks (CNNs)-recurrent neural networks (RNNs) architecture addresses key limitations of conventional recommendation systems by simultaneously processing spatial features and sequential preference patterns. A comprehensive evaluation demonstrates the system's superiority in recommendation accuracy, design diversity, and user satisfaction compared to existing approaches. The implementation features GDPR-compliant data handling and a 3D virtual fitting room, significantly reducing return rates while maintaining robust privacy protections. Findings highlight the model's ability to adapt to evolving fashion trends while preserving individual style preferences, offering both technical and business advantages for e-commerce platforms. The study concludes that StyleFitNet establishes a new standard for artificial intelligence (AI)-driven fashion recommendations, successfully merging advanced personalization with ethical data practices. Key implications include the demonstrated viability of hybrid deep learning models for mobile deployment and the importance of temporal analysis in preference modelling. Future research directions include cross-cultural validation and the integration of generative AI for enhanced visualization.},
  note = {Publisher: International Federation of Engineering Education Societies (IFEES)},
  keywords = {Clothing design, Convolutional Neural Networks, Data privacy, Data visualization, Deep learning, E-Learning, Electronic commerce, Fashion design, Feature attributes, Hosiery manufacture, Learning algorithms, Learning platform, Learning systems, Mobile Learning, Mobile learning platform, Mobile-driven deep learning, Multi-feature attributes, Multifeatures, Personalized clothing design, StyleFitNet, Textiles, Virtual Reality},
  pubstate = {published},
  tppubtype = {article}
}
2024
He, K.; Yao, K.; Zhang, Q.; Yu, J.; Liu, L.; Xu, L.
DressCode: Autoregressively Sewing and Generating Garments from Text Guidance Journal Article
In: ACM Transactions on Graphics, vol. 43, no. 4, 2024, ISSN: 0730-0301.
@article{he_dresscode_2024,
  title = {DressCode: Autoregressively Sewing and Generating Garments from Text Guidance},
  author = {K. He and K. Yao and Q. Zhang and J. Yu and L. Liu and L. Xu},
  url = {https://www.scopus.com/inward/record.uri?eid=2-s2.0-85199257820&doi=10.1145%2f3658147&partnerID=40&md5=8996e62e4d9dabb5a7034f8bf4df5a43},
  doi = {10.1145/3658147},
  issn = {0730-0301},
  year = {2024},
  date = {2024-01-01},
  journal = {ACM Transactions on Graphics},
  volume = {43},
  number = {4},
  abstract = {Apparel's significant role in human appearance underscores the importance of garment digitalization for digital human creation. Recent advances in 3D content creation are pivotal for digital human creation. Nonetheless, garment generation from text guidance is still nascent. We introduce a text-driven 3D garment generation framework, DressCode, which aims to democratize design for novices and offer immense potential in fashion design, virtual try-on, and digital human creation. We first introduce SewingGPT, a GPT-based architecture integrating cross-attention with text-conditioned embedding to generate sewing patterns with text guidance. We then tailor a pre-trained Stable Diffusion to generate tile-based Physically-based Rendering (PBR) textures for the garments. By leveraging a large language model, our framework generates CG-friendly garments through natural language interaction. It also facilitates pattern completion and texture editing, streamlining the design process through user-friendly interaction. This framework fosters innovation by allowing creators to freely experiment with designs and incorporate unique elements into their work. With comprehensive evaluations and comparisons with other state-of-the-art methods, our method showcases superior quality and alignment with input prompts. User studies further validate our high-quality rendering results, highlighting its practical utility and potential in production settings.},
  keywords = {3D content, 3D garments, Autoregressive model, Autoregressive modelling, Content creation, Digital humans, Embeddings, Fashion design, Garment generation, Interactive computer graphics, Sewing patterns, Textures, Virtual Reality, Virtual Try-On},
  pubstate = {published},
  tppubtype = {article}
}