Funding: Supported by the National Natural Science Foundation of China (62202346); the Hubei Key Research and Development Program (2021BAA042); the Open Project of the Engineering Research Center of Hubei Province for Clothing Information (2022HBCI01); the Wuhan Applied Basic Frontier Research Project (2022013988065212); MIIT's AI Industry Innovation Task Unveiling Flagship Projects (Key Technologies, Equipment, and Systems for Flexible Customized and Intelligent Manufacturing in the Clothing Industry); and the Hubei Science and Technology Project of the Safe Production Special Fund (Scene Control Platform Based on Proprioception Information Computing of Artificial Intelligence).
Abstract
Background: Intelligent garments, a burgeoning class of wearable devices, have extensive applications in domains such as sports training and medical rehabilitation. Nonetheless, existing research on smart wearables predominantly emphasizes sensor functionality and quantity, often overlooking crucial aspects of user experience and interaction.
Methods: To address this gap, this study introduces a novel real-time 3D interactive system based on intelligent garments. The system uses lightweight sensor modules to collect human motion data and introduces a dual-stream fusion network based on pulsed (spiking) neural units to classify and recognize human movements, thereby achieving real-time interaction between users and sensors. Additionally, the system incorporates 3D human visualization: sensor data and recognized actions are rendered as 3D models in real time, providing accurate and comprehensive visual feedback that helps users understand and analyze the details and features of human motion. The system has significant potential for applications in motion detection, medical monitoring, virtual reality, and other fields; accurate classification of human actions supports the development of personalized training plans and injury-prevention strategies.
Conclusions: This study has substantial implications for intelligent garments, human motion monitoring, and digital-twin visualization. Its advancement is expected to propel the progress of wearable technology and foster a deeper understanding of human motion.
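To make the Methods description concrete, the following is a minimal NumPy sketch of a dual-stream classifier built from leaky integrate-and-fire (spiking) units with late fusion by concatenation. Everything here is illustrative, not the paper's actual architecture: the stream names (accelerometer and gyroscope), the layer sizes, the random weights, and the rate-coded readout are all assumptions made for the sake of a runnable example.

```python
import numpy as np

rng = np.random.default_rng(0)

def lif_encode(x, steps=20, threshold=1.0, decay=0.9):
    """Run a leaky integrate-and-fire layer over `steps` timesteps.

    x: (features,) static input current, re-injected each step.
    Returns each unit's firing rate in [0, 1].
    """
    v = np.zeros_like(x, dtype=float)       # membrane potentials
    spikes = np.zeros_like(x, dtype=float)  # spike counts
    for _ in range(steps):
        v = decay * v + x                   # leak, then integrate input
        fired = v >= threshold              # threshold crossing
        spikes += fired
        v = np.where(fired, 0.0, v)         # reset units that fired
    return spikes / steps                   # rate code

def dual_stream_classify(accel, gyro, W_a, W_g, W_out):
    """Encode each sensor stream with spiking units, fuse, and classify."""
    ra = lif_encode(np.abs(accel @ W_a))    # stream 1: accelerometer
    rg = lif_encode(np.abs(gyro @ W_g))     # stream 2: gyroscope
    fused = np.concatenate([ra, rg])        # late fusion by concatenation
    scores = fused @ W_out                  # linear readout over action classes
    return int(np.argmax(scores))

# Toy dimensions: 3-axis sensors, 8 hidden units per stream, 4 action classes.
W_a = rng.normal(size=(3, 8))
W_g = rng.normal(size=(3, 8))
W_out = rng.normal(size=(16, 4))

accel = np.array([0.2, 0.9, 0.1])  # one hypothetical sensor reading
gyro = np.array([0.5, 0.1, 0.3])
label = dual_stream_classify(accel, gyro, W_a, W_g, W_out)
print(label)
```

In a real system the weights would be learned (e.g., with surrogate-gradient training) and the inputs would be windowed time series rather than single readings; the sketch only shows the data flow that the abstract describes, from two sensor streams through spiking units to a fused class prediction.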