
A Spatially Adaptive Motion Style Transfer Method with Temporal Convolutional Network
Abstract: To address the heavy reliance of human motion generation methods on supervised learning and paired datasets, and inspired by image style transfer methods, a spatially adaptive motion style transfer model incorporating a temporal convolutional network is proposed; given a virtual character's motion as input, it generates motion sequences in different styles. First, considering both temporal and spatial properties, a neural network framework with a temporal convolutional network as its backbone is designed to accurately extract the content features and style features of motion data from unpaired datasets. Second, the spatially adaptive normalization method is improved and adapted to the motion decoder, yielding an adaptive normalization method suited to motion style transfer. Finally, to address errors in motion transfer, forward kinematics is introduced into the network to constrain the joint errors propagated along the kinematic chain, thereby constraining foot motion. To verify the performance of the proposed method, experiments are conducted on the CMU and Xia open-source datasets using principal component analysis, data clustering, and visualization. The results show that the proposed model effectively achieves style transfer across multiple classes of unpaired data, and that the generated animations are natural and realistic, offer good interactivity and extensibility, and can be widely applied to virtual human modeling for digital animation.
Authors: Zhang Fengquan, Li Pingzhe, Lei Jierui (School of Digital Media and Design Arts, Beijing University of Posts and Telecommunications, Beijing 100876; School of Information Science, North China University of Technology, Beijing 100144)
Source: Journal of Computer-Aided Design & Computer Graphics (《计算机辅助设计与图形学学报》; indexed in EI, CSCD, Peking University Core), 2024, No. 10, pp. 1653-1662 (10 pages)
Funding: National Natural Science Foundation of China (61402016); Humanities and Social Sciences Foundation of the Ministry of Education (19YJC760150).
Keywords: deep learning; motion style transfer; temporal convolutional network; spatially adaptive instance normalization
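
The spatially adaptive normalization step described in the abstract can be pictured with a minimal PyTorch sketch (illustrative only, not the authors' released code): a style branch predicts per-frame, per-channel scale and bias that modulate instance-normalized content features, in the spirit of SPADE adapted to 1-D motion sequences. The class name MotionSPADE, the (batch, channels, frames) tensor layout, the hidden width, and the kernel sizes are assumptions made for illustration.

```python
# Minimal sketch of spatially adaptive instance normalization for motion
# features (illustrative only; not the paper's implementation).
# Tensors are assumed to be shaped (batch, channels, frames).
import torch
import torch.nn as nn

class MotionSPADE(nn.Module):
    def __init__(self, content_channels: int, style_channels: int, hidden: int = 128):
        super().__init__()
        # parameter-free instance normalization over the temporal axis
        self.norm = nn.InstanceNorm1d(content_channels, affine=False)
        # style branch: shared temporal conv trunk, then scale/bias heads
        self.shared = nn.Sequential(
            nn.Conv1d(style_channels, hidden, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )
        self.to_gamma = nn.Conv1d(hidden, content_channels, kernel_size=3, padding=1)
        self.to_beta = nn.Conv1d(hidden, content_channels, kernel_size=3, padding=1)

    def forward(self, content: torch.Tensor, style: torch.Tensor) -> torch.Tensor:
        # content: (B, C_content, T); style: (B, C_style, T)
        normalized = self.norm(content)
        h = self.shared(style)
        gamma = self.to_gamma(h)  # per-frame, per-channel scale
        beta = self.to_beta(h)    # per-frame, per-channel bias
        return normalized * (1.0 + gamma) + beta

# Example with hypothetical feature sizes: 256 content channels,
# 144 style channels, 60-frame clips, batch of 4.
layer = MotionSPADE(content_channels=256, style_channels=144)
out = layer(torch.randn(4, 256, 60), torch.randn(4, 144, 60))
```

In a motion decoder of this kind, such a layer would typically sit before each temporal convolution, so that the style signal modulates the content representation at every decoding stage.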