Journal Articles
2 articles found
1. Multi-Feature Super-Resolution Network for Cloth Wrinkle Synthesis (Cited by: 1)
Authors: Lan Chen, Juntao Ye, Xiaopeng Zhang. Journal of Computer Science & Technology, SCIE/EI/CSCD, 2021, No. 3, pp. 478-493 (16 pages).
Existing physical cloth simulators suffer from expensive computation and difficulties in tuning mechanical parameters to get desired wrinkling behaviors. Data-driven methods provide an alternative solution: they typically synthesize cloth animation at a much lower computational cost and create wrinkling effects similar to the training data. In this paper we propose a deep-learning-based method for synthesizing cloth animation with high-resolution meshes. To do this, we first create a dataset for training: a pair of low- and high-resolution meshes are simulated and their motions are synchronized. As a result, the two meshes exhibit similar large-scale deformation but different small wrinkles. Each simulated mesh pair is then converted into a pair of low- and high-resolution "images" (2D arrays of samples), with each image pixel interpreted as one of three descriptors: the displacement, the normal, and the velocity. With these image pairs, we design a multi-feature super-resolution (MFSR) network that jointly trains an upsampling synthesizer for the three descriptors. The MFSR architecture consists of shared and task-specific layers to learn multi-level features when super-resolving the three descriptors simultaneously. Frame-to-frame consistency is well maintained thanks to the proposed kinematics-based loss function. Our method achieves realistic results at high frame rates: 12-14 times faster than traditional physical simulation. We demonstrate the performance of our method on various experimental scenes, including a dressed character with sophisticated collisions.
Keywords: cloth animation, deep learning, multi-feature, super-resolution, wrinkle synthesis
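The shared/task-specific split described in the abstract can be illustrated with a minimal NumPy sketch. Everything here is an assumption for illustration: the descriptor resolutions, the random weights, the single linear "shared" mixing step, and the nearest-neighbour 2x upsampling are placeholders, not the paper's trained MFSR network.

```python
import numpy as np

def shared_layer(descriptors, w):
    # Concatenate the three descriptor images along channels and mix them
    # with one linear map -- the "shared" features of the architecture.
    x = np.concatenate(descriptors, axis=-1)        # (H, W, 9)
    return x @ w                                    # (H, W, F)

def task_head(features, w, scale=2):
    # Task-specific projection back to 3 channels, then naive 2x
    # nearest-neighbour upsampling as a stand-in for the learned upsampler.
    y = features @ w                                # (H, W, 3)
    return y.repeat(scale, axis=0).repeat(scale, axis=1)

rng = np.random.default_rng(0)
# One low-resolution 16x16 "image" per descriptor: displacement, normal, velocity.
lo = [rng.standard_normal((16, 16, 3)) for _ in range(3)]
w_shared = rng.standard_normal((9, 32))             # shared mixing weights
heads = [rng.standard_normal((32, 3)) for _ in range(3)]  # one head per descriptor

feats = shared_layer(lo, w_shared)
hi = [task_head(feats, w) for w in heads]
print([h.shape for h in hi])  # three (32, 32, 3) upsampled descriptors
```

The point of the sketch is the data flow: all three descriptors pass through the same shared features before branching into per-descriptor upsampling heads, which is what lets the network learn multi-level features jointly.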
2. Motion-Inspired Real-Time Garment Synthesis with Temporal-Consistency
Authors: 魏育坤, 石敏, 冯文科, 朱登明, 毛天露. Journal of Computer Science & Technology, SCIE/EI/CSCD, 2023, No. 6, pp. 1356-1368 (13 pages).
Synthesizing garment dynamics according to body motions is a vital technique in computer graphics. Physics-based simulation depends on an accurate model of the kinetics of cloth, which is time-consuming, hard to implement, and complex to control. Existing data-driven approaches either lack temporal consistency or fail to handle garments whose topology differs from the body's. In this paper, we present a motion-inspired real-time garment synthesis workflow that enables high-level control of garment shape. Given a sequence of body motions, our workflow generates the corresponding garment dynamics with both spatial and temporal coherence. To that end, we develop a transformer-based garment synthesis network to learn the mapping from body motions to garment dynamics. Frame-level attention is employed to capture the dependency between garments and body motions. Moreover, a post-processing procedure performs penetration removal and auto-texturing, producing textured clothing animation that is collision-free and temporally consistent. We evaluated the proposed workflow quantitatively and qualitatively from different aspects. Extensive experiments demonstrate that our network delivers clothing dynamics that retain the wrinkles of physics-based simulation while running 1000 times faster. Besides, our workflow achieves superior synthesis performance compared with alternative approaches. To stimulate further research in this direction, our code will be made publicly available.
Keywords: clothing animation, computer graphics, transformer, temporal consistency
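The frame-level attention idea from this abstract can likewise be sketched in plain NumPy: each frame of a body-motion sequence attends over all frames, and the attended features are mapped to per-frame garment vertex offsets. The dimensions, the single attention head, and the linear output map are illustrative assumptions, not the paper's trained network.

```python
import numpy as np

def frame_attention(motion, wq, wk, wv):
    # motion: (T, D) -- one pose feature vector per frame.
    q, k, v = motion @ wq, motion @ wk, motion @ wv
    scores = q @ k.T / np.sqrt(k.shape[-1])           # (T, T) frame-to-frame
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over frames
    return weights @ v                                # temporally mixed features

rng = np.random.default_rng(0)
T, D, H, V = 8, 16, 32, 100    # frames, pose dim, hidden dim, garment vertices
motion = rng.standard_normal((T, D))
wq, wk, wv = (rng.standard_normal((D, H)) for _ in range(3))
w_out = rng.standard_normal((H, V * 3))

# Per-frame garment vertex displacements, one (V, 3) array per frame.
garment = (frame_attention(motion, wq, wk, wv) @ w_out).reshape(T, V, 3)
print(garment.shape)
```

Because every output frame is a softmax-weighted mixture over all input frames, neighbouring frames share most of their attended context, which is one plausible reading of how such a design encourages temporal consistency.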