Found 2 articles
1. Low rank optimization for efficient deep learning: making a balance between compact architecture and fast training
Authors: OU Xinwei, CHEN Zhangxin, ZHU Ce, LIU Yipeng (+1 more author). Journal of Systems Engineering and Electronics, SCIE CSCD, 2024, No. 3, pp. 509-531, F0002 (24 pages)
Deep neural networks (DNNs) have achieved great success in many data processing applications. However, high computational complexity and storage cost make deep learning difficult to deploy on resource-constrained devices, and its high power consumption is not environmentally friendly. In this paper, we focus on low-rank optimization for efficient deep learning techniques. In the space domain, DNNs are compressed by low-rank approximation of the network parameters, which directly reduces the storage requirement with a smaller number of network parameters. In the time domain, the network parameters can be trained in a few subspaces, which enables efficient training for fast convergence. Model compression in the spatial domain is summarized into three categories: pre-train, pre-set, and compression-aware methods. With a series of integrable techniques discussed, such as sparse pruning, quantization, and entropy coding, we can ensemble them in an integration framework with lower computational complexity and storage. In addition to a summary of recent technical advances, we have two findings for motivating future works. One is that the effective rank, derived from the Shannon entropy of the normalized singular values, outperforms other conventional sparse measures such as the ℓ1 norm for network compression. The other is a spatial and temporal balance for tensorized neural networks. For accelerating the training of tensorized neural networks, it is crucial to leverage redundancy for both model compression and subspace training.
Keywords: model compression, subspace training, effective rank, low-rank tensor optimization, efficient deep learning
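The first abstract defines effective rank as a function of the Shannon entropy of the normalized singular values. A minimal NumPy sketch of that standard definition, exp(entropy of the singular-value distribution), is shown below; this is an illustration of the measure, not code from the paper:

```python
import numpy as np

def effective_rank(W):
    """Effective rank of a matrix: exp of the Shannon entropy of
    the singular values normalized to a probability distribution."""
    s = np.linalg.svd(W, compute_uv=False)
    p = s / s.sum()              # normalize singular values
    p = p[p > 0]                 # drop zeros to avoid log(0)
    return float(np.exp(-(p * np.log(p)).sum()))

# A rank-1 matrix has effective rank 1; the 4x4 identity has 4.
print(effective_rank(np.outer(np.ones(8), np.ones(8))))  # → 1.0
print(effective_rank(np.eye(4)))                         # → 4.0
```

Unlike the integer matrix rank, this measure varies smoothly with the singular-value spectrum, which is why the abstract proposes it as a compression criterion in place of sparse surrogates such as the ℓ1 norm.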
2. Dynamic background modeling using tensor representation and ant colony optimization (cited by 1)
Authors: PENG LiZhong, ZHANG Fan, ZHOU BingYin. Science China Mathematics, SCIE CSCD, 2017, No. 11, pp. 2287-2302 (16 pages)
Background modeling and subtraction is a fundamental problem in video analysis. Many algorithms have been developed to date, but there are still some challenges in complex environments, especially dynamic scenes in which backgrounds are themselves moving, such as rippling water and swaying trees. In this paper, a novel background modeling method is proposed for dynamic scenes by combining both tensor representation and swarm intelligence. We maintain several video patches, which are naturally represented as higher-order tensors, to represent the patterns of the background, and utilize tensor low-rank approximation to capture the dynamic nature. Furthermore, we introduce an ant colony algorithm to improve the performance. Experimental results show that the proposed method is robust and adaptive in dynamic environments, and moving objects can be cleanly separated from the complex dynamic background.
Keywords: background modeling, dynamic scenes, tensor representation, ant colony optimization
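As a rough illustration of the low-rank idea behind the second abstract: if vectorized frames are stacked as columns of a matrix, a low-rank approximation captures the (mostly repetitive) background and the residual highlights moving objects. The paper itself works on higher-order tensor patches refined with an ant colony algorithm; the matrix-SVD sketch below, with hypothetical names, shows only the basic principle:

```python
import numpy as np

def lowrank_background(frames, rank=1):
    """Stack vectorized frames as columns (pixels x time), keep the
    top `rank` singular components as the background model, and
    return (background frames, foreground residual frames)."""
    h, w = frames[0].shape
    M = np.stack([f.ravel() for f in frames], axis=1)
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    B = (U[:, :rank] * s[:rank]) @ Vt[:rank]   # low-rank background
    F = np.abs(M - B)                          # residual = foreground
    return B.T.reshape(-1, h, w), F.T.reshape(-1, h, w)

# Synthetic clip: constant background plus a bright 3x3 blob that
# moves along the diagonal; the residual lights up at the blob.
frames = []
for t in range(5):
    f = np.full((16, 16), 0.5)
    f[t:t + 3, t:t + 3] = 1.0
    frames.append(f)
B, F = lowrank_background(frames, rank=1)
```

A plain SVD like this struggles exactly in the dynamic scenes the paper targets (rippling water, swaying trees), since such backgrounds are not well captured by a single static low-rank pattern; that gap is what the tensor-patch representation is meant to address.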