
Deep Feature Representation Based on Granular Transformation of Quotient Space Theory
Abstract: Big data problems urgently need solutions, and the key is the feature representation of the problem. The most popular theory for feature representation today is deep learning, but how many layers does a deep architecture need, and how many features does each layer need? These are the questions deep learning most needs to answer. This paper introduces quotient space theory (QST) to improve deep learning: features are represented in deep layers according to the principle of granular transformation, overcoming the uncertainty in the number of layers and the indefiniteness of feature representation. First, following the granular-transformation principle of QST, problem features are described hierarchically in multi-granularity spaces, forming a multilayer deep feature representation. Then, based on the descriptive properties of quotient-space granular transformation, the problem is solved in different granular spaces. Finally, experiments on the Letter-recognition data set show that the proposed deep feature representation automatically organizes the problem into a multilayer structure, describes its features hierarchically, and improves solution precision.
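The granular-transformation idea summarized in the abstract can be illustrated with a minimal sketch: a problem space is described at successively coarser granularities by quotient maps, each layer being a partition of the one below. The function names, the letter-shape feature, and the two-layer chain below are our own illustrative assumptions, not the construction from the paper.

```python
# Minimal sketch of quotient-space granulation: a quotient map (a key
# function) partitions a space into equivalence classes (granules);
# composing coarser key functions yields a hierarchy of granular spaces.

def quotient(space, key):
    """Partition `space` into equivalence classes under the key function."""
    classes = {}
    for x in space:
        classes.setdefault(key(x), []).append(x)
    return classes

# Finest granularity: the 26 classes of the Letter-recognition task.
letters = [chr(c) for c in range(ord('A'), ord('Z') + 1)]

# Layer 1: a coarser granule, grouping letters by a crude shape feature
# (curved vs. straight strokes -- a hypothetical feature for illustration).
curved = set("BCDGJOPQRSU")
layer1 = quotient(letters, lambda ch: "curved" if ch in curved else "straight")

# Layer 2: the coarsest granule, collapsing everything into one class.
layer2 = quotient(letters, lambda ch: "letter")

# Each layer is a quotient of the one below; solving can proceed
# coarse-to-fine across the granular spaces.
print(len(layer2), len(layer1), len(letters))  # 1 2 26
```

The point of the sketch is only the structure: each coarser layer is obtained from the finer one by a quotient map, which is what lets the hierarchy of feature descriptions form automatically.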
Authors: 陈洁 (Chen Jie), 张燕平 (Zhang Yanping)
Source: 《小型微型计算机系统》 (Journal of Chinese Computer Systems), CSCD, Peking University core journal, 2014, No. 11, pp. 2494-2497 (4 pages)
Funding: National Basic Research Program of China (2007CB311003); National Natural Science Foundation of China (61073117, 61175046); Anhui Provincial Natural Science Foundation (11040606M145); Anhui University Youth Research Fund (KJQN1118)
Keywords: feature representation; deep representation; quotient space theory; hierarchy


