
Multi-label learning algorithm of an elastic net kernel extreme learning machine
弹性网络核极限学习机的多标记学习算法
Cited by: 5
Abstract: Applying regularized extreme learning machine (ELM) or kernel extreme learning machine (KELM) theory to multi-label classification improves the stability of the algorithm to a certain extent. However, the regularization terms these algorithms add to the loss function are all based on the L2 norm, so the resulting models lack a sparse representation. Elastic net regularization, by contrast, provides both model robustness and sparse learning, yet little research has examined how an extreme learning machine combined with the elastic net can solve multi-label problems. Motivated by this, this paper proposes a multi-label learning algorithm that adds elastic net regularization to the kernel extreme learning machine. First, the feature space of the multi-label data is mapped with a radial basis function (RBF) kernel; next, the elastic net regularization term is applied to the loss function of the kernel extreme learning machine; finally, the output weights are solved iteratively by coordinate descent to obtain the final predicted labels. Comparative experiments and statistical analyses show that the proposed algorithm achieves better performance.
Authors: WANG Yibin (王一宾); PEI Gensheng (裴根生); CHENG Yusheng (程玉胜) (School of Computer and Information, Anqing Normal University, Anqing 246011, China; The University Key Laboratory of Intelligent Perception and Computing of Anhui Province, Anqing 246011, China)
Source: CAAI Transactions on Intelligent Systems (《智能系统学报》), CSCD, Peking University core journal, 2019, No. 4, pp. 831-842 (12 pages)
Funding: Key Scientific Research Project of Anhui Provincial Universities (KJ2017A352); Key Laboratory Fund Project of Anhui Provincial Universities (ACAIM160102)
Keywords: multi-label learning; kernel extreme learning machine; regularization; elastic net; radial basis function; coordinate descent
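The pipeline the abstract describes (RBF kernel mapping of the features, elastic net regularization of the least-squares loss, coordinate descent for the output weights, and thresholding of the scores into labels) can be sketched as below. This is a minimal illustrative sketch and not the authors' implementation: the function names, hyperparameters (`gamma`, `alpha`, `l1_ratio`, sweep count), and the toy data are all assumptions introduced for demonstration.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    # Gaussian (RBF) kernel matrix: exp(-gamma * ||x - y||^2)
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def soft_threshold(z, t):
    # Proximal operator of the L1 penalty
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def elastic_net_cd(K, y, alpha=0.1, l1_ratio=0.5, n_sweeps=200):
    """Coordinate descent for one label column:
    min_w (1/2n)||y - Kw||^2 + alpha*l1_ratio*||w||_1
          + 0.5*alpha*(1 - l1_ratio)*||w||^2
    """
    n, p = K.shape
    w = np.zeros(p)
    r = y - K @ w                  # current residual
    col_sq = (K ** 2).sum(0) / n   # (1/n)||K_j||^2 per column
    for _ in range(n_sweeps):
        for j in range(p):
            # Partial residual correlation for coordinate j
            rho = K[:, j] @ r / n + col_sq[j] * w[j]
            w_new = soft_threshold(rho, alpha * l1_ratio) / (
                col_sq[j] + alpha * (1 - l1_ratio))
            r += K[:, j] * (w[j] - w_new)  # keep residual in sync
            w[j] = w_new
    return w

# Toy multi-label demo: two {-1, +1} labels over 2-D inputs (illustrative only)
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 2))
Y = np.column_stack([np.sign(X[:, 0]), np.sign(X[:, 0] + X[:, 1])])

K = rbf_kernel(X, X)                                   # kernel mapping
W = np.column_stack([elastic_net_cd(K, Y[:, q])        # one weight vector
                     for q in range(Y.shape[1])])      # per label
pred = np.sign(K @ W)                                  # threshold scores at 0
acc = (pred == Y).mean()
```

The L1 part of the penalty drives some entries of `W` to exactly zero (the sparsity the abstract refers to), while the L2 part keeps the solution stable when kernel columns are strongly correlated.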

相关作者

内容加载中请稍等...

相关机构

内容加载中请稍等...

相关主题

内容加载中请稍等...

浏览历史

内容加载中请稍等...
;
使用帮助 返回顶部