
An Entropy-Based Growing-Memory Algorithm for the Least Squares Support Vector Machine and Its Empirical Analysis
Abstract: To reduce the number of training samples and the training time of prediction models, and to improve prediction accuracy, this paper introduces the concept of entropy from information theory, together with a growing-memory algorithm, into enterprise financial distress prediction, and proposes an entropy-based growing-memory algorithm for the least squares support vector machine (LS-SVM). Because the algorithm does not recompute the matrix inverse from scratch at every step, its efficiency is improved. Through experiments, expressions for the discrete information entropy and the kernel function suited to financial distress prediction are obtained. Comparison with the conventional LS-SVM and the standard SVM shows that, at different points in time one to three years before ST (special treatment) designation, the entropy-based LS-SVM growing-memory algorithm is significantly superior in both the number of training samples and the computing time, confirming the effectiveness and superiority of applying information entropy and the growing-memory algorithm to financial distress prediction.
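The discrete information entropy mentioned in the abstract is, in its standard form, the Shannon entropy of a discrete sample. A minimal sketch in Python (the function name and interface are illustrative, not taken from the paper):

```python
import math
from collections import Counter

def discrete_entropy(values):
    """Shannon entropy H = -sum(p_i * log2(p_i)) of a discrete sample,
    with probabilities estimated from relative frequencies."""
    n = len(values)
    counts = Counter(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())
```

For example, a perfectly balanced binary sample has entropy 1 bit, while a constant sample has entropy 0; the paper's algorithm uses such an entropy measure to gauge how informative candidate training samples are.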
Author: 赵冠华 (Zhao Guanhua)
Source: Operations Research and Management Science (《运筹与管理》), CSCD, Peking University core journal, 2010, No. 4, pp. 38-44 (7 pages)
Funding: Shandong Province Key Science and Technology Program (2008GG30009005); Shandong Province Soft Science Research Program (2008RKA223)
Keywords: least squares support vector machine; information entropy; growing-memory algorithm; support vector machine (SVM); financial distress prediction
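The growing-memory idea in the abstract, avoiding a full matrix re-inversion each time a training sample is added, can be realized with the bordered (block) matrix inversion identity: LS-SVM training reduces to solving a linear system whose symmetric matrix grows by one row and one column per new sample, so the new inverse can be updated from the old one in O(n^2) instead of O(n^3). This is a hedged sketch of that update under the symmetric-matrix assumption; the function name is illustrative and not from the paper:

```python
import numpy as np

def grow_inverse(A_inv, v, d):
    """Given A_inv = A^{-1} for a symmetric n x n matrix A, return the
    inverse of the bordered matrix [[A, v], [v.T, d]] using the Schur
    complement s = d - v.T A^{-1} v, without re-inverting from scratch."""
    u = A_inv @ v                       # A^{-1} v
    s = d - v @ u                       # Schur complement (scalar)
    n = A_inv.shape[0]
    new_inv = np.empty((n + 1, n + 1))
    new_inv[:n, :n] = A_inv + np.outer(u, u) / s
    new_inv[:n, n] = -u / s             # symmetric border (A symmetric)
    new_inv[n, :n] = -u / s
    new_inv[n, n] = 1.0 / s
    return new_inv
```

Since the LS-SVM system matrix (kernel matrix plus regularization border) is symmetric, the two off-diagonal border blocks coincide, which is what the sketch exploits.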
