
A New Smoothing Function Method for GSVM Optimization Problems (cited by: 2)

Smoothing function method for generalized support vector machine optimization problems
Abstract: A new smoothing function method is proposed for solving the generalized support vector machine (GSVM) optimization problem, overcoming the slow convergence and complicated computational structure of existing algorithms. First, the KKT complementarity conditions of optimization theory are used to recast the GSVM as an unconstrained, nondifferentiable optimization problem; a smoothing function is then introduced and the resulting problem is solved with a Newton-type iteration. Properties of the smoothing function, the iteration scheme of the algorithm, and its convergence are established. Theoretical analysis and preliminary numerical experiments show that the algorithm is insensitive to the choice of initial point, converges quickly, and is numerically stable, confirming its feasibility and effectiveness.
Source: Systems Engineering and Electronics (《系统工程与电子技术》), EI / CSCD / Peking University Core, 2007, No. 6, pp. 982-985 (4 pages).
Keywords: optimization; generalized support vector machine; smoothing function; algorithm
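
To make the reformulation described in the abstract concrete, the following is a hedged sketch of the standard smoothing device used in this line of work; the particular smoothing function, the objective Φ_α, and the notation below are illustrative assumptions, not the formulas of the paper itself. A KKT complementarity pair can be expressed through the plus function,

\[ a \ge 0,\quad b \ge 0,\quad ab = 0 \;\Longleftrightarrow\; a - (a - b)_+ = 0, \qquad x_+ := \max\{x, 0\}, \]

which turns the constrained problem into a nonsmooth, unconstrained system. A typical smooth approximation of the plus function (the one popularized in the smooth SVM literature) is

\[ p(x, \alpha) = x + \frac{1}{\alpha}\ln\!\left(1 + e^{-\alpha x}\right), \qquad \alpha > 0, \]

which satisfies \( 0 \le p(x,\alpha) - x_+ \le (\ln 2)/\alpha \), so the approximation error vanishes as the smoothing parameter \( \alpha \) grows. Replacing \( x_+ \) by \( p(\cdot,\alpha) \) yields a twice continuously differentiable unconstrained problem \( \min_w \Phi_\alpha(w) \), to which a Newton-type iteration

\[ w^{k+1} = w^k - \bigl[\nabla^2 \Phi_\alpha(w^k)\bigr]^{-1} \nabla \Phi_\alpha(w^k) \]

can be applied; the fast, initial-point-insensitive convergence reported in the abstract is the behavior expected of a scheme of this general form.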

