Abstract
A single-variable iterated sequential minimal optimization (SSMO) algorithm is proposed to address the slow speed and low efficiency of training the optimization-method-based extreme learning machine (OMELM) with traditional quadratic programming solvers. The algorithm minimizes the objective function by optimizing the Lagrange multipliers within box constraints: the Lagrange multiplier that yields the largest reduction in the objective function is selected from the initialized multipliers as the sole variable of the objective function; the objective function is then minimized with respect to that variable and the multiplier's value is updated; this process is repeated until all Lagrange multipliers satisfy the Karush-Kuhn-Tucker (KKT) conditions of the quadratic programming problem. Experimental results show that the SSMO algorithm achieves sufficiently good generalization performance with only a few parameters to tune; the OMELM method using the SSMO algorithm generalizes better than the support vector machine (SVM) method using the sequential minimal optimization (SMO) algorithm; and the SSMO algorithm is robust in random-dataset trials.
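The iteration described above can be sketched as coordinate descent on a box-constrained quadratic program: at each step, pick the multiplier whose single-variable (clipped) minimization gives the largest objective decrease, update only that multiplier, and stop when no multiplier can further reduce the objective (the approximate KKT condition). The sketch below is a minimal illustration under assumed notation (objective 0.5*aᵀQa - pᵀa with box 0 ≤ aᵢ ≤ C); the function name `ssmo_sketch`, the tolerance, and the toy problem are illustrative, not the paper's implementation.

```python
import numpy as np

def ssmo_sketch(Q, p, C, tol=1e-6, max_iter=1000):
    """Single-variable coordinate-descent sketch for
    min 0.5*a^T Q a - p^T a  s.t.  0 <= a_i <= C  (box constraints)."""
    n = len(p)
    a = np.zeros(n)                   # initialized Lagrange multipliers
    q = np.diag(Q)                    # diagonal (second derivatives)
    for _ in range(max_iter):
        g = Q @ a - p                 # gradient of the objective
        # single-variable minimizer for each coordinate, clipped to the box
        step = np.where(q > 0, -g / np.where(q > 0, q, 1.0), 0.0)
        cand = np.clip(a + step, 0.0, C)
        d = cand - a
        decrease = -(g * d + 0.5 * q * d * d)   # objective reduction per coordinate
        i = int(np.argmax(decrease))  # multiplier with the largest reduction
        if decrease[i] < tol:         # no coordinate improves: KKT met (approx.)
            break
        a[i] = cand[i]                # update only the selected multiplier
    return a

# toy two-variable QP whose unconstrained minimizer lies inside the box
Q = np.array([[2.0, 0.0], [0.0, 2.0]])
p = np.array([2.0, 4.0])
a = ssmo_sketch(Q, p, C=10.0)         # converges to [1.0, 2.0]
```

The max-decrease selection rule is what distinguishes this scheme from cyclic coordinate descent: it spends each update on the multiplier that is currently furthest from satisfying its KKT condition.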
Source
Journal of Xi'an Jiaotong University (《西安交通大学学报》), 2011, No. 6, pp. 7-12, 19 (7 pages)
Indexed in: EI, CAS, CSCD, Peking University Core (北大核心)
Funding
National High-Tech Research and Development Program of China (863 Program), Grant No. 2008AA01Z136
Keywords
extreme learning machine; support vector machine; sequential minimal optimization