Abstract
The robust support vector machine (RSVM), built on a nonconvex smooth loss, is insensitive to outliers in classification problems. However, the existing algorithms for the RSVM must iteratively solve quadratic programming subproblems, which is computationally expensive and converges slowly, making them unsuitable for training on large-scale data. To overcome these drawbacks, a method with a faster convergence rate is first applied to solve the RSVM model. Then, based on the least-squares idea, a generalized exponentially robust least-squares support vector machine (ER-LSSVM) model is proposed, together with a fast-converging algorithm for solving it, and the robustness of the model is explained theoretically. Finally, by exploiting a low-rank approximation of the kernel matrix, a sparse RSVM algorithm (SR-SVM) and a sparse ER-LSSVM algorithm (SER-LSSVM) are proposed for handling large-scale training problems. Experimental results show that the new algorithms outperform the related algorithms in terms of convergence speed, test accuracy, and training time.
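To make the robustness claim concrete: the idea behind losses of this family is that a bounded, smooth, nonconvex surrogate caps the influence any single point can exert on the objective, whereas the hinge or squared loss grows without bound. The exact loss proposed in the paper is not reproduced here; the display below is a minimal illustrative sketch, assuming a generic exponential-type bounded loss with an illustrative parameter \gamma > 0 and an SVM-style margin residual.

% Illustrative sketch only: a generic exponential-type bounded loss, not necessarily
% the exact loss used in the paper.
%   Squared (LS-SVM) loss:    \ell_{ls}(r)  = r^2                   (unbounded)
%   Bounded robust surrogate: \ell_{exp}(r) = 1 - \exp(-\gamma r^2) (bounded by 1)
\[
  \min_{w,\,b}\ \frac{1}{2}\|w\|^{2}
  + C\sum_{i=1}^{n}\Bigl(1-\exp\bigl(-\gamma\,\bigl(1-y_i\bigl(w^{\top}\phi(x_i)+b\bigr)\bigr)^{2}\bigr)\Bigr),
  \qquad \gamma>0 .
\]
% Each summand lies in [0, 1], so an outlier with an arbitrarily large residual contributes
% at most C to the objective, whereas under the unbounded squared loss its contribution
% grows quadratically with the residual.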
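The sparse variants rest on a low-rank approximation of the kernel matrix, so that training never forms the full n-by-n Gram matrix. The abstract does not state which approximation scheme is used; the sketch below assumes the widely used Nystrom method with an RBF kernel, and the function names, kernel choice, and number of landmarks m are illustrative only.

import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    # Gaussian (RBF) kernel matrix between the rows of A and the rows of B.
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

def nystrom_features(X, m, gamma=0.5, seed=0):
    # Rank-m Nystrom approximation of the kernel matrix K:
    #   K ~= C @ pinv(W) @ C.T, with C = K(X, landmarks), W = K(landmarks, landmarks).
    # Returns Z with Z @ Z.T ~= K, so a linear solver on Z costs O(n m^2) rather than
    # the cost of working with the full n x n kernel matrix.
    rng = np.random.default_rng(seed)
    idx = rng.choice(X.shape[0], size=m, replace=False)    # m landmark points, m << n
    C = rbf_kernel(X, X[idx], gamma)                        # (n, m) cross-kernel block
    W = C[idx, :]                                           # (m, m) landmark kernel block
    vals, vecs = np.linalg.eigh(W)                          # W is symmetric PSD
    keep = vals > 1e-10                                     # drop numerically zero directions
    W_inv_sqrt = (vecs[:, keep] * vals[keep] ** -0.5) @ vecs[:, keep].T
    return C @ W_inv_sqrt, idx                              # low-rank feature map, landmark indices

# Usage sketch (hypothetical variable names):
#   Z, landmarks = nystrom_features(X_train, m=200)
# A linear SVM or least-squares classifier trained on Z approximates the kernel machine,
# and its decision function depends only on the m landmark points, which is what makes
# the resulting model sparse.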
Authors
安亚利 (AN Yali), 周水生 (ZHOU Shuisheng), 陈丽 (CHEN Li), 王保军 (WANG Baojun)
School of Mathematics and Statistics, Xidian University, Xi'an 710071, China
Source
Journal of Xidian University (《西安电子科技大学学报》)
Indexed in: EI, CAS, CSCD, Peking University Core Journals (北大核心)
2019, No. 1, pp. 64-72 (9 pages)
Funding
National Natural Science Foundation of China (61772020)
Keywords
robust support vector machine
nonconvex smooth loss
sparse solution
low-rank approximation