Journal Article

Support vector machine via consensus centre adjustment on random features
Abstract: The Support Vector Machine (SVM) is among the most popular classification tools, but training on very large datasets demands large amounts of memory and long computation times, and is usually feasible only in large parallel cluster environments. A novel parallel SVM algorithm, RF-CCASVM, is presented that solves large-scale SVMs on limited computing resources and guarantees uniform convergence. Via the random Fourier map, a low-dimensional explicit feature mapping uniformly approximates the infinite-dimensional implicit feature mapping of the Gaussian kernel, so the Gaussian-kernel SVM is uniformly approximated by a linear SVM. Parallelization is implemented with a consensus centre adjustment strategy. Concretely, the dataset is partitioned into several subsets, and separate SVMs are trained in parallel, one process per subset. When the optimal hyperplanes on the subsets are nearly found, each local solution is replaced by the consensus centre of the subset solutions, and training continues on the subsets until the consensus centre is optimal on every subset. Comparative experiments on benchmark datasets verify that RF-CCASVM is both correct and effective.
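The core approximation in the abstract, replacing the Gaussian kernel's implicit feature map with low-dimensional random Fourier features, can be sketched as follows. This is a minimal illustration in the spirit of Rahimi and Recht's construction (reference 10); the dimension `D`, bandwidth `gamma`, and function name are illustrative choices, not taken from the paper:

```python
import numpy as np

def random_fourier_features(X, D, gamma, rng):
    """Map X to D random Fourier features whose inner products
    approximate the Gaussian kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    d = X.shape[1]
    # Sample frequencies from the kernel's spectral density: N(0, 2*gamma*I)
    W = rng.normal(0.0, np.sqrt(2.0 * gamma), size=(d, D))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

# Compare the explicit low-dimensional approximation with the exact kernel.
rng = np.random.default_rng(0)
gamma = 0.5
X = rng.normal(size=(5, 3))
Z = random_fourier_features(X, D=2000, gamma=gamma, rng=rng)
K_approx = Z @ Z.T
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
K_exact = np.exp(-gamma * sq_dists)
err = float(np.abs(K_approx - K_exact).max())
```

Increasing `D` tightens the approximation; the uniform convergence of `K_approx` to `K_exact` over compact sets is what lets a linear SVM on `Z` stand in for the Gaussian-kernel SVM.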
Authors: 廖士中 (Liao Shizhong), 卢玮 (Lu Wei)
Source: Computer Engineering and Applications (《计算机工程与应用》, CSCD indexed), 2014, No. 17, pp. 44-48, 55 (6 pages in total)
Funding: National Natural Science Foundation of China (No. 61170019); Tianjin Natural Science Foundation (No. 11JCYBJC00700)
Keywords: parallel Support Vector Machines (SVM); large-scale datasets; limited resources; random Fourier features; consensus centre adjustment
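The consensus centre adjustment strategy described in the abstract can be sketched as a simple serial simulation. This is a hypothetical sketch: the paper's actual subset solver, step sizes, and stopping test are not given in this record, so a fixed-step subgradient linear SVM stands in for each subset's trainer, and the consensus centre is taken as the plain average of the subset solutions:

```python
import numpy as np

def subset_svm_epochs(w, X, y, lam, lr, epochs, rng):
    """A few fixed-step subgradient epochs of a linear SVM on one subset."""
    for _ in range(epochs):
        for i in rng.permutation(len(y)):
            grad = lam * w                      # L2 regularisation term
            if y[i] * (X[i] @ w) < 1.0:         # hinge-loss margin violation
                grad = grad - y[i] * X[i]
            w = w - lr * grad
    return w

def consensus_centre_svm(subsets, lam=0.01, lr=0.1, rounds=10, epochs=2, seed=0):
    """Partition-train-average loop mimicking consensus centre adjustment."""
    rng = np.random.default_rng(seed)
    d = subsets[0][0].shape[1]
    ws = [np.zeros(d) for _ in subsets]
    for _ in range(rounds):
        # Train independently on each subset (in practice, one process each).
        ws = [subset_svm_epochs(w, X, y, lam, lr, epochs, rng)
              for w, (X, y) in zip(ws, subsets)]
        # Replace every local solution with the consensus centre.
        centre = np.mean(ws, axis=0)
        ws = [centre.copy() for _ in subsets]
    return ws[0]

# Toy linearly separable data, partitioned into two subsets.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = np.sign(X @ np.array([1.0, -2.0, 0.5]))
subsets = [(X[:100], y[:100]), (X[100:], y[100:])]
w = consensus_centre_svm(subsets)
acc = float(np.mean(np.sign(X @ w) == y))
```

In the actual algorithm the adjustment is triggered only when each subset's hyperplane is nearly optimal, and iteration stops once the consensus centre is optimal on all subsets; the fixed round count here is a simplification.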

References (15)

  • 1. Vapnik V. The nature of statistical learning theory [M]. New York: Springer-Verlag, 2000.
  • 2. Cao L J, Keerthi S S, Ong C J, et al. Developing parallel sequential minimal optimization for fast training support vector machine [J]. Neurocomputing, 2006, 70(1): 93-104.
  • 3. Zanghirati G, Zanni L. A parallel solver for large quadratic programs in training support vector machines [J]. Parallel Computing, 2003, 29(4): 535-551.
  • 4. Chang E Y, Zhu Kaihua, Wang Hao, et al. Parallelizing support vector machines on distributed computers [C]//Advances in Neural Information Processing Systems. Cambridge: MIT Press, 2008: 257-264.
  • 5. Zhu Z A, Chen Weizhu, Wang Gang, et al. P-packSVM: parallel primal gradient descent kernel SVM [C]//Proceedings of the 9th IEEE International Conference on Data Mining. Piscataway: IEEE Press, 2009: 677-686.
  • 6. Collobert R, Bengio S, Bengio Y. A parallel mixture of SVMs for very large scale problems [J]. Neural Computation, 2002, 14(5): 1105-1114.
  • 7. Graf H P, Cosatto E, Bottou L, et al. Parallel support vector machines: the cascade SVM [C]//Advances in Neural Information Processing Systems. Cambridge: MIT Press, 2005: 521-528.
  • 8. Hazan T, Man A, Shashua A. A parallel decomposition solver for SVM: distributed dual ascend using Fenchel duality [C]//Proceedings of IEEE Conference on Computer Vision and Pattern Recognition. Piscataway: IEEE Press, 2008: 1-8.
  • 9. Forero P A, Cano A, Giannakis G B. Consensus-based distributed support vector machines [J]. Journal of Machine Learning Research, 2010, 11: 1663-1707.
  • 10. Rahimi A, Recht B. Random features for large-scale kernel machines [C]//Advances in Neural Information Processing Systems. Cambridge: MIT Press, 2008: 1177-1184.
