
An Algorithm for Beamforming Based on Support Vector Regression
Abstract  Utilizing standard quadratic optimization techniques, the traditional SVR (Support Vector Regression) algorithm requires long running times and large memory space because of its high computational complexity. Building on research and analysis of traditional SVR solution algorithms, and combining the SVR model with SVR-based beamforming, an IRWLS (Iterative Re-Weighted Least Squares) SVR beamforming algorithm is proposed; the specific solution procedure and a flowchart of the algorithm are given. Finally, numerical simulation experiments and a comparative analysis are carried out on received signals corrupted by strong interference. The results show that, unlike the traditional standard quadratic algorithm based on QP, the IRWLS algorithm converges rapidly, suppresses interference strongly, and requires little computation: it reduces computational complexity, avoids the high cost of quadratic programming techniques, improves the efficiency of the algorithm, and maintains good generalization ability, so it has a certain reference value.
Source: Audio Engineering (《电声技术》), 2010, No. 5, pp. 67-70 (4 pages).
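The abstract only names the IRWLS idea, so a minimal Python sketch of one common IRWLS formulation of ε-insensitive SVR beamforming is given below. It is not the paper's exact algorithm: the array geometry, the signal and interference parameters, the ψ(e)/e weighting rule for the ε-insensitive loss, and the function names (steering_vector, irwls_svr_beamformer) are all illustrative assumptions. Each iteration freezes the loss-dependent per-snapshot weights and solves a small regularized weighted least-squares system in closed form, which is where the method sidesteps quadratic programming.

```python
# A minimal sketch, NOT the paper's exact algorithm: array geometry, signal
# parameters, and the psi(e)/e weighting for the eps-insensitive loss are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def steering_vector(n_sensors, theta_deg, spacing=0.5):
    """Steering vector of a uniform linear array (half-wavelength spacing assumed)."""
    n = np.arange(n_sensors)
    return np.exp(1j * 2 * np.pi * spacing * n * np.sin(np.deg2rad(theta_deg)))

def irwls_svr_beamformer(X, d, C=10.0, eps=1e-2, n_iter=50, tol=1e-8):
    """Hypothetical IRWLS solver for SVR-style beamforming.

    Minimizes 0.5*||w||^2 + C * sum of eps-insensitive losses on the residuals
    d_m - w^H x_m by alternating between (i) recomputing per-snapshot weights
    from the current residuals and (ii) solving a regularized weighted
    least-squares problem in closed form (no quadratic programming).
    X: (n_sensors, n_snapshots) complex snapshots; d: (n_snapshots,) reference."""
    n_sensors = X.shape[0]
    w = np.zeros(n_sensors, dtype=complex)
    for _ in range(n_iter):
        e = d - w.conj() @ X                      # residuals at the current weights
        mag = np.abs(e)
        # IRLS weighting psi(e)/e of the eps-insensitive loss: snapshots inside
        # the eps-tube get zero weight and drop out of the next LS problem.
        a = np.where(mag > eps, C / np.maximum(mag, 1e-12), 0.0)
        # Normal equations of the weighted problem: (I + X A X^H) w = X A d*
        XA = X * a                                # scale each snapshot (column) by a_m
        w_new = np.linalg.solve(np.eye(n_sensors) + XA @ X.conj().T, XA @ d.conj())
        if np.linalg.norm(w_new - w) <= tol * (1.0 + np.linalg.norm(w)):
            return w_new
        w = w_new
    return w

# Toy scenario: desired signal from 0 deg, a strong interferer from 40 deg, noise.
n_sensors, n_snap = 8, 400
s = np.exp(1j * 2 * np.pi * 0.05 * np.arange(n_snap))             # reference waveform
interf = 10.0 * (rng.standard_normal(n_snap) + 1j * rng.standard_normal(n_snap))
noise = 0.1 * (rng.standard_normal((n_sensors, n_snap))
               + 1j * rng.standard_normal((n_sensors, n_snap)))
X = (np.outer(steering_vector(n_sensors, 0.0), s)
     + np.outer(steering_vector(n_sensors, 40.0), interf) + noise)

w = irwls_svr_beamformer(X, s)
for theta in (0.0, 40.0):
    gain = np.abs(np.vdot(w, steering_vector(n_sensors, theta)))  # |w^H a(theta)|
    print(f"beampattern magnitude at {theta:4.1f} deg: {gain:.3f}")
```

The closed-form solve inside the loop is the step that replaces the QP of the standard SVR formulation; the abstract's claims about speed and reduced complexity rest on this substitution, with the weight update re-linking each iteration back to the ε-insensitive cost.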

