Abstract: In this paper, we present a new type of support vector machine: the least squares support vector machine (LS-SVM). While standard SVM solutions involve solving quadratic or linear programming problems, the least squares version of SVMs corresponds to solving a set of linear equations, due to equality instead of inequality constraints in the problem formulation. In LS-SVMs, the Mercer condition is still applicable; hence several types of kernels, such as polynomial, RBF, and MLP kernels, can be used. Here we apply LS-SVMs to time series prediction and compare them with radial basis function neural networks. We consider a Mackey-Glass time series corrupted by noise (Gaussian and uniform). The results show that least squares support vector machines perform excellently at time series prediction, even under high noise.
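For context, the "set of linear equations" mentioned above is the KKT system of the standard LS-SVM regression formulation (Suykens and Vandewalle, 1999); the notation below follows the usual LS-SVM literature rather than anything given explicitly in this abstract. With kernel matrix Ω and regularization parameter γ, training reduces to one linear solve:

```latex
\begin{bmatrix} 0 & \mathbf{1}^{\top} \\ \mathbf{1} & \Omega + \gamma^{-1} I \end{bmatrix}
\begin{bmatrix} b \\ \alpha \end{bmatrix}
=
\begin{bmatrix} 0 \\ y \end{bmatrix},
\qquad \Omega_{kl} = K(x_k, x_l),
```

after which predictions take the form f(x) = \sum_k \alpha_k K(x, x_k) + b. No quadratic program is solved; the equality constraints and squared error term make the dual problem linear in (b, α).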
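The sketch below is a minimal, self-contained illustration of this pipeline: an Euler-discretized Mackey-Glass series with additive Gaussian noise, an RBF kernel, and a direct solve of the KKT system. All hyperparameter values (sigma, gamma, embedding order, noise level, split sizes) are illustrative assumptions, not the settings used in the paper.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    """Gaussian (RBF) kernel matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    d2 = (np.sum(X1**2, axis=1)[:, None]
          + np.sum(X2**2, axis=1)[None, :] - 2 * X1 @ X2.T)
    return np.exp(-d2 / (2 * sigma**2))

def lssvm_fit(X, y, gamma=100.0, sigma=1.0):
    """Solve the LS-SVM KKT system [[0, 1^T], [1, Omega + I/gamma]] [b; alpha] = [0; y]."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]  # bias b, dual variables alpha

def lssvm_predict(Xtr, b, alpha, Xte, sigma=1.0):
    return rbf_kernel(Xte, Xtr, sigma) @ alpha + b

def mackey_glass(n=1000, tau=17, dt=1.0, x0=1.2):
    """Euler discretization of dx/dt = 0.2 x(t-tau)/(1 + x(t-tau)^10) - 0.1 x(t)."""
    x = np.full(n + tau, x0)
    for t in range(tau, n + tau - 1):
        x[t + 1] = x[t] + dt * (0.2 * x[t - tau] / (1 + x[t - tau]**10) - 0.1 * x[t])
    return x[tau:]

# Noisy series and a delay embedding: predict x[t] from the previous `order` samples.
rng = np.random.default_rng(0)
series = mackey_glass() + rng.normal(0.0, 0.05, size=1000)  # assumed noise level
order = 4
X = np.array([series[t - order:t] for t in range(order, len(series))])
y = series[order:]
Xtr, ytr, Xte, yte = X[:500], y[:500], X[500:600], y[500:600]

b, alpha = lssvm_fit(Xtr, ytr, gamma=100.0, sigma=1.0)
pred = lssvm_predict(Xtr, b, alpha, Xte, sigma=1.0)
print("test RMSE:", np.sqrt(np.mean((pred - yte)**2)))
```

One design point worth noting: because training is a single dense linear solve, the cost is O(n^3) in the number of training samples, which is the usual trade-off quoted against standard SVMs' sparse QP solutions.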