Funding: Supported by the National Natural Science Foundation of China (10901125, 11471253)
Abstract: In this paper, we investigate the linear solver in the least squares support vector machine (LSSVM) for large-scale data regression. Traditional methods based on direct solvers are costly. Because the linear equations must be solved repeatedly while choosing appropriate LSSVM parameters, the key to speeding up LSSVM is to improve the method of solving these linear equations. We approximate the large-scale kernel matrix and obtain an approximate solution of the linear equations using the randomized singular value decomposition (randomized SVD). Experiments are performed on data sets from the University of California Irvine (UCI) machine learning repository. We find that LSSVM based on randomized SVD is more accurate and less time-consuming for problems with a large number of variables than LSSVM based on the Nyström method or the Lanczos process.
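The following is a minimal NumPy sketch of the approach the abstract describes: approximate the kernel matrix by a rank-k randomized SVD (here an eigendecomposition, since the kernel matrix is symmetric positive semidefinite) and use the low-rank factors, via the Woodbury identity, to solve the LSSVM dual linear system cheaply. The Gaussian (RBF) kernel, the parameter values, and all function names are illustrative assumptions, not details taken from the paper.

```python
# A sketch only, assuming an RBF kernel and the standard LSSVM regression dual
# system; sigma, gamma, and the target rank k are illustrative placeholders.
import numpy as np

def rbf_kernel(X, sigma):
    """Gaussian kernel matrix K_ij = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-np.maximum(d2, 0.0) / (2.0 * sigma ** 2))

def randomized_eig(K, k, oversample=10, power_iters=2, seed=0):
    """Rank-k approximation K ~ U diag(s) U^T via a randomized range finder
    (Halko-Martinsson-Tropp style); K is symmetric PSD, so eigh suffices."""
    rng = np.random.default_rng(seed)
    n = K.shape[0]
    Omega = rng.standard_normal((n, k + oversample))
    Y = K @ Omega
    for _ in range(power_iters):       # power steps sharpen the spectrum
        Y = K @ (K @ Y)
    Q, _ = np.linalg.qr(Y)
    B = Q.T @ K @ Q                    # small projected matrix
    w, V = np.linalg.eigh(B)
    order = np.argsort(w)[::-1][:k]
    return Q @ V[:, order], w[order]

def lssvm_solve(K_lowrank, y, gamma):
    """Solve the LSSVM dual system
        [0      1^T    ] [b    ]   [0]
        [1  K + I/gamma] [alpha] = [y]
    by block elimination, applying the Woodbury identity to
    H = I/gamma + U diag(s) U^T so each solve costs O(n k) instead of O(n^3)."""
    U, s = K_lowrank
    n = U.shape[0]
    s = np.maximum(s, 1e-12)                           # guard tiny eigenvalues
    core = np.diag(1.0 / s) + gamma * np.eye(len(s))   # S^{-1} + gamma I

    def H_inv(v):   # Woodbury: H^{-1} = gamma I - gamma^2 U core^{-1} U^T
        return gamma * v - gamma ** 2 * (U @ np.linalg.solve(core, U.T @ v))

    ones = np.ones(n)
    nu, eta = H_inv(y), H_inv(ones)
    b = (ones @ nu) / (ones @ eta)     # eliminate the bias variable first
    alpha = nu - b * eta
    return alpha, b

# Hypothetical usage on synthetic data (not the UCI experiments of the paper):
# X, y = np.random.randn(5000, 8), np.random.randn(5000)
# K = rbf_kernel(X, sigma=1.0)
# alpha, b = lssvm_solve(randomized_eig(K, k=100), y, gamma=10.0)
```

Once the rank-k factors are available, every re-solve of the linear system during parameter selection reuses them, which is the source of the speedup the abstract refers to; the specific factorization and solve routines here are assumptions for illustration.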