Abstract: In machine learning problems, the Support Vector Machine (SVM) is a classification method. For non-linearly separable data, kernel functions are a basic ingredient of the SVM technique. In this paper, we briefly recall some useful results on the decomposition of reproducing kernel Hilbert spaces (RKHS). Based on orthogonal polynomial theory and Mercer's theorem, we construct the high-power Legendre polynomial kernel on the cube [-1,1]^d. After presenting the theoretical background of SVM, we evaluate the performance of this kernel on some illustrative examples, in comparison with the RBF, linear, and polynomial kernels.
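The abstract does not spell out the kernel's exact form, so the following is only a hedged sketch of one plausible construction on [-1,1]^d: a positive-definite kernel built by summing products of Legendre polynomials P_k over coordinates and raising the result to an integer power (the function name `legendre_kernel` and the parameters `degree` and `power` are illustrative assumptions, not the paper's notation). Such a callable Gram-matrix kernel can be plugged directly into scikit-learn's SVC.

```python
import numpy as np
from numpy.polynomial import legendre as L
from sklearn.svm import SVC

def legendre_kernel(X, Y, degree=4, power=2):
    """Illustrative Legendre polynomial kernel on [-1, 1]^d:
    K(x, y) = ( sum_{i=1..d} sum_{k=0..degree} P_k(x_i) P_k(y_i) )^power.
    A sketch of one Mercer-valid construction, not the paper's exact kernel."""
    orders = np.eye(degree + 1)  # row k = coefficient vector of P_k
    # PX[n, i, k] = P_k evaluated at coordinate i of sample n (likewise PY)
    PX = np.stack([L.legval(X, orders[k]) for k in range(degree + 1)], axis=-1)
    PY = np.stack([L.legval(Y, orders[k]) for k in range(degree + 1)], axis=-1)
    # Base Gram matrix: sum over coordinates i and polynomial orders k.
    G = np.einsum('nik,mik->nm', PX, PY)
    # Elementwise integer power of a PSD kernel is again PSD (Schur product).
    return G ** power

# Toy XOR-like problem on [-1, 1]^2: class = sign of the product of coordinates.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (40, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(int)

clf = SVC(kernel=legendre_kernel)  # SVC accepts a callable returning the Gram matrix
clf.fit(X, y)
```

The P_1(x_0)·P_1(x_1) = x_0·x_1 term in the expansion is exactly the feature that separates this toy problem, which the linear and low-degree polynomial kernels it is compared against would handle less directly.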
Funding: Project supported by the National Natural Science Foundation of China (Grant No. 11504340).
Abstract: Inspired by the recently proposed Legendre orthogonal polynomial representation for imaginary-time Green's functions G(τ), we develop an alternative and superior representation for G(τ) and implement it in the hybridization-expansion continuous-time quantum Monte Carlo impurity solver. This representation is based on the kernel polynomial method, which introduces integral kernel functions to filter the numerical fluctuations caused by the explicit truncation of the polynomial expansion series, and can improve the computational precision significantly. As an illustration of the new representation, we re-examine the imaginary-time Green's functions of the single-band Hubbard model in the framework of dynamical mean-field theory. The calculated results suggest that, with carefully chosen integral kernel functions, whether the system is metallic or insulating, the Gibbs oscillations found in the previous Legendre orthogonal polynomial representation are vastly suppressed and remarkable corrections to the measured Green's functions are obtained.
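The filtering idea can be illustrated on a toy discontinuous function: truncating its Legendre expansion produces Gibbs oscillations, and multiplying the coefficients by damping factors suppresses them. As an assumption, the sketch below borrows the Jackson kernel factors from the Chebyshev kernel-polynomial-method literature; the paper's actual integral kernels for the Legendre basis may differ.

```python
import numpy as np
from numpy.polynomial import legendre as L

def legendre_coeffs(f, N, quad_pts=200):
    """Coefficients c_n = (2n+1)/2 * ∫_{-1}^{1} f(x) P_n(x) dx
    via Gauss-Legendre quadrature."""
    x, w = L.leggauss(quad_pts)
    fx = f(x)
    orders = np.eye(N + 1)
    c = np.empty(N + 1)
    for n in range(N + 1):
        c[n] = (2 * n + 1) / 2 * np.sum(w * fx * L.legval(x, orders[n]))
    return c

def jackson_damping(N):
    """Jackson kernel damping factors g_n (taken from the Chebyshev KPM
    literature as an illustrative filter, an assumption here)."""
    n = np.arange(N + 1)
    return ((N - n + 1) * np.cos(np.pi * n / (N + 1))
            + np.sin(np.pi * n / (N + 1)) / np.tan(np.pi / (N + 1))) / (N + 1)

N = 30
c = legendre_coeffs(np.sign, N)          # expand the step function sign(x)
xs = np.linspace(-0.99, 0.99, 401)
raw = L.legval(xs, c)                    # bare truncation: Gibbs oscillations
damped = L.legval(xs, c * jackson_damping(N))  # filtered: overshoot suppressed
```

The bare partial sum overshoots the step near x = 0, while the damped series trades a slightly broadened edge for a much smaller overshoot, the same trade-off the abstract describes for the measured G(τ).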