Abstract: In machine learning problems, the Support Vector Machine (SVM) is a classification method. For data that are not linearly separable, kernel functions are a basic ingredient of the SVM technique. In this paper, we briefly recall some useful results on the decomposition of reproducing kernel Hilbert spaces (RKHS). Based on orthogonal polynomial theory and Mercer's theorem, we construct the high-power Legendre polynomial kernel on the cube [-1,1]^d. After presenting the theoretical background of SVM, we evaluate the performance of this kernel on some illustrative examples in comparison with the RBF, linear, and polynomial kernels.
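The abstract does not give the kernel's exact form. As an illustrative sketch only, a tensor-product Legendre kernel K(x, y) = prod_k sum_{n<=N} P_n(x_k) P_n(y_k) on [-1,1]^d is a valid Mercer kernel by construction (a finite sum of products of features) and can be plugged into an SVM as a custom kernel; the paper's "high power" kernel may well differ in its weighting:

```python
import numpy as np
from numpy.polynomial import legendre
from sklearn.svm import SVC

def legendre_kernel(X, Y, degree=3):
    """Tensor-product Legendre kernel on [-1,1]^d (illustrative construction):
    K(x, y) = prod_k sum_{n=0}^{degree} P_n(x_k) P_n(y_k)."""
    K = np.ones((X.shape[0], Y.shape[0]))
    for k in range(X.shape[1]):
        # legvander evaluates P_0..P_degree at each coordinate value
        Vx = legendre.legvander(X[:, k], degree)   # shape (n_x, degree+1)
        Vy = legendre.legvander(Y[:, k], degree)   # shape (n_y, degree+1)
        K *= Vx @ Vy.T                             # sum over polynomial orders
    return K

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(int)            # XOR-like, not linearly separable
clf = SVC(kernel=legendre_kernel).fit(X, y)
print(clf.score(X, y))
```

Since the feature map contains the cross-term P_1(x_1)P_1(x_2) = x_1 x_2, this kernel separates the XOR-like data that defeats a linear kernel.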
Abstract: This paper presents learning rates for least-squares regularized regression algorithms with polynomial kernels. The goal is an error analysis for the regression problem in learning theory. A regularization scheme is given that yields sharp learning rates. The rates depend on the dimension of the polynomial space and on the polynomial reproducing kernel Hilbert space, measured by covering numbers. We also establish a direct approximation theorem for Bernstein-Durrmeyer operators in $ L_{\rho _X }^2 $ with a Borel probability measure.
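In its generic form, the least-squares regularized regression scheme analyzed here is kernel ridge regression with a polynomial kernel: minimize the empirical squared error plus a squared RKHS norm penalty, which has the closed-form solution alpha = (K + lambda*n*I)^{-1} y. A minimal sketch, with the regularization parameter and degree chosen arbitrarily for illustration:

```python
import numpy as np

def poly_kernel(X, Y, degree=3, c=1.0):
    """Inhomogeneous polynomial kernel (x.y + c)^degree."""
    return (X @ Y.T + c) ** degree

def kernel_ridge_fit(X, y, lam=1e-3, degree=3):
    """Closed-form regularized least squares: alpha = (K + lam*n*I)^{-1} y."""
    n = X.shape[0]
    K = poly_kernel(X, X, degree)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def kernel_ridge_predict(X_train, alpha, X_new, degree=3):
    """f(x) = sum_i alpha_i K(x, x_i)."""
    return poly_kernel(X_new, X_train, degree) @ alpha

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, (100, 1))
y = X[:, 0] ** 3 - X[:, 0] + 0.05 * rng.standard_normal(100)  # cubic target + noise
alpha = kernel_ridge_fit(X, y)
pred = kernel_ridge_predict(X, alpha, X)
```

The degree-3 kernel's hypothesis space contains the cubic target exactly, so the regularized fit tracks it closely; the learning rates in the paper quantify how the error decays as the sample size grows.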
Funding: Project supported by the National Natural Science Foundation of China (Grant No. 11504340).
Abstract: Inspired by the recently proposed Legendre orthogonal polynomial representation of imaginary-time Green's functions G(τ), we develop an alternative and superior representation for G(τ) and implement it in the hybridization-expansion continuous-time quantum Monte Carlo impurity solver. This representation is based on the kernel polynomial method, which introduces integral kernel functions to filter the numerical fluctuations caused by the explicit truncation of the polynomial expansion series, and can improve the computational precision significantly. As an illustration of the new representation, we re-examine the imaginary-time Green's functions of the single-band Hubbard model in the framework of dynamical mean-field theory. The calculated results suggest that with carefully chosen integral kernel functions, whether the system is metallic or insulating, the Gibbs oscillations found in the previous Legendre orthogonal polynomial representation are vastly suppressed and notable corrections to the measured Green's functions are obtained.
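The core mechanism of the kernel polynomial method — multiplying the coefficients of a truncated orthogonal-polynomial series by damping factors to suppress Gibbs oscillations — can be illustrated with a Chebyshev expansion of a step function and the standard Jackson kernel. (The paper applies the idea to Green's-function measurements; this sketch shows only the damping mechanism itself.)

```python
import numpy as np

def jackson_factors(N):
    """Jackson kernel damping factors g_n for a series truncated at order N."""
    n = np.arange(N + 1)
    return ((N - n + 1) * np.cos(np.pi * n / (N + 1))
            + np.sin(np.pi * n / (N + 1)) / np.tan(np.pi / (N + 1))) / (N + 1)

# Chebyshev expansion of the step function sign(x) on [-1,1]:
# sign(x) = (4/pi) * sum_k (-1)^k T_{2k+1}(x) / (2k+1)   (odd orders only)
N = 30
coeffs = np.zeros(N + 1)
for n in range(1, N + 1, 2):
    coeffs[n] = 4.0 / (np.pi * n) * (-1) ** ((n - 1) // 2)

x = np.linspace(-0.999, 0.999, 2001)
plain = np.polynomial.chebyshev.chebval(x, coeffs)                      # Gibbs overshoot
damped = np.polynomial.chebyshev.chebval(x, coeffs * jackson_factors(N))  # suppressed
print(plain.max(), damped.max())
```

The bare truncated series overshoots the jump by roughly 9% (the Gibbs phenomenon), while the Jackson-damped series stays bounded by the function's range, because damping is equivalent to convolving with a nonnegative kernel.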
Funding: Supported by the National Natural Science Foundation of China (U1433116), the Aviation Science Foundation of China (20145752033), and the Graduate Innovation Project of Jiangsu Province (KYLX15_0324).
Abstract: With the development of the support vector machine (SVM), the kernel function has become one of the central topics of SVM research. To a large extent, the kernel function determines the generalization ability of the classifier, yet there is still no general theory to guide the choice and construction of the kernel function. An ensemble kernel function model based on game theory is proposed for the SVM classification algorithm. The model can effectively integrate the advantages of a local kernel and a global kernel to obtain a better classification result, and it provides a feasible way of constructing kernel functions. Experiments on several standard datasets verify that the new method can significantly improve classification accuracy.
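The abstract does not specify the game-theoretic weighting, so the sketch below shows only the underlying idea: a convex combination of a local kernel (RBF, sensitive to nearby points) and a global kernel (polynomial, sensitive to overall structure) is again a valid kernel. The fixed weight `w` is an assumed stand-in for whatever weights the paper's game-theoretic scheme produces.

```python
import numpy as np
from sklearn.metrics.pairwise import polynomial_kernel, rbf_kernel
from sklearn.svm import SVC

def mixed_kernel(X, Y, w=0.5, gamma=1.0, degree=2):
    """Convex combination of a local (RBF) and a global (polynomial) kernel.
    The weight w is illustrative, not the paper's game-theoretic weighting."""
    return w * rbf_kernel(X, Y, gamma=gamma) + (1 - w) * polynomial_kernel(X, Y, degree=degree)

rng = np.random.default_rng(2)
X = rng.standard_normal((150, 2))
y = (np.linalg.norm(X, axis=1) > 1.2).astype(int)   # circular decision boundary
clf = SVC(kernel=mixed_kernel).fit(X, y)
print(clf.score(X, y))
```

A nonnegative combination of positive semi-definite kernels is positive semi-definite, so the ensemble kernel is always admissible regardless of how the weights are chosen.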
Funding: Supported by the Natural Science Foundation of Hubei Province (2008CDZD47).
Abstract: In this paper, we present a large-update primal-dual interior-point method for symmetric cone optimization (SCO) based on a new kernel function, which determines both the search directions and the proximity measure between the iterate and the central path. The kernel function is neither a self-regular function nor the usual logarithmic kernel function. Moreover, by using Euclidean Jordan algebraic techniques, we achieve the favorable iteration complexity $O(\sqrt{r}\,(\log r)^2 \log(r/\varepsilon))$, which is as good as the analogous bound for convex quadratic semi-definite optimization.
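For orientation, a kernel function ψ in this setting is a univariate barrier term with ψ(1) = 0 and ψ'(1) = 0, and it induces the proximity measure Ψ(v) = Σ_i ψ(v_i), which vanishes exactly on the central path (v = e). The sketch below uses the classical logarithmic kernel purely as an illustration; the paper's new kernel is explicitly neither this one nor self-regular, and its form is not given in the abstract.

```python
import numpy as np

def psi_log(t):
    """Classical logarithmic kernel function: psi(t) = (t^2 - 1)/2 - log t.
    Illustrative only -- the paper's new kernel function is a different one."""
    return (t * t - 1.0) / 2.0 - np.log(t)

def proximity(v):
    """Proximity measure Psi(v) = sum_i psi(v_i); zero iff v = e,
    i.e. iff the scaled iterate lies on the central path."""
    return np.sum(psi_log(v))

v_center = np.ones(5)                              # on the central path
v_off = np.array([0.5, 0.9, 1.0, 1.3, 2.0])        # off the central path
print(proximity(v_center), proximity(v_off))
```

Because ψ is strictly convex with its minimum value 0 at t = 1, the proximity measure is nonnegative and vanishes only at the central path, which is what makes it usable both as a step-size guide and as a termination criterion.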
Abstract: The feature information of the local graph structure and of the nodes may be over-smoothed by a large number of encoding layers, which causes the node representations to converge to one or a few values. In other words, nodes from different clusters become difficult to distinguish, since two nodes at a closer topological distance are more likely to be assigned to the same class, and vice versa. To alleviate this problem, an over-smoothing algorithm is proposed, and a reweighting mechanism is applied to make the tradeoff between the information representation of nodes and of their neighborhoods more reasonable. By improving several propagation models, including the Chebyshev polynomial kernel model and the first-order (Laplacian linear) Chebyshev kernel model, a new model named RWGCN based on different propagation kernels is proposed. Experiments show that satisfactory results are achieved on semi-supervised classification tasks on graph data.
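A minimal sketch of the tradeoff that a reweighting mechanism targets: plain repeated propagation with the normalized adjacency averages node features toward each other (over-smoothing), while mixing each layer's output back with the original features slows that collapse. The fixed mixing weight below is an illustrative stand-in, not RWGCN's actual mechanism.

```python
import numpy as np

def normalized_adj(A):
    """First-order Chebyshev / GCN propagation matrix: D^{-1/2}(A + I)D^{-1/2}."""
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    return A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def reweighted_propagate(A, X, alpha=0.5, layers=4):
    """Illustrative reweighting: keep a fraction alpha of each node's own
    features and mix in (1 - alpha) of the propagated signal per layer."""
    S = normalized_adj(A)
    H = X
    for _ in range(layers):
        H = alpha * X + (1 - alpha) * (S @ H)
    return H

# Tiny graph: two triangles joined by a single bridge edge (2-3)
A = np.array([[0, 1, 1, 0, 0, 0],
              [1, 0, 1, 0, 0, 0],
              [1, 1, 0, 1, 0, 0],
              [0, 0, 1, 0, 1, 1],
              [0, 0, 0, 1, 0, 1],
              [0, 0, 0, 1, 1, 0]], dtype=float)
X = np.eye(6)                      # one-hot node features

S = normalized_adj(A)
H_plain = X.copy()
for _ in range(4):
    H_plain = S @ H_plain          # plain stacking: features flatten out
H_rw = reweighted_propagate(A, X)  # reweighted: more feature variation survives
print(np.std(H_plain), np.std(H_rw))
```

The standard deviation of the feature matrix is a crude proxy for how distinguishable nodes remain; the reweighted variant keeps it higher because the identity term anchors every node to its own features.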