Journal Articles
6 articles found
1. Legendre Polynomial Kernel: Application in SVM
Authors: Habib Rebei, Nouf S. H. Alharbi. Journal of Applied Mathematics and Physics, 2022, No. 5, pp. 1732-1747 (16 pages).
In machine learning, the Support Vector Machine (SVM) is a classification method. For non-linearly separable data, kernel functions are a basic ingredient of the SVM technique. In this paper, we briefly recall some useful results on the decomposition of reproducing kernel Hilbert spaces (RKHS). Based on orthogonal polynomial theory and Mercer's theorem, we construct the high-power Legendre polynomial kernel on the cube [-1,1]^d. After presenting the theoretical background of SVM, we evaluate the performance of this kernel on some illustrative examples in comparison with the RBF, linear, and polynomial kernels.
Keywords: SVM; polynomial Legendre kernel; classification problem; Mercer theorem
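The abstract does not give the paper's explicit kernel formula, so the sketch below uses one common Legendre-type construction, K(x, y) = prod_j sum_{k<=n} P_k(x_j) P_k(y_j) on [-1,1]^d, as an assumption, plugged into scikit-learn's SVC as a custom kernel. The function and parameter names (legendre_kernel, degree) are illustrative, not from the paper.

```python
# Minimal sketch of a Legendre-type polynomial kernel used with scikit-learn's SVC.
# The kernel form below is an assumption, not the paper's exact construction.
import numpy as np
from scipy.special import eval_legendre
from sklearn.svm import SVC

def legendre_kernel(X, Y, degree=5):
    """Gram matrix of a truncated Legendre product kernel (illustrative)."""
    ks = np.arange(degree + 1)
    P = np.stack([eval_legendre(k, X) for k in ks])   # shape (degree+1, n_X, d)
    Q = np.stack([eval_legendre(k, Y) for k in ks])   # shape (degree+1, n_Y, d)
    per_dim = np.einsum('kid,kjd->ijd', P, Q)         # sum over k per dimension
    return per_dim.prod(axis=-1)                      # product over dimensions

# toy usage: data must live in [-1, 1]^d, the domain where Legendre polynomials are orthogonal
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(int)               # non-linearly separable labels
clf = SVC(kernel=lambda A, B: legendre_kernel(A, B, degree=5)).fit(X, y)
print(clf.score(X, y))
```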
2. Learning rates of least-square regularized regression with polynomial kernels (Cited: 3)
Authors: LI BingZheng, WANG GuoMao. Science China Mathematics (SCIE), 2009, No. 4, pp. 687-700 (14 pages).
This paper presents learning rates for least-square regularized regression algorithms with polynomial kernels. The target is the error analysis of the regression problem in learning theory. A regularization scheme is given which yields sharp learning rates. The rates depend on the dimension of the polynomial space and on the polynomial reproducing kernel Hilbert space as measured by covering numbers. We also establish a direct approximation theorem by Bernstein-Durrmeyer operators in $L^2_{\rho_X}$ with a Borel probability measure.
Keywords: learning theory; reproducing kernel Hilbert space; polynomial kernel; regularization error; Bernstein-Durrmeyer operators; covering number; regularization scheme; 68T05; 62J02
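The paper's contribution is theoretical (learning rates via covering numbers). As context only, here is a minimal sketch of the underlying regularization scheme, least-square regression penalized in a polynomial-kernel RKHS, using scikit-learn's KernelRidge; the regularization strength alpha below is arbitrary, not the rate-optimal choice analyzed in the paper.

```python
# Minimal sketch of least-square regularized regression with a polynomial kernel
# (kernel ridge regression). The lambda/alpha value is arbitrary and for illustration only.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(300, 1))
y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(300)   # noisy regression target

# f_lambda = argmin (1/m) sum_i (f(x_i) - y_i)^2 + lambda * ||f||_K^2 in the polynomial RKHS
model = KernelRidge(kernel="polynomial", degree=5, alpha=1e-3)
model.fit(X, y)
print(model.score(X, y))   # in-sample R^2, just to show the scheme runs
```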
3. Kernel polynomial representation for imaginary-time Green's functions in continuous-time quantum Monte Carlo impurity solver (Cited: 1)
Author: HUANG Li (黄理). Chinese Physics B (SCIE, EI, CAS, CSCD), 2016, No. 11, pp. 418-423 (6 pages).
Inspired by the recently proposed Legendre orthogonal polynomial representation for imaginary-time Green's functions G(τ), we develop an alternative and superior representation for G(τ) and implement it in the hybridization-expansion continuous-time quantum Monte Carlo impurity solver. This representation is based on the kernel polynomial method, which introduces integral kernel functions to filter the numerical fluctuations caused by the explicit truncation of the polynomial expansion series and can improve the computational precision significantly. As an illustration of the new representation, we re-examine the imaginary-time Green's functions of the single-band Hubbard model in the framework of dynamical mean-field theory. The calculated results suggest that, with carefully chosen integral kernel functions, whether the system is metallic or insulating, the Gibbs oscillations found in the previous Legendre orthogonal polynomial representation are vastly suppressed and remarkable corrections to the measured Green's functions are obtained.
Keywords: polynomial; imaginary; kernel; Legendre; impurity solver; metallic; Gibbs; explicit; Hubbard
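The paper combines an orthogonal-polynomial expansion of G(τ) with kernel-polynomial-method damping. The sketch below demonstrates only the generic idea on a toy step function: Jackson kernel factors applied to a truncated Chebyshev series suppress the Gibbs oscillations that the undamped truncation produces. It is not the impurity-solver implementation, and the truncation order N is chosen arbitrarily.

```python
# Generic kernel-polynomial-method illustration (not the CT-QMC measurement of G(tau)):
# a truncated Chebyshev expansion of sign(x) shows Gibbs oscillations; multiplying the
# moments by Jackson kernel factors g_n damps them.
import numpy as np

N = 64                                          # number of Chebyshev moments
M = 4096                                        # Gauss-Chebyshev quadrature points
theta_k = np.pi * (np.arange(M) + 0.5) / M
f_k = np.sign(np.cos(theta_k))                  # test function f(x) = sign(x) at x_k = cos(theta_k)

# Chebyshev moments mu_n such that f(x) ~ sum_n mu_n T_n(x)
n = np.arange(N)
mu = (2.0 - (n == 0)) / M * (np.cos(np.outer(n, theta_k)) @ f_k)

# Jackson kernel damping factors
g = ((N - n + 1) * np.cos(np.pi * n / (N + 1))
     + np.sin(np.pi * n / (N + 1)) / np.tan(np.pi / (N + 1))) / (N + 1)

x = np.linspace(-0.999, 0.999, 2001)
T = np.cos(np.outer(n, np.arccos(x)))           # T_n(x) on a dense grid
raw = mu @ T                                    # truncated series: visible Gibbs overshoot
damped = (g * mu) @ T                           # Jackson-damped series: overshoot suppressed

print("overshoot without kernel:", np.abs(raw).max() - 1.0)
print("overshoot with Jackson  :", np.abs(damped).max() - 1.0)
```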
4. Ensemble kernel method: SVM classification based on game theory (Cited: 6)
Authors: Yufei Liu, Dechang Pi, Qiyou Cheng. Journal of Systems Engineering and Electronics (SCIE, EI, CSCD), 2016, No. 1, pp. 251-259 (9 pages).
With the development of the support vector machine (SVM), the kernel function has become one of the cores of SVM research. To a large extent, the kernel function determines the generalization ability of the classifier, but there is still no general theory to guide the choice and construction of the kernel function. An ensemble kernel function model based on game theory is proposed and used in the SVM classification algorithm. The model can effectively integrate the advantages of the local kernel and the global kernel to obtain a better classification result, and it provides a feasible way to construct kernel functions. Experiments on some standard datasets verify that the new method can significantly improve classification accuracy.
Keywords: game theory; classification; radial basis kernel; polynomial kernel
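The game-theoretic procedure that fixes the mixture is not reproduced here. The sketch below only shows the general idea of an ensemble kernel: a convex combination of a local (RBF) kernel and a global (polynomial) kernel passed to SVC as a callable; the weight w is a fixed illustrative value, not the one the paper derives.

```python
# Minimal sketch of an ensemble kernel: w * RBF (local) + (1 - w) * polynomial (global).
# Any convex combination of PSD kernels is PSD. The weight w here is illustrative only.
import numpy as np
from sklearn.datasets import make_moons
from sklearn.metrics.pairwise import rbf_kernel, polynomial_kernel
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def ensemble_kernel(X, Y, w=0.6, gamma=1.0, degree=3):
    """Convex combination of a local RBF kernel and a global polynomial kernel."""
    return w * rbf_kernel(X, Y, gamma=gamma) + (1.0 - w) * polynomial_kernel(X, Y, degree=degree)

X, y = make_moons(n_samples=300, noise=0.2, random_state=0)
clf = SVC(kernel=lambda A, B: ensemble_kernel(A, B, w=0.6))
print(cross_val_score(clf, X, y, cv=5).mean())
```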
5. Kernel Function-Based Primal-Dual Interior-Point Methods for Symmetric Cones Optimization
Authors: ZHAO Dequan, ZHANG Mingwang. Wuhan University Journal of Natural Sciences (CAS), 2014, No. 6, pp. 461-468 (8 pages).
In this paper, we present a large-update primal-dual interior-point method for symmetric cone optimization (SCO) based on a new kernel function, which determines both the search directions and the proximity measure between the iterate and the central path. The kernel function is neither a self-regular function nor the usual logarithmic kernel function. Moreover, by using Euclidean Jordan algebraic techniques, we achieve the favorable iteration complexity $O(\sqrt{r}\,(\log r)^{2}\log(r/\varepsilon))$, which is as good as the convex quadratic semi-definite optimization analogue.
Keywords: symmetric cone optimization; kernel function; interior-point method; polynomial complexity
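The abstract does not state the new kernel function itself, so the sketch below only illustrates the generic role a kernel function plays in this class of methods: ψ defines a barrier, Ψ(v) = Σ_i ψ(v_i) measures proximity to the central path, and −ψ′(v) drives the search direction. The classical logarithmic kernel used here is explicitly not the paper's kernel; it is a stand-in for illustration.

```python
# Generic illustration of a kernel-function-based proximity measure in a primal-dual
# interior-point method. The classical logarithmic kernel below is NOT the paper's
# new kernel function; it only shows how psi, Psi and the search direction relate.
import numpy as np

def psi(t):
    """Classical logarithmic kernel function (stand-in, not the paper's kernel)."""
    return (t**2 - 1.0) / 2.0 - np.log(t)

def dpsi(t):
    return t - 1.0 / t

def proximity(v):
    """Proximity Psi(v) of the scaled iterate v to the central path (v = e gives 0)."""
    return np.sum(psi(v))

v = np.array([0.8, 1.0, 1.3])        # scaled variable; the central path corresponds to v = e
print("Psi(v)          :", proximity(v))
print("search dir ~    :", -dpsi(v))  # componentwise direction induced by the barrier
```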
6. Over-Smoothing Algorithm and Its Application to GCN Semi-supervised Classification
Authors: Mingzhi Dai, Weibin Guo, Xiang Feng. 国际计算机前沿大会会议论文集 (International Computer Frontiers Conference proceedings), 2020, No. 2, pp. 197-215 (19 pages).
The feature information of the local graph structure and of the nodes may be over-smoothed by repeated encodings, which causes the node representations to converge to one or a few values. In other words, nodes from different clusters become difficult to distinguish, as two nodes of different classes that are topologically close are more likely to be assigned to the same class, and vice versa. To alleviate this problem, an over-smoothing algorithm is proposed, and a reweighting mechanism is applied to make the trade-off between the information representation of nodes and of their neighborhoods more reasonable. By improving several propagation models, including the Chebyshev polynomial kernel model and the Laplace linear 1st Chebyshev kernel model, a new model named RWGCN, based on different propagation kernels, is proposed. Experiments show that satisfactory results are achieved on semi-supervised classification tasks on graph data.
Keywords: GCN; Chebyshev polynomial kernel model; Laplace linear 1st Chebyshev kernel model; over-smoothing; reweighted mechanism
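The abstract does not detail the reweighting mechanism of RWGCN. The sketch below shows only the Chebyshev polynomial propagation kernel that such models build on, computing T_k(L̃)X via the Chebyshev recurrence on the scaled normalized Laplacian with plain numpy; function and variable names are illustrative.

```python
# Minimal sketch of Chebyshev polynomial propagation on a graph (ChebNet-style kernel).
# The RWGCN reweighting mechanism itself is not shown.
import numpy as np

def chebyshev_propagation(A, X, K=3):
    """Return [T_0(L~)X, ..., T_K(L~)X] for the scaled normalized Laplacian L~."""
    n = A.shape[0]
    d = A.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    L = np.eye(n) - D_inv_sqrt @ A @ D_inv_sqrt          # symmetric normalized Laplacian
    lam_max = np.linalg.eigvalsh(L).max()
    L_scaled = 2.0 * L / lam_max - np.eye(n)             # rescale spectrum into [-1, 1]

    Tx = [X, L_scaled @ X]                               # T_0 X and T_1 X
    for _ in range(2, K + 1):
        Tx.append(2.0 * L_scaled @ Tx[-1] - Tx[-2])      # T_k = 2 L~ T_{k-1} - T_{k-2}
    return Tx

# toy usage on a 4-node path graph with 2-dimensional node features
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.arange(8, dtype=float).reshape(4, 2)
for k, T in enumerate(chebyshev_propagation(A, X, K=2)):
    print(f"T_{k}(L~) X:\n{T}")
```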