Journal Articles
3 articles found
1. Application of Particle Swarm Optimization to Fault Condition Recognition Based on Kernel Principal Component Analysis (Cited by 1)
Authors: WEI Xiu-ye, PAN Hong-xia, HUANG Jin-ying, WANG Fu-jie. International Journal of Plant Engineering and Management, 2009, No. 3, pp. 129-135 (7 pages).
Abstract: Particle swarm optimization (PSO) is an optimization algorithm based on the swarm intelligence principle. In this paper a modified PSO is applied to kernel principal component analysis (KPCA) to find an optimal kernel function parameter. We first comprehensively consider the within-class scatter and between-class scatter of the sample features. A fitness function for the kernel function parameter is then constructed, and a particle swarm optimization algorithm with adaptive acceleration (CPSO) is applied to optimize it. The method is used for gearbox condition recognition, and the result is compared with recognition results based on principal component analysis (PCA). The results show that KPCA optimized by CPSO can effectively recognize fault conditions of the gearbox by avoiding blind setting of the kernel function parameter, and its fault-recognition results outperform those of PCA. We conclude that KPCA based on CPSO has an advantage in nonlinear feature extraction for mechanical faults and is helpful for condition recognition of complicated machines.
Keywords: particle swarm optimization; kernel principal component analysis; kernel function parameter; feature extraction; gearbox condition recognition
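The sketch below is a minimal illustration of the idea described in this abstract, not the authors' implementation: it assumes an RBF kernel whose width gamma is the parameter being tuned, uses a Fisher-style scatter ratio (between-class over within-class) as the fitness, and runs a plain PSO update rather than the paper's adaptive-acceleration CPSO variant. All names, bounds, and constants are illustrative assumptions.

```python
# Hedged sketch: standard PSO tuning the RBF kernel width of KernelPCA so that
# the projected features maximize between-class / within-class scatter.
import numpy as np
from sklearn.decomposition import KernelPCA

def scatter_ratio(Z, y):
    """Between-class scatter divided by within-class scatter of projected features Z."""
    mean_all = Z.mean(axis=0)
    s_b, s_w = 0.0, 0.0
    for c in np.unique(y):
        Zc = Z[y == c]
        mc = Zc.mean(axis=0)
        s_b += len(Zc) * np.sum((mc - mean_all) ** 2)
        s_w += np.sum((Zc - mc) ** 2)
    return s_b / (s_w + 1e-12)

def fitness(log_gamma, X, y, n_components=3):
    # Project with KPCA at the candidate kernel width and score the class separability.
    kpca = KernelPCA(n_components=n_components, kernel="rbf", gamma=10.0 ** log_gamma)
    return scatter_ratio(kpca.fit_transform(X), y)

def pso_optimize(X, y, n_particles=20, n_iter=50, bounds=(-4.0, 2.0)):
    rng = np.random.default_rng(0)
    pos = rng.uniform(*bounds, n_particles)            # particles search log10(gamma)
    vel = np.zeros(n_particles)
    pbest = pos.copy()
    pbest_fit = np.array([fitness(p, X, y) for p in pos])
    gbest = pbest[pbest_fit.argmax()]
    w, c1, c2 = 0.7, 1.5, 1.5                          # fixed inertia and acceleration weights
    for _ in range(n_iter):
        r1, r2 = rng.random(n_particles), rng.random(n_particles)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, *bounds)
        fit = np.array([fitness(p, X, y) for p in pos])
        better = fit > pbest_fit
        pbest[better], pbest_fit[better] = pos[better], fit[better]
        gbest = pbest[pbest_fit.argmax()]
    return 10.0 ** gbest                               # optimized RBF gamma
```

In this simplified form the only design choice carried over from the abstract is the fitness: the kernel parameter is judged purely by how well the KPCA projection separates the known condition classes.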
2. Extremal optimization for optimizing kernel function and its parameters in support vector regression (Cited by 1)
Authors: Peng CHEN, Yong-zai LU. Journal of Zhejiang University-Science C (Computers and Electronics) (indexed in SCIE, EI), 2011, No. 4, pp. 297-306 (10 pages).
Abstract: The performance of the support vector regression (SVR) model is sensitive to the kernel type and its parameters. The determination of an appropriate kernel type and the associated parameters for SVR is a challenging research topic in the field of support vector learning. In this study, we present a novel method for simultaneous optimization of the SVR kernel function and its parameters, formulated as a mixed integer optimization problem and solved using the recently proposed heuristic 'extremal optimization (EO)'. We present the problem formulation for the optimization of the SVR kernel and parameters, the EO-SVR algorithm, and experimental tests with five benchmark regression problems. Comparisons with other traditional approaches show that the proposed EO-SVR method provides better generalization performance by successfully identifying the optimal SVR kernel function and its parameters.
Keywords: support vector regression (SVR); extremal optimization (EO); parameter optimization; kernel function optimization
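A much-simplified, hedged sketch of the idea behind this entry follows: the kernel type is coded as an integer and the parameters (C, gamma, epsilon) as continuous values, and an EO-flavoured mutation loop re-samples one component at a time while tracking the best cross-validated configuration. The component-fitness ranking used by the actual τ-EO heuristic is omitted, and all search ranges and names are assumptions, not the paper's EO-SVR settings.

```python
# Hedged sketch: EO-flavoured mixed-integer search over SVR kernel type and parameters.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

KERNELS = ["rbf", "poly", "sigmoid"]                   # integer-coded kernel choice

def evaluate(conf, X, y):
    kernel, log_c, log_gamma, log_eps = conf
    model = SVR(kernel=KERNELS[int(kernel)], C=10.0 ** log_c,
                gamma=10.0 ** log_gamma, epsilon=10.0 ** log_eps)
    # Negative mean squared error, so larger is better.
    return cross_val_score(model, X, y, cv=5,
                           scoring="neg_mean_squared_error").mean()

def eo_svr_search(X, y, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    low = np.array([0.0, -2.0, -4.0, -3.0])            # kernel idx, log10 C, log10 gamma, log10 eps
    high = np.array([len(KERNELS) - 1e-9, 3.0, 1.0, 0.0])
    conf = rng.uniform(low, high)
    best_conf, best_score = conf.copy(), evaluate(conf, X, y)
    for _ in range(n_iter):
        i = rng.integers(len(conf))                    # mutate one component unconditionally (EO style)
        conf[i] = rng.uniform(low[i], high[i])
        score = evaluate(conf, X, y)
        if score > best_score:                         # EO keeps only the best-so-far solution
            best_conf, best_score = conf.copy(), score
    kernel, log_c, log_gamma, log_eps = best_conf
    return dict(kernel=KERNELS[int(kernel)], C=10.0 ** log_c,
                gamma=10.0 ** log_gamma, epsilon=10.0 ** log_eps)
```

The key point illustrated is the mixed-integer encoding: a single solution vector carries both the discrete kernel choice and the continuous parameters, so one search procedure can optimize them simultaneously.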
3. Kernel Function-Based Primal-Dual Interior-Point Methods for Symmetric Cones Optimization
Authors: ZHAO Dequan, ZHANG Mingwang. Wuhan University Journal of Natural Sciences (indexed in CAS), 2014, No. 6, pp. 461-468 (8 pages).
Abstract: In this paper, we present a large-update primal-dual interior-point method for symmetric cone optimization (SCO) based on a new kernel function, which determines both the search directions and the proximity measure between the iterate and the central path. The kernel function is neither a self-regular function nor the usual logarithmic kernel function. Moreover, by using Euclidean Jordan algebraic techniques, we achieve the favorable iteration complexity O(√r (log r)^2 log(r/ε)), where r is the rank of the associated Euclidean Jordan algebra, which is as good as the convex quadratic semi-definite optimization analogue.
Keywords: symmetric cone optimization; kernel function; interior-point method; polynomial complexity
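For orientation, the generic kernel-function framework that such methods build on can be written as follows; the paper's specific new kernel function ψ is not reproduced here, so the display is illustrative only and the notation is an assumption consistent with the abstract.

```latex
% Generic kernel-function barrier and proximity measure for a symmetric cone of rank r
% (illustrative framework only; the paper's particular psi is not reproduced).
\[
  \Psi(v) \;=\; \sum_{i=1}^{r} \psi\bigl(\lambda_i(v)\bigr),
  \qquad
  \delta(v) \;=\; \tfrac{1}{2}\,\bigl\|\nabla \Psi(v)\bigr\|_F ,
\]
where $v$ is the scaled variable in the associated Euclidean Jordan algebra,
$\lambda_i(v)$ are its eigenvalues, and $\psi$ is a kernel function with
$\psi(1) = \psi'(1) = 0$. The search direction comes from the Newton system with
right-hand side $-\nabla \Psi(v)$, and the large-update analysis yields the bound
\[
  O\!\bigl(\sqrt{r}\,(\log r)^2 \log(r/\varepsilon)\bigr).
\]
```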