Journal Articles
490 articles found.
1. Improved adaptive pruning algorithm for least squares support vector regression (cited 4 times)
Authors: Runpeng Gao, Ye San. Journal of Systems Engineering and Electronics (SCIE, EI, CSCD), 2012, Issue 3, pp. 438-444.
As the solutions of the least squares support vector regression machine (LS-SVRM) are not sparse, prediction is slow and applications are limited. The defects of the existing adaptive pruning algorithm for LS-SVRM are that training is slow and the generalization performance is not satisfactory, especially for large-scale problems. Hence an improved algorithm is proposed. To accelerate training, the pruned data point and a fast leave-one-out error are employed to validate the temporary model obtained after decremental learning. A novel objective function in the termination condition, which involves the whole set of constraints generated by all training data points, and three pruning strategies are employed to improve the generalization performance. The effectiveness of the proposed algorithm is tested on six benchmark datasets. The sparse LS-SVRM model has a faster training speed and better generalization performance.
Keywords: least squares support vector regression machine (LS-SVRM); pruning; leave-one-out (LOO) error; incremental learning; decremental learning
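For readers unfamiliar with why LS-SVRM solutions are dense, the Python/NumPy sketch below solves the standard LS-SVR dual linear system and shows that essentially every training point receives a nonzero coefficient, which is the sparseness problem that pruning algorithms attack. The toy data and the parameter values γ and σ² are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def rbf_kernel(A, B, sigma2=1.0):
    # Pairwise squared distances -> Gaussian (RBF) kernel matrix.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma2))

def lssvr_fit(X, y, gamma=10.0, sigma2=1.0):
    """Solve the LS-SVR dual system [[0, 1^T], [1, K + I/gamma]] [b; a] = [0; y]."""
    n = X.shape[0]
    K = rbf_kernel(X, X, sigma2)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[1:], sol[0]          # alpha, b

def lssvr_predict(X_train, alpha, b, X_new, sigma2=1.0):
    return rbf_kernel(X_new, X_train, sigma2) @ alpha + b

# Toy data: note that practically every alpha is nonzero, i.e. no sparseness.
rng = np.random.default_rng(0)
X = rng.random((50, 1))
y = np.sin(2 * np.pi * X[:, 0]) + 0.1 * rng.standard_normal(50)
alpha, b = lssvr_fit(X, y)
print("fraction of nonzero coefficients:", np.mean(np.abs(alpha) > 1e-8))
```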
2. Flatness intelligent control via improved least squares support vector regression algorithm (cited 2 times)
Authors: 张秀玲, 张少宇, 赵文保, 徐腾. Journal of Central South University (SCIE, EI, CAS), 2013, Issue 3, pp. 688-695.
To overcome the disadvantage that the standard least squares support vector regression (LS-SVR) algorithm is not directly suitable for multiple-input multiple-output (MIMO) system modelling, an improved LS-SVR algorithm, defined as multi-output least squares support vector regression (MLSSVR), was put forward by adding the samples' absolute errors to the objective function, and was applied to flatness intelligent control. To solve the poor precision of the effective-matrix-based scheme in flatness control, predictive control was introduced into the control system, and an effective matrix-predictive flatness control method was proposed by combining the merits of the two methods. A simulation experiment was conducted on a 900HC reversible cold rolling mill. The performance of the effective matrix method and the effective matrix-predictive control method were compared, and the results demonstrate the validity of the effective matrix-predictive control method.
Keywords: least squares support vector regression; multi-output least squares support vector regression; flatness; effective matrix; predictive control
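The MLSSVR formulation in the paper adds absolute-error terms to the objective; as a hedged illustration of plain MIMO kernel regression (not the paper's method), the sketch below fits one shared RBF kernel model to a multi-column target using scikit-learn's KernelRidge, the closest off-the-shelf analogue of LS-SVR. The data shapes and parameter values are invented for the example.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(1)
# Hypothetical MIMO data: 4 measured flatness deviations -> 3 actuator adjustments.
X = rng.random((200, 4))
Y = np.sin(X @ rng.random((4, 3))) + 0.05 * rng.standard_normal((200, 3))

# KernelRidge accepts a 2-D target, so all outputs share one kernel model --
# one simple way to treat a MIMO system with a single regressor.
model = KernelRidge(kernel="rbf", alpha=1e-2, gamma=1.0).fit(X, Y)
print(model.predict(X[:2]).shape)   # (2, 3): one prediction per output channel
```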
3. Fault diagnosis of power-shift steering transmission based on multiple outputs least squares support vector regression (cited 2 times)
Authors: 张英锋, 马彪, 房京, 张海岭, 范昱珩. Journal of Beijing Institute of Technology (EI, CAS), 2011, Issue 2, pp. 199-204.
A method of multiple-output least squares support vector regression (LS-SVR) was developed and described in detail, with the radial basis function (RBF) as the kernel function. The method was applied to predict the future state of the power-shift steering transmission (PSST). A prediction model of the PSST was obtained with multiple-output LS-SVR. The model performance was greatly influenced by the penalty parameter γ and the kernel parameter σ², which were optimized by cross-validation. The model was trained and tested with spectrometric oil analysis data. The predicted and actual values were compared, and a fault in the second PSST was found. The research proved that this method has good accuracy in PSST fault prediction, and possible problems in the PSST can be found through comparative analysis.
Keywords: least squares support vector regression (LS-SVR); fault diagnosis; power-shift steering transmission (PSST)
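The abstract notes that the penalty parameter γ and kernel parameter σ² were tuned by cross-validation. A minimal sketch of that tuning step follows, using scikit-learn's KernelRidge as a stand-in for LS-SVR (its alpha plays the role of 1/γ and its gamma the role of 1/(2σ²)); the grid values and the toy "spectrometric" data are assumptions made for illustration.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(2)
# Toy "spectrometric oil analysis" features: wear-metal concentrations per sample.
X = rng.random((120, 6))
y = X @ rng.random(6) + 0.05 * rng.standard_normal(120)

# alpha ~ 1/gamma (penalty) and gamma ~ 1/(2*sigma^2) (RBF width) in LS-SVR terms.
grid = GridSearchCV(
    KernelRidge(kernel="rbf"),
    param_grid={"alpha": [1e-3, 1e-2, 1e-1, 1.0],
                "gamma": [0.1, 1.0, 10.0]},
    cv=5,
    scoring="neg_mean_squared_error",
)
grid.fit(X, y)
print("selected hyper-parameters:", grid.best_params_)
```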
4. Improved Scheme for Fast Approximation to Least Squares Support Vector Regression
Authors: 张宇宸, 赵永平, 宋成俊, 侯宽新, 脱金奎, 叶小军. Transactions of Nanjing University of Aeronautics and Astronautics (EI), 2014, Issue 4, pp. 413-419.
The solution of normal least squares support vector regression (LSSVR) lacks sparseness, which limits real-time use and hampers wide application to a certain degree. To overcome this obstacle, a scheme named I2FSA-LSSVR is proposed. Compared with previous approximate algorithms, it not only adopts the partial reduction strategy but also considers the influence between the previously selected support vectors and the support vector to be selected when computing the support weights. As a result, I2FSA-LSSVR reduces the number of support vectors and enhances real-time performance. To confirm the feasibility and effectiveness of the proposed algorithm, experiments on benchmark data sets are conducted, whose results support the presented I2FSA-LSSVR.
Keywords: support vector regression; kernel method; least squares; sparseness
5. Improved scheme to accelerate sparse least squares support vector regression
Authors: Yongping Zhao, Jianguo Sun. Journal of Systems Engineering and Electronics (SCIE, EI, CSCD), 2010, Issue 2, pp. 312-317.
Pruning algorithms for sparse least squares support vector regression machines are common and easily comprehensible, but the computational burden in the training phase is heavy because of the retraining performed during the pruning process, which is unfavorable for their applications. To this end, an improved scheme is proposed to accelerate sparse least squares support vector regression machines. A major advantage of this new scheme is its iterative methodology, which reuses the previous training results instead of retraining, and its feasibility is strictly verified theoretically. Finally, experiments on benchmark data sets corroborate a significant saving of training time with the same number of support vectors and the same predictive accuracy compared with the original pruning algorithms, and this speedup scheme is also extended to classification problems.
Keywords: least squares support vector regression machine; pruning algorithm; iterative methodology; classification
6. A sparse algorithm for adaptive pruning least square support vector regression machine based on global representative point ranking (cited 2 times)
Authors: HU Lei, YI Guoxing, HUANG Chao. Journal of Systems Engineering and Electronics (SCIE, EI, CSCD), 2021, Issue 1, pp. 151-162.
Least square support vector regression (LSSVR) is a method for function approximation whose solutions are typically non-sparse, which limits its application, especially where fast prediction is required. In this paper, a sparse algorithm for adaptive pruning of LSSVR based on global representative point ranking (GRPR-AP-LSSVR) is proposed. First, the global representative point ranking (GRPR) algorithm is given, and a data analysis experiment is carried out that depicts the importance ranking of data points. Furthermore, a pruning strategy of removing two samples in each decremental learning step is designed to accelerate training and ensure sparsity. The removed data points are used to test the temporary learning model, which safeguards the regression accuracy. Finally, the proposed algorithm is verified on artificial datasets and UCI regression datasets, and the experimental results indicate that, compared with several benchmark algorithms, the GRPR-AP-LSSVR algorithm has excellent sparsity and prediction speed without impairing the generalization performance.
Keywords: least square support vector regression (LSSVR); global representative point ranking (GRPR); initial training dataset; pruning strategy; sparsity; regression accuracy
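To make the decremental procedure concrete, the sketch below prunes two samples per step and validates the temporary model on the points pruned so far, stopping when their error exceeds a tolerance. It ranks points by the magnitude of their dual coefficients rather than by the paper's GRPR score, and uses scikit-learn's KernelRidge in place of LSSVR; the tolerance, kernel settings and toy data are all illustrative assumptions.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(3)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(200)

fit = lambda idx: KernelRidge(kernel="rbf", alpha=1e-2, gamma=1.0).fit(X[idx], y[idx])
keep = np.arange(len(X))            # indices of the current support vectors
removed = np.empty(0, dtype=int)    # pruned points, reused as a test set
model, tol = fit(keep), 0.01        # tol: allowed error on the pruned points

# Decremental loop: drop the two least influential points per step (smallest
# |dual coefficient| here), retrain, and validate the temporary model on every
# point pruned so far; stop as soon as that error exceeds the tolerance.
while len(keep) > 10:
    order = np.argsort(np.abs(model.dual_coef_))
    drop, candidate = keep[order[:2]], np.delete(keep, order[:2])
    trial = fit(candidate)
    test = np.concatenate([removed, drop])
    if mean_squared_error(y[test], trial.predict(X[test])) > tol:
        break
    keep, removed, model = candidate, test, trial

print(f"kept {len(keep)} of {len(X)} samples as support vectors")
```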
7. Application of Least Squares Support Vector Machine for Regression to Reliability Analysis (cited 18 times)
Authors: 郭秩维, 白广忱. Chinese Journal of Aeronautics (SCIE, EI, CAS, CSCD), 2009, Issue 2, pp. 160-166.
In order to deal with the issue of huge computational cost in direct numerical simulation, the traditional response surface method (RSM), a classical regression algorithm, is used to approximate the functional relationship between the state variable and the basic variables in reliability design. The algorithm has successfully treated some problems of implicit performance functions in reliability analysis. However, its theoretical basis of empirical risk minimization narrows its range of applications for...
Keywords: mechanism design of spacecraft; support vector machine for regression; least squares support vector machine for regression; Monte Carlo method; reliability; implicit performance function
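The abstract is truncated, but together with the keywords (implicit performance function, Monte Carlo method) it describes surrogate-based reliability analysis: fit a regression model to a small number of expensive evaluations of the performance function, then run cheap Monte Carlo simulation on the surrogate. A hedged sketch of that general workflow follows, with an invented analytic g(x) standing in for the implicit performance function and KernelRidge standing in for LS-SVM regression.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(4)

def g(x):
    # Hypothetical performance function; in practice this would be an expensive,
    # implicit finite-element evaluation. Failure is defined as g(x) < 0.
    return 3.0 - x[:, 0] ** 2 - 0.5 * x[:, 1]

# 1) Small design of experiments on the basic random variables.
X_doe = rng.normal(0.0, 1.0, size=(80, 2))
y_doe = g(X_doe)

# 2) Regression surrogate (kernel ridge as a stand-in for LS-SVM regression).
surrogate = KernelRidge(kernel="rbf", alpha=1e-3, gamma=0.5).fit(X_doe, y_doe)

# 3) Cheap Monte Carlo on the surrogate to estimate the failure probability.
X_mc = rng.normal(0.0, 1.0, size=(100_000, 2))
p_fail = np.mean(surrogate.predict(X_mc) < 0.0)
print(f"estimated failure probability: {p_fail:.4f}")
```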
8. Primal least squares twin support vector regression (cited 5 times)
Authors: Hua-juan HUANG, Shi-fei DING, Zhong-zhi SHI. Journal of Zhejiang University-Science C (Computers and Electronics) (SCIE, EI), 2013, Issue 9, pp. 722-732.
The training algorithm of classical twin support vector regression (TSVR) can be attributed to the solution of a pair of quadratic programming problems (QPPs) with inequality constraints in the dual space. However, this solution is affected by time and memory constraints when dealing with large datasets. In this paper, we present a least squares version of TSVR in the primal space, termed primal least squares TSVR (PLSTSVR). By introducing the least squares method, the inequality constraints of TSVR are transformed into equality constraints. Furthermore, we attempt to directly solve the two QPPs with equality constraints in the primal space instead of the dual space; thus, we need only to solve two systems of linear equations instead of two QPPs. Experimental results on artificial and benchmark datasets show that PLSTSVR has accuracy comparable to TSVR but with considerably less computational time. We further investigate its validity in predicting the opening price of stock.
Keywords: twin support vector regression; least squares method; primal space; stock prediction
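The computational point of PLSTSVR is that, after the least squares reformulation, each bound function comes from a single linear system rather than a QPP. The sketch below is a deliberately simplified linear version of that idea only: it fits down- and up-bound functions as two regularized least-squares systems and averages them, omitting the asymmetric slack penalties of the actual TSVR/PLSTSVR objective; the ε and regularization values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.uniform(0, 4, size=(150, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(150)

G = np.hstack([X, np.ones((len(X), 1))])   # augmented design matrix [A, e]
eps1, eps2, reg = 0.1, 0.1, 1e-3           # illustrative epsilon/regularization values
I = np.eye(G.shape[1])

# Down-bound f1 targets y - eps1, up-bound f2 targets y + eps2; each weight
# vector comes from one regularized linear system -- no quadratic program.
u1 = np.linalg.solve(G.T @ G + reg * I, G.T @ (y - eps1))
u2 = np.linalg.solve(G.T @ G + reg * I, G.T @ (y + eps2))

y_hat = 0.5 * (G @ u1 + G @ u2)            # final regressor: mean of the two bounds
print(f"training RMSE: {np.sqrt(np.mean((y - y_hat) ** 2)):.3f}")
```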
9. A Novel Method for Flatness Pattern Recognition via Least Squares Support Vector Regression (cited 12 times)
Authors: ZHANG Xiu-ling, ZHANG Shao-yu, TAN Guang-zhong, ZHAO Wen-bao (Key Laboratory of Industrial Computer Control Engineering of Hebei Province, National Engineering Research Center for Equipment and Technology of Cold Strip Rolling, Yanshan University, Qinhuangdao 066004, Hebei, China). Journal of Iron and Steel Research International (SCIE, EI, CAS, CSCD), 2012, Issue 3, pp. 25-30.
To adapt to the new requirements of developing flatness control theory and technology, cubic patterns were introduced on the basis of the traditional linear, quadratic and quartic flatness basic patterns. Linear, quadratic, cubic and quartic Legendre orthogonal polynomials were adopted to express the flatness basic patterns. In order to overcome the defects of existing recognition methods based on fuzzy, neural network and support vector regression (SVR) theory, a novel flatness pattern recognition method based on least squares support vector regression (LS-SVR) was proposed. On this basis, to determine the hyper-parameters of LS-SVR effectively and to enhance the recognition accuracy and generalization performance of the model, a particle swarm optimization algorithm with the leave-one-out (LOO) error as the fitness function was adopted. To overcome the high computational complexity of the naive cross-validation algorithm, a fast cross-validation algorithm was introduced to calculate the LOO error of LS-SVR. Results of experiments on theoretically calculated flatness data and on flatness signals measured on a 900HC cold rolling mill demonstrate that the proposed approach can distinguish the types and determine the magnitudes of flatness defects effectively, with high accuracy, high speed and strong generalization ability.
Keywords: flatness; pattern recognition; least squares support vector regression; cross-validation
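The recognition step in the paper is done by PSO-tuned LS-SVR, but the representation step — expressing a measured flatness profile in linear, quadratic, cubic and quartic Legendre basic patterns — can be illustrated directly with NumPy's Legendre utilities; the synthetic profile and its coefficients below are invented for the example.

```python
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(6)
x = np.linspace(-1.0, 1.0, 41)             # normalized strip-width coordinate

# Synthetic "measured" profile: linear + quadratic + quartic patterns plus noise.
profile = (3.0 * legendre.legval(x, [0, 1])
           - 2.0 * legendre.legval(x, [0, 0, 1])
           + 0.5 * legendre.legval(x, [0, 0, 0, 0, 1])
           + 0.05 * rng.standard_normal(x.size))

# Least-squares Legendre fit up to degree 4; coefficients 1..4 are the magnitudes
# of the linear, quadratic, cubic and quartic basic flatness patterns.
coeffs = legendre.legfit(x, profile, deg=4)
for name, c in zip(["mean", "linear", "quadratic", "cubic", "quartic"], coeffs):
    print(f"{name:9s}: {c:+.3f}")
```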
10. Short Term Electric Load Prediction by Incorporation of Kernel into Features Extraction Regression Technique
Authors: Ruaa Mohamed-Rashad Ghandour, Jun Li. Smart Grid and Renewable Energy, 2017, Issue 1, pp. 31-45.
Accurate load prediction plays an important role in smart power management systems, whether for planning, meeting increasing load demand, maintenance, or power distribution. To achieve reasonable predictions, the authors applied and compared two feature extraction techniques, kernel partial least squares regression and kernel principal component regression, both carried out with polynomial and Gaussian kernels that map the original features into a high-dimensional feature space and then derive new predictor variables known as scores and loadings. Kernel principal component regression constructs the new predictor variables without any consideration of the response vector, whereas kernel partial least squares regression does take the response vector into account. The models were simulated on electric load data from three different cities, using historical load data together with weekends and holidays as common predictor features for all models; temperature was used for only one dataset, as a comparative study to measure its effect. The model results, evaluated by three statistical measures, show that Gaussian kernel partial least squares regression offers the more powerful features and can significantly improve load prediction performance over the other presented models.
Keywords: short-term load prediction; support vector regression (SVR); kernel principal component regression (KPCR); kernel partial least squares regression (KPLSR)
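Of the two feature-extraction schemes compared, kernel principal component regression is straightforward to sketch with scikit-learn: project the inputs with KernelPCA (which ignores the response), then regress the load on the scores. Kernel PLS, which does use the response when building the scores, has no direct scikit-learn equivalent and is omitted here. The toy load data, component count and kernel width are assumptions.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(7)
# Toy load-prediction features: lagged load, weekday/weekend flag, holiday flag, etc.
X = rng.random((500, 5))
y = 100 + 30 * np.sin(2 * np.pi * X[:, 0]) + 10 * X[:, 1] + rng.standard_normal(500)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# Kernel PCR: map inputs to an RBF feature space, keep the leading principal
# components (computed without looking at y), then regress the load on the scores.
kpcr = make_pipeline(KernelPCA(n_components=8, kernel="rbf", gamma=1.0),
                     LinearRegression())
kpcr.fit(X_tr, y_tr)
print(f"test R^2: {kpcr.score(X_te, y_te):.3f}")
```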
11. Prediction of rock fragmentation in open-pit mine blasting based on LS-SVR (cited 12 times)
Authors: 史秀志, 王洋, 黄丹, 史采星. 《爆破》 (Blasting), CSCD, Chinese core journal, 2016, Issue 3, pp. 36-40.
To accurately predict rock fragmentation from open-pit mine blasting under small-sample conditions and to obtain an effective prediction method for such conditions, a least squares support vector regression (LS-SVR) prediction model was built with the LS-SVMlab toolbox and its parameters were properly optimized. Fifteen sets of open-pit blasting data and 35 sets were used as the small-sample and normal-sample cases, respectively, to check the prediction accuracy of the model. The results show that, for both sample sizes, the LS-SVR model predicts with higher accuracy than artificial neural network (ANN) regression on the same samples, indicating that the proposed LS-SVR model is suitable for predicting blasting fragmentation in open-pit mines and is particularly advantageous under small-sample conditions.
Keywords: support vector machine; least squares support vector regression; LS-SVMlab; rock fragmentation; small-sample prediction
12. Uncalibrated 4-DOF visual positioning of robots in space based on LS-SVR (cited 7 times)
Authors: 辛菁, 刘丁, 徐庆坤. 《控制理论与应用》 (Control Theory & Applications), EI, CAS, CSCD, Chinese core journal, 2010, Issue 1, pp. 77-85.
Uncalibrated robot visual servoing based on intelligent algorithms is studied, and a new uncalibrated visual immune control method for robots based on least squares support vector regression (LS-SVR) is proposed. LS-SVR is used to learn the complex nonlinear relationship between changes in the robot pose and the observed changes in image features, with the LS-SVR parameters determined by an adaptive immune algorithm combined with 5-fold cross-validation. On this basis, a visual controller is designed using immune control principles. Experiments on spatial 4-DOF visual positioning with a six-degree-of-freedom industrial robot demonstrate the effectiveness of the method.
Keywords: uncalibrated; visual positioning; least squares support vector regression; immune control
13. A WSN localization method based on LS-SVR with feature importance (cited 5 times)
Authors: 刘桂雄, 周松斌, 张晓平, 洪晓斌. 《华南理工大学学报(自然科学版)》 (Journal of South China University of Technology, Natural Science Edition), EI, CAS, CSCD, Chinese core journal, 2008, Issue 10, pp. 102-107.
To address the insufficient localization accuracy of wireless sensor network (WSN) node localization methods caused by large ranging errors when coarse ranging techniques are used, a localization method based on least squares support vector regression (LS-SVR) with feature importance is proposed. The method takes the distances from an unknown node to the anchor nodes as features and performs feature extraction according to feature importance. A training sample set is obtained by grid sampling of the detection area, and a localization model is learned with LS-SVR. In the localization stage, the feature vector of an unknown node is fed into the model, and the good generalization ability of LS-SVR is exploited to localize the unknown node accurately. Localization experiments with 100 nodes, both uniformly distributed and randomly distributed in a C-shaped region, show that the proposed method effectively reduces the influence of ranging error on localization accuracy and decreases the mean localization error. Compared with the DV-Hop method using the same ranging technique, the mean localization error of the proposed method is reduced by 7.5%-14.0% for the uniform distribution and by a substantial 36.5%-55.2% for the random distribution in the C-shaped region.
Keywords: feature extraction; least squares support vector regression; wireless sensor network; localization
14. Innovation-based weighted LS-SVR prediction of multi-parameter chaotic time series (cited 5 times)
Authors: 郭阳明, 翟正军, 姜红梅. 《西北工业大学学报》 (Journal of Northwestern Polytechnical University), EI, CAS, CSCD, Chinese core journal, 2009, Issue 1, pp. 83-87.
Complex systems often rely on multi-parameter chaotic time series obtained from observations for predictive analysis. Borrowing the ideas of single-parameter chaotic time series prediction, this paper considers the information in all relevant parameter time series to reconstruct the phase space of a multi-parameter chaotic time series. Based on the innovation-priority principle and support vector machine theory, and on the evolution law of chaotic time series, two adaptive least squares support vector regression prediction models are built after phase-space reconstruction, one from many long-term samples and one from a few recent samples, and their outputs are combined by weighting. A chaos-based optimization method for the model parameters is given, with minimization of the prediction root-mean-square error as the objective function. A prediction experiment on simulated chaotic time series of three related parameters of a wear fault in an aircraft rotor component shows that the method achieves good prediction accuracy and is an effective prediction method.
Keywords: support vector machine; multi-parameter; chaotic time series; least squares support vector regression; weighted prediction
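A simplified sketch of the two-model weighting idea follows: one kernel regressor trained on the long history, one on the recent window, combined with a weight chosen to minimise hold-out RMSE. The paper embeds multiple parameters and tunes the weight by chaotic optimisation; here a single synthetic series, KernelRidge models and a grid search over the weight are used purely for illustration.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(8)
t = np.arange(600)
series = np.sin(0.07 * t) + 0.3 * np.sin(0.31 * t) + 0.05 * rng.standard_normal(t.size)

def embed(s, dim=6):
    """Delay embedding: predict s[i] from the previous `dim` values."""
    X = np.array([s[i - dim:i] for i in range(dim, len(s))])
    return X, s[dim:]

X, y = embed(series)
split = len(X) - 50                          # hold out the last 50 points
X_tr, y_tr, X_val, y_val = X[:split], y[:split], X[split:], y[split:]

long_model = KernelRidge(kernel="rbf", alpha=1e-2, gamma=1.0).fit(X_tr, y_tr)
recent_model = KernelRidge(kernel="rbf", alpha=1e-2, gamma=1.0).fit(X_tr[-80:], y_tr[-80:])

# Choose the combination weight that minimizes RMSE on the hold-out window.
weights = np.linspace(0.0, 1.0, 21)
rmse = [np.sqrt(np.mean((w * long_model.predict(X_val)
                         + (1 - w) * recent_model.predict(X_val) - y_val) ** 2))
        for w in weights]
best = weights[int(np.argmin(rmse))]
print(f"best long-term weight: {best:.2f}, hold-out RMSE: {min(rmse):.4f}")
```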
15. Hyperspectral monitoring model of phosphorus content in citrus leaves based on wavelet transform and LS-SVR (cited 2 times)
Authors: 黄双萍, 岳学军, 洪添胜, 蔡坤, 林诗伦. 《广东农业科学》 (Guangdong Agricultural Sciences), CAS, CSCD, Chinese core journal, 2013, Issue 13, pp. 37-40.
Rapid, accurate and non-destructive monitoring of phosphorus (P) content in citrus is of great significance for the precise spraying and dynamic management of phosphate fertilizer on citrus trees, and the rapid development of hyperspectral technology makes such monitoring possible. Taking 117 orchard-grown Luogang oranges as the experimental objects, 234 samples were collected at two development stages, the fruit-swelling/shoot-promoting stage and the harvest stage. The hyperspectral reflectance data form the multivariate feature vector of each sample, and the phosphorus content measured by sulfuric acid-hydrogen peroxide digestion with molybdenum-antimony colorimetry serves as the sample label. After wavelet denoising of the hyperspectral reflectance data, an LS-SVR model for monitoring the phosphorus content of citrus leaves was built. Evaluated on the validation and calibration sets, the model achieved coefficients of determination of 0.907 and 0.953, mean squared errors of 0.004 and 0.002, and mean relative errors of 2.76% and 1.77%, respectively. The results show that hyperspectral monitoring of phosphorus content in citrus leaves is feasible.
Keywords: citrus leaves; phosphorus content; hyperspectral; wavelet denoising; least squares support vector regression
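The modelling pipeline described — wavelet-denoise each reflectance spectrum, then regress the phosphorus content — can be sketched as below, assuming PyWavelets for the denoising step and KernelRidge as the LS-SVR stand-in; the synthetic spectra, the fake label, the db4 wavelet, the threshold rule and all parameter values are assumptions.

```python
import numpy as np
import pywt
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(9)
n_samples, n_bands = 120, 128
# Synthetic leaf reflectance spectra (smooth curves) plus measurement noise.
spectra = np.cumsum(0.02 * rng.standard_normal((n_samples, n_bands)), axis=1) + 0.5
spectra += 0.01 * rng.standard_normal(spectra.shape)
p_content = 2.0 * spectra[:, 40] + 0.1 * rng.standard_normal(n_samples)  # fake label

def wavelet_denoise(row, wavelet="db4", level=3):
    """Soft-threshold the detail coefficients, then reconstruct the spectrum."""
    coeffs = pywt.wavedec(row, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise scale estimate
    thr = sigma * np.sqrt(2.0 * np.log(len(row)))         # universal threshold
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(row)]

X = np.array([wavelet_denoise(s) for s in spectra])
model = KernelRidge(kernel="rbf", alpha=1e-2, gamma=0.5).fit(X, p_content)
print(f"calibration R^2: {model.score(X, p_content):.3f}")
```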
16. Research on an image noise removal algorithm based on LS-SVR (cited 3 times)
Authors: 于忠党, 王龙山. 《自动化学报》 (Acta Automatica Sinica), EI, CSCD, Chinese core journal, 2009, Issue 4, pp. 364-370.
Through an analysis of the filtering characteristics of least squares support vector regression (LS-SVR), a method for constructing LS-SVR convolution templates for image filtering is given, which removes the need to solve the LS-SVR problem during application. On this basis, a switching salt-and-pepper noise filtering algorithm based on LS-SVR is proposed. The algorithm uses a maximum-minimum operator as the salt-and-pepper noise detector, takes the non-noise points within the filtering window as the LS-SVR input data, and applies the pre-constructed LS-SVR filtering operator to the window with a simple convolution, thereby effectively restoring the data of pixels corrupted by salt-and-pepper noise. Experiments show that the proposed method preserves detail well and has strong noise removal capability.
Keywords: image filtering; least squares support vector machine; switching filter; convolution operator
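The switching structure of the filter — an extreme-value (maximum-minimum) detector, with only the flagged pixels restored from the clean pixels in their window — is sketched below. The restoration here is a plain mean of the uncorrupted neighbours rather than the paper's precomputed LS-SVR convolution template; the gradient test image and the 10% noise level are invented.

```python
import numpy as np

def switching_filter(img, win=3):
    """Restore only pixels flagged by an extreme-value (max-min) detector,
    using the non-noise pixels inside the local window."""
    pad = win // 2
    padded = np.pad(img.astype(float), pad, mode="edge")
    out = img.astype(float).copy()
    noisy = (img == 0) | (img == 255)          # salt-and-pepper detector
    for i, j in zip(*np.nonzero(noisy)):
        window = padded[i:i + win, j:j + win]
        clean = window[(window != 0) & (window != 255)]
        if clean.size:
            out[i, j] = clean.mean()           # restoration from clean neighbours
    return out.astype(np.uint8)

# Demo: corrupt a gradient image with 10% salt-and-pepper noise and restore it.
rng = np.random.default_rng(10)
img = np.tile(np.linspace(50, 200, 64).astype(np.uint8), (64, 1))
noisy_img = img.copy()
mask = rng.random(img.shape) < 0.10
noisy_img[mask] = rng.choice([0, 255], size=int(mask.sum()))
restored = switching_filter(noisy_img)
print(f"mean abs error before: {np.abs(noisy_img.astype(int) - img).mean():.2f}, "
      f"after: {np.abs(restored.astype(int) - img).mean():.2f}")
```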
17. Image correction based on LS-SVR (cited 2 times)
Authors: 祝振敏, 吕兆康, 刘百芬. 《大连理工大学学报》 (Journal of Dalian University of Technology), EI, CAS, CSCD, Chinese core journal, 2016, Issue 1, pp. 86-91.
The least squares support vector regression (LS-SVR) algorithm is widely used in many fields because of its high goodness of fit. Images of an object captured under different light sources show different color values, which causes a visual deviation between the image and the object. Taking this deviation as the research object and the Pantone color card as the reference, the LS-SVR algorithm, combined with a conversion model from the RGB color space to the sRGB color space, is used to correct test images. Experimental results show that, compared with polynomial regression, the LS-SVR algorithm achieves a smaller color difference, and the corrected image is closer to the target image.
Keywords: color space; least squares support vector regression (LS-SVR); image correction; color difference
18. A hybrid localization algorithm based on LS-SVR (cited 1 time)
Authors: 夏斌, 梁春燕, 袁文浩, 谢楠. 《计算机工程与设计》 (Computer Engineering and Design), Chinese core journal, 2018, Issue 11, pp. 3318-3321, 3339.
To address the limited localization accuracy of least squares support vector regression (LS-SVR), a hybrid localization algorithm based on LS-SVR is proposed that fully exploits the corrective effect of the distance information between unknown nodes during localization. The LS-SVR algorithm provides initial values that speed up the convergence of the multivariate Taylor series expansion method, and the Taylor expansion in turn makes full use of the distances between unknown nodes to reduce the localization error caused by ranging errors. Simulation results show that, compared with the traditional LS-SVR localization algorithm, the hybrid algorithm achieves higher accuracy and reduces the influence of the choice of regularization and kernel parameters on the localization accuracy.
Keywords: multivariate Taylor series expansion; localization model; least squares support vector regression; localization accuracy; hybrid algorithm
19. A real-time life prediction method based on GA-optimized wavelet LS-SVR (cited 2 times)
Authors: 胡友涛, 胡昌华. 《南京航空航天大学学报》 (Journal of Nanjing University of Aeronautics & Astronautics), EI, CAS, CSCD, Chinese core journal, 2011, Issue B07, pp. 203-206.
For products whose performance degrades nonlinearly, a real-time degradation-trajectory modelling and life prediction method based on a genetic algorithm (GA)-optimized wavelet least squares support vector regression (LS-SVR) is proposed from the perspective of degradation-trajectory similarity. The method determines membership weights from the Euclidean distance between a specific individual and similar products, obtains the degradation-trajectory model of that individual by weighting the degradation models of similar products built with wavelet LS-SVR, and then updates the model with measured data to perform real-time life prediction. A case study verifies the effectiveness of the proposed method.
Keywords: real-time life prediction; performance degradation; least squares support vector regression; wavelet kernel function; genetic algorithm
20. Application of improved LS-SVR to fault-tolerant control of steer-by-wire systems (cited 1 time)
Authors: 吴方圆, 孔峰, 姚江云. 《计算机工程与应用》 (Computer Engineering and Applications), CSCD, 2013, Issue 12, pp. 237-241.
Current fault-tolerant techniques often rely on observers, which are easily affected by disturbances, for variables that cannot be measured directly. A least squares support vector regression (LS-SVR) method optimized by a fish swarm algorithm is proposed to replace the traditional observer. The fish swarm algorithm iteratively solves the matrix equation that arises in LS-SVR, which avoids matrix inversion, reduces the LS-SVR training time and attains the optimal solution. The LS-SVR is applied to sideslip-angle estimation in fault-tolerant control; a trained LS-SVR contains the redundant information of the sideslip angle and can replace the observer in producing the estimated output. Simulation experiments show that the proposed method converges quickly, has strong disturbance rejection, and clearly improves performance.
Keywords: fish swarm algorithm; least squares support vector regression; steer-by-wire; fault-tolerant control
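The key computational trick claimed — solving the LS-SVR matrix equation iteratively instead of inverting the matrix — can be illustrated with any iterative solver for the symmetric positive-definite system (K + I/γ)α = y. The sketch below uses SciPy's conjugate gradient routine as a stand-in for the fish swarm optimiser (a deliberate substitution, not the paper's method); the bias term is omitted and all data and parameter values are invented.

```python
import numpy as np
from scipy.sparse.linalg import cg

rng = np.random.default_rng(11)
X = rng.random((300, 4))
y = X @ np.array([1.0, -2.0, 0.5, 3.0]) + 0.05 * rng.standard_normal(300)

gamma, sigma2 = 1.0, 0.5
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-d2 / (2.0 * sigma2))
A = K + np.eye(len(X)) / gamma              # symmetric positive-definite LS-SVR system

# Solve A @ alpha = y iteratively instead of forming A^{-1} explicitly.
alpha, info = cg(A, y, maxiter=2000)        # info == 0 means the solver converged
y_hat = K @ alpha
print(f"converged: {info == 0}, training RMSE: {np.sqrt(np.mean((y - y_hat) ** 2)):.4f}")
```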