Journal Articles
501 articles found
1. Improved adaptive pruning algorithm for least squares support vector regression (Cited: 4)
Authors: Runpeng Gao, Ye San. Journal of Systems Engineering and Electronics (SCIE, EI, CSCD), 2012, No. 3, pp. 438-444.
As the solutions of the least squares support vector regression machine (LS-SVRM) are not sparse, prediction is slow and applications are limited. The defects of the existing adaptive pruning algorithm for LS-SVRM are that training is slow and the generalization performance is unsatisfactory, especially for large-scale problems. Hence an improved algorithm is proposed. To accelerate training, the pruned data point and a fast leave-one-out error are employed to validate the temporary model obtained after decremental learning. A novel objective function in the termination condition, which involves the constraints generated by all training data points, and three pruning strategies are employed to improve generalization. The effectiveness of the proposed algorithm is tested on six benchmark datasets. The sparse LS-SVRM model has a faster training speed and better generalization performance. [A minimal LS-SVR sketch follows below.]
Keywords: least squares support vector regression machine (LS-SVRM); pruning; leave-one-out (LOO) error; incremental learning; decremental learning
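The non-sparseness discussed in this entry comes from the LS-SVM formulation itself: the dual reduces to one dense linear system, so every training point receives a (generally nonzero) multiplier. Below is a minimal sketch of that system in Python/NumPy; the RBF kernel width `sigma2`, the regularization `gamma`, and the toy data are illustrative assumptions, not values from the paper.

```python
import numpy as np

def rbf_kernel(A, B, sigma2):
    """Gaussian (RBF) kernel matrix between row-sample matrices A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma2))

def lssvr_fit(X, y, gamma=10.0, sigma2=1.0):
    """Solve the LS-SVR dual: [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]."""
    n = X.shape[0]
    K = rbf_kernel(X, X, sigma2)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]          # dual coefficients alpha, bias b

def lssvr_predict(Xtr, alpha, b, Xte, sigma2=1.0):
    return rbf_kernel(Xte, Xtr, sigma2) @ alpha + b

# Toy data: essentially every alpha_i is nonzero, i.e. the model is non-sparse,
# which is exactly what pruning algorithms try to fix.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(50)
alpha, b = lssvr_fit(X, y)
print("nonzero multipliers:", np.sum(np.abs(alpha) > 1e-8), "of", len(alpha))
```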
2. Flatness intelligent control via improved least squares support vector regression algorithm (Cited: 2)
Authors: 张秀玲, 张少宇, 赵文保, 徐腾. Journal of Central South University (SCIE, EI, CAS), 2013, No. 3, pp. 688-695.
To overcome the disadvantage that the standard least squares support vector regression (LS-SVR) algorithm is not directly suited to modelling multiple-input multiple-output (MIMO) systems, an improved LS-SVR algorithm, defined as multi-output least squares support vector regression (MLSSVR), was put forward by adding the samples' absolute errors to the objective function, and it was applied to flatness intelligent control. To address the poor precision of the effective-matrix-based scheme in flatness control, predictive control was introduced into the control system, and an effective matrix-predictive flatness control method was proposed by combining the merits of the two methods. A simulation experiment was conducted on a 900HC reversible cold rolling mill. The performance of the effective matrix method and the effective matrix-predictive control method was compared, and the results demonstrate the validity of the effective matrix-predictive control method.
Keywords: least squares support vector regression; multi-output least squares support vector regression; flatness; effective matrix; predictive control
3. Fault diagnosis of power-shift steering transmission based on multiple outputs least squares support vector regression (Cited: 2)
Authors: 张英锋, 马彪, 房京, 张海岭, 范昱珩. Journal of Beijing Institute of Technology (EI, CAS), 2011, No. 2, pp. 199-204.
A method of multiple-output least squares support vector regression (LS-SVR) was developed and described in detail, with the radial basis function (RBF) as the kernel function. The method was applied to predict the future state of a power-shift steering transmission (PSST), and a PSST prediction model was obtained with multiple-output LS-SVR. Model performance was strongly influenced by the penalty parameter γ and kernel parameter σ2, which were optimized by cross-validation. The model was trained and tested on spectrometric oil analysis data; comparison of predicted and actual values revealed a fault in the second PSST. The research showed that this method achieves good accuracy in PSST fault prediction and that potential problems in a PSST can be found through such comparative analysis. [A multi-output LS-SVR sketch follows below.]
Keywords: least squares support vector regression (LS-SVR); fault diagnosis; power-shift steering transmission (PSST)
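For the multi-output case described in this entry, one convenient implementation detail is that the LS-SVM linear system can be solved once with a matrix right-hand side, giving one column of dual coefficients (and one bias) per output. The sketch below illustrates that idea only; the γ and σ2 values and the synthetic two-output data are assumptions, and the paper's cross-validation tuning is not reproduced.

```python
import numpy as np
from scipy.spatial.distance import cdist

def rbf(A, B, sigma2):
    return np.exp(-cdist(A, B, "sqeuclidean") / (2.0 * sigma2))

def mimo_lssvr_fit(X, Y, gamma=50.0, sigma2=0.5):
    """Multi-output LS-SVR: one shared kernel system, one (alpha, b) column per output."""
    n, m = X.shape[0], Y.shape[1]
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = A[1:, 0] = 1.0
    A[1:, 1:] = rbf(X, X, sigma2) + np.eye(n) / gamma
    rhs = np.vstack([np.zeros((1, m)), Y])        # matrix RHS: all outputs at once
    sol = np.linalg.solve(A, rhs)
    return sol[0, :], sol[1:, :]                  # biases b (m,), duals alpha (n, m)

def mimo_lssvr_predict(Xtr, alpha, b, Xte, sigma2=0.5):
    return rbf(Xte, Xtr, sigma2) @ alpha + b

# Synthetic two-output example standing in for, e.g., two monitored transmission indicators.
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(80, 3))
Y = np.column_stack([np.sin(2 * np.pi * X[:, 0]), X[:, 1] * X[:, 2]])
b, alpha = mimo_lssvr_fit(X, Y)
print(mimo_lssvr_predict(X, alpha, b, X[:5]).shape)   # (5, 2): one prediction per output
```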
4. Improved Scheme for Fast Approximation to Least Squares Support Vector Regression
Authors: 张宇宸, 赵永平, 宋成俊, 侯宽新, 脱金奎, 叶小军. Transactions of Nanjing University of Aeronautics and Astronautics (EI), 2014, No. 4, pp. 413-419.
The solution of normal least squares support vector regression (LSSVR) lacks sparseness, which limits real-time use and hampers wide application to a certain degree. To overcome this obstacle, a scheme named I2FSA-LSSVR is proposed. Compared with previous approximate algorithms, it not only adopts the partial reduction strategy but also considers the influence between the previously selected support vectors and the support vector to be selected when computing the supporting weights. As a result, I2FSA-LSSVR reduces the number of support vectors and enhances real-time performance. To confirm the feasibility and effectiveness of the proposed algorithm, experiments on benchmark data sets are conducted, and their results support the presented I2FSA-LSSVR.
Keywords: support vector regression; kernel method; least squares; sparseness
5. Improved scheme to accelerate sparse least squares support vector regression
Authors: Yongping Zhao, Jianguo Sun. Journal of Systems Engineering and Electronics (SCIE, EI, CSCD), 2010, No. 2, pp. 312-317.
Pruning algorithms for sparse least squares support vector regression machines are common and easily comprehensible methods, but the computational burden in the training phase is heavy because of the retraining performed during pruning, which is unfavorable for their applications. To this end, an improved scheme is proposed to accelerate sparse least squares support vector regression machines. A major advantage of this new scheme is its iterative methodology, which reuses previous training results instead of retraining, and its feasibility is strictly verified theoretically. Finally, experiments on benchmark data sets corroborate a significant saving of training time with the same number of support vectors and the same predictive accuracy as the original pruning algorithms, and this speedup scheme is also extended to classification problems.
Keywords: least squares support vector regression machine; pruning algorithm; iterative methodology; classification
6. A sparse algorithm for adaptive pruning least square support vector regression machine based on global representative point ranking (Cited: 2)
Authors: HU Lei, YI Guoxing, HUANG Chao. Journal of Systems Engineering and Electronics (SCIE, EI, CSCD), 2021, No. 1, pp. 151-162.
Least square support vector regression (LSSVR) is a method for function approximation whose solutions are typically non-sparse, which limits its application, especially where fast prediction is required. In this paper, a sparse algorithm for adaptive pruning of LSSVR based on global representative point ranking (GRPR-AP-LSSVR) is proposed. First, the global representative point ranking (GRPR) algorithm is given, and a data analysis experiment illustrates the resulting importance ranking of data points. Furthermore, a pruning strategy that removes two samples in each decremental learning step is designed to accelerate training and ensure sparsity, and the removed data points are used to test the temporary model, which preserves regression accuracy. Finally, the proposed algorithm is verified on artificial datasets and UCI regression datasets, and the results indicate that, compared with several benchmark algorithms, GRPR-AP-LSSVR achieves excellent sparsity and prediction speed without impairing generalization performance. [A generic pruning-loop sketch follows below.]
Keywords: least square support vector regression (LSSVR); global representative point ranking (GRPR); initial training dataset; pruning strategy; sparsity; regression accuracy
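GRPR itself is specific to this paper, but the underlying prune-and-revalidate loop it accelerates can be sketched generically: repeatedly drop the training points with the smallest |alpha|, refit, and monitor error on the removed points. The code below is that generic loop, not the GRPR ranking; the kernel settings and stopping tolerance are assumed values.

```python
import numpy as np
from scipy.spatial.distance import cdist

def rbf(A, B, s2=1.0):
    return np.exp(-cdist(A, B, "sqeuclidean") / (2.0 * s2))

def lssvr_fit(X, y, gamma=10.0, s2=1.0):
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = A[1:, 0] = 1.0
    A[1:, 1:] = rbf(X, X, s2) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.r_[0.0, y])
    return sol[0], sol[1:]

def prune_lssvr(X, y, drop_per_step=2, tol=0.05):
    """Generic adaptive pruning: drop the two smallest-|alpha| samples per step,
    refit, and stop once error on the removed (validation) points exceeds tol."""
    idx = np.arange(len(y))
    b, alpha = lssvr_fit(X, y)
    removed = []
    while len(idx) > 10:
        worst = np.argsort(np.abs(alpha))[:drop_per_step]   # least informative samples
        removed.extend(idx[worst])
        idx = np.delete(idx, worst)
        b, alpha = lssvr_fit(X[idx], y[idx])
        pred = rbf(X[removed], X[idx]) @ alpha + b           # validate on pruned points
        if np.sqrt(np.mean((pred - y[removed]) ** 2)) > tol:
            break
    return idx, alpha, b

rng = np.random.default_rng(2)
X = rng.uniform(-3, 3, (120, 1))
y = np.sinc(X[:, 0]) + 0.02 * rng.standard_normal(120)
kept, alpha, b = prune_lssvr(X, y)
print("support vectors kept:", len(kept), "of", len(y))
```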
7. Application of Least Squares Support Vector Machine for Regression to Reliability Analysis (Cited: 18)
Authors: 郭秩维, 白广忱. Chinese Journal of Aeronautics (SCIE, EI, CAS, CSCD), 2009, No. 2, pp. 160-166.
To deal with the issue of huge computational cost in direct numerical simulation, the traditional response surface method (RSM), a classical regression algorithm, is used to approximate a functional relationship between the state variable and basic variables in reliability design. The algorithm has successfully treated some problems of implicit performance functions in reliability analysis. However, its theoretical basis of empirical risk minimization narrows its range of applications for...
Keywords: mechanism design of spacecraft; support vector machine for regression; least squares support vector machine for regression; Monte Carlo method; reliability; implicit performance function
8. Primal least squares twin support vector regression (Cited: 5)
Authors: Hua-juan HUANG, Shi-fei DING, Zhong-zhi SHI. Journal of Zhejiang University-Science C (Computers and Electronics) (SCIE, EI), 2013, No. 9, pp. 722-732.
The training of classical twin support vector regression (TSVR) amounts to solving a pair of quadratic programming problems (QPPs) with inequality constraints in the dual space; this solution, however, runs into time and memory constraints on large datasets. In this paper, we present a least squares version of TSVR in the primal space, termed primal least squares TSVR (PLSTSVR). By introducing the least squares method, the inequality constraints of TSVR are transformed into equality constraints. Furthermore, the two QPPs with equality constraints are solved directly in the primal space instead of the dual space, so only two systems of linear equations need to be solved instead of two QPPs. Experimental results on artificial and benchmark datasets show that PLSTSVR achieves accuracy comparable to TSVR with considerably less computational time. We further investigate its validity in predicting the opening price of stocks.
Keywords: twin support vector regression; least squares method; primal space; stock prediction
9. A Novel Method for Flatness Pattern Recognition via Least Squares Support Vector Regression (Cited: 12)
Authors: ZHANG Xiu-ling, ZHANG Shao-yu, TAN Guang-zhong, ZHAO Wen-bao (Key Laboratory of Industrial Computer Control Engineering of Hebei Province, National Engineering Research Center for Equipment and Technology of Cold Strip Rolling, Yanshan University, Qinhuangdao 066004, Hebei, China). Journal of Iron and Steel Research International (SCIE, EI, CAS, CSCD), 2012, No. 3, pp. 25-30.
To adapt to the new requirements of developing flatness control theory and technology, a cubic pattern was introduced alongside the traditional linear, quadratic and quartic basic flatness patterns, and linear, quadratic, cubic and quartic Legendre orthogonal polynomials were adopted to express the basic patterns. To overcome the defects of existing recognition methods based on fuzzy theory, neural networks and support vector regression (SVR), a novel flatness pattern recognition method based on least squares support vector regression (LS-SVR) was proposed. On this basis, to determine the hyper-parameters of LS-SVR effectively and to enhance the recognition accuracy and generalization performance of the model, a particle swarm optimization algorithm with the leave-one-out (LOO) error as the fitness function was adopted. To overcome the high computational complexity of naive cross-validation, a fast cross-validation algorithm was introduced to calculate the LOO error of LS-SVR. Results of experiments on flatness data calculated by theory and on flatness signals measured on a 900HC cold rolling mill demonstrate that the proposed approach can distinguish the types and determine the magnitudes of flatness defects effectively, with high accuracy, high speed and strong generalization ability. [A Legendre-pattern sketch follows below.]
Keywords: flatness pattern recognition; least squares support vector regression; cross-validation
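The basic flatness patterns mentioned in this entry are the first few Legendre polynomials evaluated over the normalized strip-width coordinate, and a measured flatness profile can be decomposed onto them by ordinary least squares before any recognition model is applied. The sketch below illustrates that reading; the synthetic "measured" profile is an assumption.

```python
import numpy as np
from numpy.polynomial import legendre as L

# Normalized strip-width coordinate in [-1, 1] (e.g. measurement channels across the strip).
x = np.linspace(-1.0, 1.0, 41)

# Basic flatness patterns: Legendre P1 (linear), P2 (quadratic), P3 (cubic), P4 (quartic).
patterns = np.column_stack([L.legval(x, np.eye(5)[k]) for k in range(1, 5)])

# A synthetic "measured" flatness profile: mostly quadratic plus some cubic and noise.
rng = np.random.default_rng(3)
measured = 0.8 * patterns[:, 1] + 0.3 * patterns[:, 2] + 0.02 * rng.standard_normal(x.size)

# Least-squares decomposition onto the basic patterns gives the pattern magnitudes
# that a recognition model (LS-SVR in the paper) would then work from.
coeffs, *_ = np.linalg.lstsq(patterns, measured, rcond=None)
for name, c in zip(["linear", "quadratic", "cubic", "quartic"], coeffs):
    print(f"{name:9s} component: {c:+.3f}")
```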
10. Short Term Electric Load Prediction by Incorporation of Kernel into Features Extraction Regression Technique
Authors: Ruaa Mohamed-Rashad Ghandour, Jun Li. Smart Grid and Renewable Energy, 2017, No. 1, pp. 31-45.
Accurate load prediction plays an important role in smart power management systems, whether for planning, meeting increasing load demand, maintenance, or power distribution. To achieve reasonable predictions, the authors applied and compared two feature extraction techniques, kernel partial least squares regression and kernel principal component regression, both with polynomial and Gaussian kernels that map the original features to a high-dimensional feature space and then derive new predictor variables known as scores and loadings. Kernel principal component regression constructs the new predictor variables without reference to the response vector, whereas kernel partial least squares regression does take the response vector into account. Models were evaluated on electric load data from three different cities, using historical load together with weekends and holidays as common predictor features for all models; temperature was used for only one dataset, as a comparative study of its effect. Evaluated with three statistical measures, the results show that Gaussian kernel partial least squares regression offers the most powerful features and improves load prediction performance significantly over the other models. [A KPCR sketch follows below.]
Keywords: short term load prediction; support vector regression (SVR); kernel principal component regression (KPCR); kernel partial least square regression (KPLSR)
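As a rough illustration of the KPCR side of the comparison above, scikit-learn's KernelPCA can project the inputs into a kernel feature subspace, and an ordinary linear regression can then be fit on the resulting scores; kernel PLS differs in that the scores are extracted with the response in the loop. The sketch below shows only the KPCR pipeline on synthetic data; the Gaussian-kernel width, number of components, and data are assumptions, not the paper's settings.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Synthetic "load" data: 24 lagged-load style features with a nonlinear response.
rng = np.random.default_rng(4)
X = rng.normal(size=(500, 24))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.standard_normal(500)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=0)

# Kernel principal component regression: unsupervised kernel scores + linear regression.
kpcr = make_pipeline(
    KernelPCA(n_components=10, kernel="rbf", gamma=0.05),
    LinearRegression(),
)
kpcr.fit(Xtr, ytr)
print("KPCR test R2:", r2_score(yte, kpcr.predict(Xte)))
```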
11. Optimal detection position for sugar content of Yongquan honey mandarin based on hyperspectral imaging (Cited: 1)
Authors: 李斌, 万霞, 刘爱伦, 邹吉平, 卢英俊, 姚迟, 刘燕德. 《中国光学(中英文)》 (EI, CAS, CSCD, Peking University Core), 2024, No. 1, pp. 128-139.
This study explores the optimal detection position and the best prediction model for the sugar content of Yongquan honey mandarins, providing a theoretical basis for sugar-content grading. A hyperspectral imaging system covering 390.2-981.3 nm was used to study the optimal detection position: spectra from the calyx, stem, equator, and whole fruit were paired with the sugar content measured at the corresponding positions to build prediction models. Four preprocessing methods, standard normal variate (SNV), multiplicative scatter correction (MSC), baseline correction, and Savitzky-Golay (SG) smoothing, were applied to the raw spectra of each position, and partial least squares regression (PLSR) and least squares support vector machine (LSSVM) models were built on the preprocessed data. After selecting the best preprocessing for each position, competitive adaptive reweighted sampling (CARS) and uninformative variable elimination (UVE) were used to select characteristic wavelengths, and PLSR and LSSVM models built on the selected data were compared. The whole-fruit MSC-CARS-LSSVM model performed best, with a prediction-set correlation coefficient Rp = 0.955 and RMSEP = 0.395, followed by the SNV-PLSR model at the equator (Rp = 0.936, RMSEP = 0.37). Since the two prediction-set correlation coefficients are close, the equator can be taken as the optimal detection position for sugar content. The study shows that prediction performance differs across fruit positions, and identifying the optimal position and model provides a theoretical basis for sugar-content grading of honey mandarins.
Keywords: Yongquan honey mandarin; hyperspectral imaging; sugar content; partial least squares regression; least squares support vector machine
12. Adulteration detection in a ternary camellia oil system by Raman spectroscopy
Authors: 郭佳, 郭郁葱, 姜红, 李开开. 《食品与发酵工业》 (CAS, CSCD, Peking University Core), 2024, No. 22, pp. 327-333.
This study used Raman spectroscopy for quantitative detection of adulteration in a ternary camellia oil system. By comparing different preprocessing methods, modeling methods, and optimization algorithms, the optimal multivariate adulteration detection models for soybean oil, corn oil, and camellia oil were determined. First derivative, second derivative, multiplicative scatter correction, and standard normal variate preprocessing were used to remove external influences on the spectra; competitive adaptive reweighted sampling extracted the characteristic spectral bands; and adulteration detection models were built with partial least squares regression and support vector machines, the latter optimized by grid search and by particle swarm optimization. Models built on SNV-preprocessed spectra performed best: the best prediction models for soybean oil and camellia oil were PSO-optimized SVMs, and for corn oil a grid-search-optimized SVM. The prediction-set determination coefficients R2 for soybean oil, corn oil, and camellia oil were 0.9986, 0.9994, and 0.9999, with root mean square errors of prediction of 2.73%, 1.62%, and 0.40%, respectively. For adulteration detection of commercial camellia oil, SVM models based on Raman spectral analysis and optimization algorithms provide a reference for rapid, non-destructive quantitative detection.
Keywords: camellia oil; Raman spectroscopy; adulteration detection; partial least squares regression; particle swarm optimization; support vector machine
13. Hyperspectral inversion of soil moisture based on PLSR and LSSVM models
Authors: 刘英, 范凯旋, 裴为豪, 沈文静, 葛建华. 《矿业安全与环保》 (CAS, Peking University Core), 2024, No. 5, pp. 147-153.
To retrieve surface soil moisture in an area disturbed by underground mining, the 52501 working face of the Daliuta coal mine was taken as a case study. Hyperspectral images were acquired with a UAV-mounted imaging spectrometer, and the spectra were transformed by logarithm, reciprocal logarithm, first derivative, and continuum removal. Combined with 128 ground-sampled soil moisture measurements, prediction models were built with partial least squares regression (PLSR) and least squares support vector machine (LSSVM) and their accuracy was validated. The first-derivative PLSR and LSSVM models performed relatively well: the first-derivative PLSR model reached a calibration R2c of 0.7021 and a prediction R2p of 0.6405, with RMSEc = 1.6384%, RMSEp = 1.1034%, and a relative prediction deviation RPDp of 1.7263; the first-derivative LSSVM model reached R2c = 0.8125 and R2p = 0.5979, with RMSEc = 1.2755%, RMSEp = 1.3459%, and RPDp = 1.6323. Soil moisture maps were finally produced from the PLSR and LSSVM models, achieving spatial prediction of soil moisture and providing spatial data support for precise soil moisture improvement in the vegetation-guided restoration of the study area. [A comparison sketch follows below.]
Keywords: soil moisture content; hyperspectral; partial least squares regression; least squares support vector machine; UAV; drought threshold; guided restoration
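The PLSR/LSSVM comparison in this entry follows a standard chemometrics workflow. A compact way to reproduce its skeleton is to fit scikit-learn's PLSRegression and, as a stand-in for an RBF-kernel LSSVM, KernelRidge (which shares the squared-loss, kernel-expansion form), then report R2, RMSE, and RPD. The sketch below does that on synthetic spectra; the 128-sample size echoes the paper, but the spectra, response values, and hyperparameters are invented.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(5)
wavelengths = np.linspace(400, 1000, 200)
# Synthetic reflectance spectra: one broad band scaled per sample, plus noise.
X = np.exp(-((wavelengths - 700) / 120) ** 2) * rng.uniform(0.5, 1.5, (128, 1)) \
    + 0.01 * rng.standard_normal((128, 200))
y = 5 + 10 * X[:, 100] + 0.2 * rng.standard_normal(128)   # synthetic soil moisture (%)

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "PLSR": PLSRegression(n_components=6),
    "LSSVM-like (KernelRidge, RBF)": KernelRidge(kernel="rbf", alpha=1e-3, gamma=1e-2),
}
for name, model in models.items():
    model.fit(Xtr, ytr)
    pred = np.ravel(model.predict(Xte))
    rmse = mean_squared_error(yte, pred) ** 0.5
    rpd = yte.std(ddof=1) / rmse                      # relative prediction deviation
    print(f"{name}: R2p={r2_score(yte, pred):.3f}, RMSEp={rmse:.3f}%, RPD={rpd:.2f}")
```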
14. Rapid detection of active component content in Gastrodia elata by near-infrared spectroscopy combined with ARO-LSSVR (Cited: 1)
Authors: 李珊珊, 张付杰, 李丽霞, 张浩, 段星桅, 史磊, 崔秀明, 李小青. 《食品科学》 (EI, CAS, CSCD, Peking University Core), 2024, No. 4, pp. 207-213.
To achieve rapid, non-destructive detection of gastrodin and p-hydroxybenzyl alcohol in Gastrodia elata, black Gastrodia elata from Zhaotong, Yunnan was used as the experimental material, and spectra were collected over 900-1 700 nm. Savitzky-Golay smoothing and standard normal variate transformation were first used to preprocess the spectra; competitive adaptive reweighted sampling (CARS) and the iteratively retained informative variables algorithm were then used to extract characteristic wavelengths, and the best extraction method was chosen from the results of least squares support vector regression (LSSVR) models built on those wavelengths. To improve model accuracy, the artificial rabbits optimization (ARO) algorithm was introduced to optimize the regularization parameter γ and kernel parameter σ2 of the LSSVR and was compared with particle swarm optimization (PSO) and the grey wolf optimizer (GWO). ARO outperformed PSO and GWO in convergence speed and search ability; the best prediction models for gastrodin and p-hydroxybenzyl alcohol were both CARS-ARO-LSSVR, with Rp2 of 0.969 6 and 0.957 7 and RMSEP of 0.014 and 0.020, respectively. Near-infrared spectroscopy can therefore be used for quantitative detection of the active components of Gastrodia elata, and this study provides a theoretical basis for developing rapid detection devices.
Keywords: near-infrared spectroscopy; Gastrodia elata; least squares support vector regression; artificial rabbits optimization
15. A PSO-LSSVR material removal model for robotic grinding and polishing
Authors: 蔡鸣, 朱光, 李论, 赵吉宾, 王奔. 《组合机床与自动化加工技术》 (Peking University Core), 2024, No. 1, pp. 174-177, 182.
To relate grinding and polishing process parameters to material removal depth, a removal-depth prediction model based on least squares support vector regression (LSSVR) was built, and particle swarm optimization (PSO) was introduced to optimize the LSSVR hyperparameters, improving prediction accuracy and global search ability. A robotic abrasive-belt grinding platform for blades was set up, and multi-parameter experiments were designed considering belt grit, belt speed, feed rate, contact force, and blade surface curvature radius to obtain the material removal depth on the blade surface. A PSO-LSSVR removal-depth prediction model was then built from the experimental data. The PSO-LSSVR model reached a prediction accuracy of 95.37% with an average prediction error of 0.003463, indicating high precision; feasibility was further verified under actual machining conditions, showing that the PSO-LSSVR model can effectively and reasonably relate process parameters to material removal depth. [A PSO tuning sketch follows below.]
Keywords: robotic abrasive-belt grinding; prediction model; process parameters; least squares support vector regression; particle swarm optimization
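A bare-bones version of the PSO-over-LSSVR-hyperparameters idea in this entry fits in a few dozen lines: particles move through (γ, σ2) space and their fitness is the validation RMSE of the resulting LS-SVR. The sketch below is such a minimal loop on synthetic data; the swarm size, inertia, and acceleration constants are generic textbook values, not those used in the paper.

```python
import numpy as np
from scipy.spatial.distance import cdist

def rbf(A, B, s2):
    return np.exp(-cdist(A, B, "sqeuclidean") / (2.0 * s2))

def lssvr_rmse(params, Xtr, ytr, Xva, yva):
    """Fitness: validation RMSE of an LS-SVR with hyperparameters (gamma, sigma2)."""
    gamma, s2 = params
    n = len(ytr)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = A[1:, 0] = 1.0
    A[1:, 1:] = rbf(Xtr, Xtr, s2) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.r_[0.0, ytr])
    pred = rbf(Xva, Xtr, s2) @ sol[1:] + sol[0]
    return float(np.sqrt(np.mean((pred - yva) ** 2)))

def pso(fitness, lo, hi, n_particles=15, iters=40, w=0.7, c1=1.5, c2=1.5, seed=6):
    rng = np.random.default_rng(seed)
    pos = rng.uniform(lo, hi, (n_particles, len(lo)))
    vel = np.zeros_like(pos)
    pbest, pbest_f = pos.copy(), np.array([fitness(p) for p in pos])
    gbest = pbest[np.argmin(pbest_f)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        f = np.array([fitness(p) for p in pos])
        better = f < pbest_f
        pbest[better], pbest_f[better] = pos[better], f[better]
        gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest, pbest_f.min()

# Synthetic process-parameter data standing in for the grinding experiments.
rng = np.random.default_rng(7)
X = rng.uniform(0, 1, (100, 5))            # e.g. grit, speed, feed, force, curvature
y = 0.1 * X[:, 1] * X[:, 3] + 0.05 * X[:, 2] + 0.005 * rng.standard_normal(100)
Xtr, Xva, ytr, yva = X[:70], X[70:], y[:70], y[70:]

best, err = pso(lambda p: lssvr_rmse(p, Xtr, ytr, Xva, yva),
                lo=np.array([0.1, 0.01]), hi=np.array([1000.0, 10.0]))
print(f"best gamma={best[0]:.2f}, sigma2={best[1]:.3f}, val RMSE={err:.4f}")
```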
16. A comparative study of visible-near-infrared and mid-infrared spectroscopy for predicting soil nutrients
Authors: 李学兰, 李德成, 郑光辉, 曾荣, 蔡凯, 高维常, 潘文杰, 姜超英, 曾陨涛. 《土壤学报》 (CAS, CSCD, Peking University Core), 2024, No. 3, pp. 687-698.
Rapid and accurate measurement of soil nutrients helps guide timely fertilization. To further examine the ability of visible-near-infrared (350-2500 nm) and mid-infrared (4000-650 cm-1) spectroscopy to predict soil nutrients, 500 soil samples from Guizhou Province were used. Spectra were smoothed and denoised by Savitzky-Golay (SG) filtering and baseline-corrected by standard normal variate (SNV) transformation, and partial least squares regression (PLSR) and support vector machine (SVM) models were then built to predict six soil nutrients: total nitrogen (TN), total phosphorus (TP), total potassium (TK), alkali-hydrolyzable nitrogen (AN), available phosphorus (AP), and available potassium (AK). The results show that: (1) with either spectral range, PLSR predictions were overall more accurate than SVM; (2) mid-infrared spectra predicted TN, TK, and AN significantly better than visible-near-infrared spectra; both ranges predicted TN and TK reliably (ratio of performance to interquartile distance, RPIQ > 2.10) and mid-infrared predicted AN relatively reliably (RPIQ = 1.87), but neither range predicted TP, AP, or AK well (RPIQ < 1.34); (3) at variable importance in projection (VIP) scores above 1.5, the PLSR models had more important bands for TN and TK in the mid-infrared region than in the visible-near-infrared region; important TN bands centered near 1910 and 2207 nm and near 1120, 1000, 960, 910, 770, and 668 cm-1, while important TK bands centered near 540, 2176, 2225, and 2268 nm and near 1040, 960, 910, 776, 720, and 668 cm-1. Mid-infrared spectroscopy combined with PLSR thus predicts soil nutrients well, can quickly and accurately predict soil TN and TK, and can provide technical support for timely fertilization. [A preprocessing sketch follows below.]
Keywords: visible-near-infrared spectroscopy; mid-infrared spectroscopy; soil nutrients; partial least squares regression; support vector machine
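The SG-then-SNV preprocessing chain used here is common to several spectroscopy entries in this list and is short enough to show directly: Savitzky-Golay smoothing along the wavelength axis, followed by row-wise standard normal variate scaling. The sketch below assumes synthetic spectra and generic window/order settings rather than the paper's.

```python
import numpy as np
from scipy.signal import savgol_filter

def sg_smooth(spectra, window=11, polyorder=2, deriv=0):
    """Savitzky-Golay smoothing (or derivative) along the wavelength axis."""
    return savgol_filter(spectra, window_length=window, polyorder=polyorder,
                         deriv=deriv, axis=1)

def snv(spectra):
    """Standard normal variate: center and scale each spectrum (row) individually."""
    mean = spectra.mean(axis=1, keepdims=True)
    std = spectra.std(axis=1, ddof=1, keepdims=True)
    return (spectra - mean) / std

# Synthetic raw spectra: a broad absorption band plus baseline offsets and noise.
rng = np.random.default_rng(8)
wn = np.linspace(4000, 650, 600)                      # mid-IR wavenumbers, cm^-1
band = np.exp(-((wn - 1650) / 80) ** 2)
raw = band * rng.uniform(0.8, 1.2, (30, 1)) \
      + rng.uniform(-0.1, 0.1, (30, 1)) \
      + 0.01 * rng.standard_normal((30, 600))

preprocessed = snv(sg_smooth(raw))                    # SG smoothing, then SNV
print(preprocessed.shape, preprocessed.mean(axis=1)[:3])  # rows are now ~zero-mean
```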
17. Particle size prediction for the cobalt oxalate synthesis process by integrating slow feature analysis and least squares support vector regression
Authors: 张晗, 张淑宁, 刘珂, 邓冠龙. 《化工学报》 (EI, CSCD, Peking University Core), 2024, No. 6, pp. 2313-2321.
Cobalt oxalate synthesis is a key unit operation in cobalt hydrometallurgy, and its particle size distribution is an important quality index that is difficult to measure online in real time. The process is also typically nonlinear, multi-constrained, and slowly time-varying. A cobalt oxalate particle size prediction model integrating slow feature analysis (SFA) and least squares support vector regression (LSSVR) is therefore proposed to measure this quality index online. SFA effectively captures the slow feature vectors of the process, handling the slow time-varying behavior; LSSVR then models the nonlinear relationship between the slow features and particle size, enabling online prediction of the quality index. The method was validated on a nonlinear numerical case and on cobalt oxalate synthesis process data. Compared with a single radial basis function neural network (RBFNN), a single LSSVR prediction model, and an SFA-NN hybrid model, the proposed method improved prediction accuracy by 13.31%, 2.26%, and 1.72% in the numerical case and by 13.27%, 9.96%, and 8.92% in the cobalt oxalate synthesis process, respectively.
Keywords: cobalt oxalate synthesis process; soft sensing; slow feature analysis; least squares support vector regression; chemical process; prediction; neural network
18. Estimation and validation of furrow irrigation infiltration parameters and roughness based on LSSVM-GA
Authors: 周雯, 白丹, 李一博, 马鑫, 白雪丽. 《农业工程学报》 (EI, CAS, CSCD, Peking University Core), 2024, No. 18, pp. 62-69.
Infiltration parameters and roughness are important basic parameters that must be determined in furrow irrigation design and management. In this study, a sample set was built from WinSRFR simulation results, and a least squares support vector machine (LSSVM) regression model was used to map the nonlinear relationship between water advance time and recession time on one hand and the infiltration parameters and roughness on the other. On this basis, a parameter estimation method combining LSSVM and a genetic algorithm (LSSVM-GA) was proposed: the LSSVM regression model forms the objective function and the GA searches for the optimal infiltration parameters and roughness. Based on four closed-end furrow experiments, LSSVM-GA was compared with multiple nonlinear regression (MNR) and the Merriam-Keller post-irrigation volume balance analysis (MK-PIVB) in WinSRFR. The parameters estimated by LSSVM-GA fitted the advance and recession processes better, with root mean square errors of 1.06-2.12 min for advance and 2.28-3.11 min for recession, indicating the reliability of LSSVM-GA for estimating infiltration parameters and roughness; this helps obtain more accurate irrigation technical elements and thus improve furrow irrigation performance.
Keywords: irrigation; infiltration; genetic algorithm; parameters; roughness; least squares support vector machine regression
19. Rapid quality evaluation of Sichuan pepper by near-infrared spectroscopy combined with chemometrics
Authors: 张萌萌, 杨孝红, 李海洋, 高欢晴, 李瑶, 郭伦锋. 《中国调味品》 (CAS, Peking University Core), 2024, No. 10, pp. 147-152, 185.
Near-infrared spectroscopy combined with chemometrics was used to build quantitative models for representative components of Sichuan pepper. Total amide and total flavonoid contents of different batches were measured by UV-visible spectrophotometry, and volatile oil content was also measured. Near-infrared spectra of 50 batches of samples were collected, and the Kennard-Stone algorithm was used to split the sample set. Partial least squares regression (PLSR) and support vector machine (SVM) models were then built for the three indices and their performance compared. Total amide, total flavonoid, and volatile oil contents across batches ranged over 10.40%-29.09%, 10.33%-24.73%, and 2.72%-8.04%, respectively. After MSC, SG smoothing, and SG smoothing + MSC preprocessing, the SVM models for total amides, total flavonoids, and volatile oil were more accurate than the PLSR models, with calibration-set determination coefficients R_C^2 of 0.818, 0.655, and 0.927 and prediction-set determination coefficients R_P^2 of 0.898, 0.856, and 0.916, respectively. The NIR-PLSR and NIR-SVM quantitative models established here enable rapid quality evaluation of Sichuan pepper seasonings.
Keywords: Sichuan pepper; near-infrared spectroscopy; partial least squares regression; support vector machine; volatile oil
20. Shelf-life quality prediction and freshness discrimination of Muscat Hamburg grapes based on an electronic nose
Authors: 闫雨桐, 史策, 韩帅, 孙传恒, 邢斌, 刘峻, 吉增涛. 《山地农业生物学报》, 2024, No. 6, pp. 1-7, 20.
This study built a rapid and accurate shelf-life quality prediction and discrimination model for Muscat Hamburg grapes to maximize their commercial value. Volatile gases released during shelf life at different temperatures (0, 10, and 20 °C) were detected with an electronic nose, and partial least squares regression (PLS) and a back-propagation artificial neural network (BP-ANN) were used to build prediction models for soluble solids content (SSC) and total acid (TA). To improve the accuracy of freshness discrimination, SSC, TA, and sensory evaluation information were fused via hierarchical cluster analysis, and a shelf-life freshness discrimination model was built with a support vector machine optimized by a genetic algorithm. Both PLS and BP-ANN effectively predicted SSC and TA, with BP-ANN more accurate (SSC model R2 = 0.9694, RMSE = 0.0094; TA model R2 = 0.9183, RMSE = 0.0025), and the freshness discrimination model based on fused quality information reached 95% accuracy. This study provides new ideas for more accurately predicting the physicochemical indices and discriminating the freshness of Muscat Hamburg grapes.
Keywords: Muscat Hamburg grape; electronic nose; BP neural network; support vector machine; partial least squares