Journal Articles
566 articles found
1. Improved adaptive pruning algorithm for least squares support vector regression (Cited by: 4)
Authors: Runpeng Gao, Ye San 《Journal of Systems Engineering and Electronics》 SCIE EI CSCD, 2012, Issue 3, pp. 438-444 (7 pages)
As the solutions of the least squares support vector regression machine (LS-SVRM) are not sparse, it leads to slow prediction speed and limits its applications. The defects of the existing adaptive pruning algorithm for LS-SVRM are that the training speed is slow, and the generalization performance is not satisfactory, especially for large scale problems. Hence an improved algorithm is proposed. In order to accelerate the training speed, the pruned data point and fast leave-one-out error are employed to validate the temporary model obtained after decremental learning. The novel objective function in the termination condition which involves the whole constraints generated by all training data points and three pruning strategies are employed to improve the generalization performance. The effectiveness of the proposed algorithm is tested on six benchmark datasets. The sparse LS-SVRM model has a faster training speed and better generalization performance.
Keywords: least squares support vector regression machine (LS-SVRM); pruning; leave-one-out (LOO) error; incremental learning; decremental learning
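The abstract above hinges on two steps: training an LS-SVRM by solving its KKT linear system, and then pruning points while validating the temporary model on the removed point. Below is a minimal NumPy sketch of that workflow; it uses naive retraining rather than the authors' fast leave-one-out formula, and the hyperparameter names (gam, sigma2, tol) and values are illustrative assumptions, not taken from the paper.

    # Minimal LS-SVR with a naive pruning loop: remove the sample with the
    # smallest |alpha|, retrain on the rest, and use the pruned point to check
    # that accuracy has not degraded too much.
    import numpy as np

    def rbf_kernel(A, B, sigma2):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma2))

    def train_lssvr(X, y, gam=10.0, sigma2=1.0):
        """Solve the LS-SVR KKT linear system for (b, alpha)."""
        n = len(y)
        K = rbf_kernel(X, X, sigma2)
        A = np.zeros((n + 1, n + 1))
        A[0, 1:] = 1.0
        A[1:, 0] = 1.0
        A[1:, 1:] = K + np.eye(n) / gam
        sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
        return sol[0], sol[1:]

    def predict(Xtr, b, alpha, Xte, sigma2=1.0):
        return rbf_kernel(Xte, Xtr, sigma2) @ alpha + b

    def prune_lssvr(X, y, target_sv, gam=10.0, sigma2=1.0, tol=0.1):
        idx = np.arange(len(y))
        b, alpha = train_lssvr(X[idx], y[idx], gam, sigma2)
        while len(idx) > target_sv:
            drop = np.argmin(np.abs(alpha))          # least informative point
            keep = np.delete(idx, drop)
            b_new, a_new = train_lssvr(X[keep], y[keep], gam, sigma2)
            # validate the temporary model on the pruned point
            err = abs(predict(X[keep], b_new, a_new, X[idx[drop]][None, :], sigma2)[0]
                      - y[idx[drop]])
            if err > tol:                            # stop if accuracy degrades
                break
            idx, b, alpha = keep, b_new, a_new
        return idx, b, alpha

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        X = rng.uniform(-3, 3, size=(200, 1))
        y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(200)
        sv, b, alpha = prune_lssvr(X, y, target_sv=40)
        print(len(sv), "support vectors kept")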
2. Flatness intelligent control via improved least squares support vector regression algorithm (Cited by: 2)
Authors: 张秀玲, 张少宇, 赵文保, 徐腾 《Journal of Central South University》 SCIE EI CAS, 2013, Issue 3, pp. 688-695 (8 pages)
To overcome the disadvantage that the standard least squares support vector regression (LS-SVR) algorithm is not suitable for multiple-input multiple-output (MIMO) system modelling directly, an improved LS-SVR algorithm, defined as multi-output least squares support vector regression (MLSSVR), was put forward by adding samples' absolute errors to the objective function, and was applied to flatness intelligent control. To solve the poor-precision problem of the control scheme based on the effective matrix in flatness control, predictive control was introduced into the control system and the effective matrix-predictive flatness control method was proposed by combining the merits of the two methods. A simulation experiment was conducted on a 900HC reversible cold roll. The performance of the effective matrix method and the effective matrix-predictive control method were compared, and the results demonstrate the validity of the effective matrix-predictive control method.
Keywords: least squares support vector regression; multi-output least squares support vector regression; flatness; effective matrix; predictive control
3. Fault diagnosis of power-shift steering transmission based on multiple outputs least squares support vector regression (Cited by: 2)
Authors: 张英锋, 马彪, 房京, 张海岭, 范昱珩 《Journal of Beijing Institute of Technology》 EI CAS, 2011, Issue 2, pp. 199-204 (6 pages)
A method of multiple outputs least squares support vector regression (LS-SVR) was developed and described in detail, with the radial basis function (RBF) as the kernel function. The method was applied to predict the future state of the power-shift steering transmission (PSST). A prediction model of the PSST was obtained with multiple outputs LS-SVR. The model performance was greatly influenced by the penalty parameter γ and kernel parameter σ2, which were optimized using the cross-validation method. The training and prediction of the model were done with spectrometric oil analysis data. The predicted and actual values were compared and a fault in the second PSST was found. The research proved that this method has good accuracy in PSST fault prediction, and any possible problem in the PSST can be found through a comparative analysis.
Keywords: least squares support vector regression (LS-SVR); fault diagnosis; power-shift steering transmission (PSST)
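As a rough illustration of the workflow described above (an RBF-kernel least squares regressor with the penalty and kernel width chosen by cross-validation, extended to multiple outputs), here is a hedged scikit-learn sketch. KernelRidge is only a close relative of LS-SVR (it omits the bias term), the synthetic data stands in for spectrometric oil analysis data, and the grid values for alpha and gamma are illustrative.

    import numpy as np
    from sklearn.kernel_ridge import KernelRidge
    from sklearn.multioutput import MultiOutputRegressor
    from sklearn.model_selection import GridSearchCV

    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 6))                 # stand-in for oil analysis features
    Y = np.column_stack([np.sin(X[:, 0]) + 0.1 * rng.normal(size=300),
                         X[:, 1] ** 2 + 0.1 * rng.normal(size=300)])

    model = MultiOutputRegressor(KernelRidge(kernel="rbf"))
    grid = {"estimator__alpha": [1e-3, 1e-2, 1e-1, 1.0],   # roughly plays the role of 1/gamma
            "estimator__gamma": [0.01, 0.1, 1.0, 10.0]}    # roughly 1/(2*sigma^2) of the RBF kernel
    search = GridSearchCV(model, grid, cv=5, scoring="neg_mean_squared_error")
    search.fit(X, Y)
    print(search.best_params_, search.best_score_)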
4. Improved Scheme for Fast Approximation to Least Squares Support Vector Regression
Authors: 张宇宸, 赵永平, 宋成俊, 侯宽新, 脱金奎, 叶小军 《Transactions of Nanjing University of Aeronautics and Astronautics》 EI, 2014, Issue 4, pp. 413-419 (7 pages)
The solution of the normal least squares support vector regression (LSSVR) lacks sparseness, which limits real-time performance and hampers wide application to a certain degree. To overcome this obstacle, a scheme named I2FSA-LSSVR is proposed. Compared with previous approximate algorithms, it not only adopts the partial reduction strategy but also considers the influence between the previously selected support vectors and the support vector to be selected during the process of computing the supporting weights. As a result, I2FSA-LSSVR reduces the number of support vectors and enhances real-time performance. To confirm the feasibility and effectiveness of the proposed algorithm, experiments on benchmark data sets are conducted, whose results support the presented I2FSA-LSSVR.
Keywords: support vector regression; kernel method; least squares; sparseness
5. Improved scheme to accelerate sparse least squares support vector regression
Authors: Yongping Zhao, Jianguo Sun 《Journal of Systems Engineering and Electronics》 SCIE EI CSCD, 2010, Issue 2, pp. 312-317 (6 pages)
The pruning algorithms for the sparse least squares support vector regression machine are common methods, and easily comprehensible, but the computational burden in the training phase is heavy due to the retraining in performing the pruning process, which is not favorable for their applications. To this end, an improved scheme is proposed to accelerate the sparse least squares support vector regression machine. A major advantage of this new scheme is based on the iterative methodology, which uses the previous training results instead of retraining, and its feasibility is strictly verified theoretically. Finally, experiments on benchmark data sets corroborate a significant saving of the training time with the same number of support vectors and predictive accuracy compared with the original pruning algorithms, and this speedup scheme is also extended to the classification problem.
Keywords: least squares support vector regression machine; pruning algorithm; iterative methodology; classification
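The key idea in the abstract, reusing previous training results instead of retraining after each pruning step, can be illustrated with a standard linear-algebra identity: when one sample is removed, the inverse of the reduced system matrix follows from the full inverse by a rank-one correction, so no fresh factorisation is needed. The sketch below verifies that identity numerically; it is a generic illustration, not necessarily the exact update scheme used in the paper.

    import numpy as np

    def downdate_inverse(M_inv, i):
        """Inverse of M with row/column i removed, given M_inv = inv(M)."""
        keep = np.delete(np.arange(M_inv.shape[0]), i)
        col = M_inv[keep, i][:, None]
        row = M_inv[i, keep][None, :]
        return M_inv[np.ix_(keep, keep)] - col @ row / M_inv[i, i]

    rng = np.random.default_rng(0)
    A = rng.normal(size=(6, 6))
    M = A @ A.T + 6 * np.eye(6)          # stand-in for K + I/gamma
    M_inv = np.linalg.inv(M)
    i = 2
    fast = downdate_inverse(M_inv, i)
    slow = np.linalg.inv(np.delete(np.delete(M, i, 0), i, 1))
    print(np.allclose(fast, slow))       # True: identical up to rounding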
6. Improved Twin Support Vector Machine Algorithm and Applications in Classification Problems
Authors: Sun Yi, Wang Zhouyang 《China Communications》 SCIE CSCD, 2024, Issue 5, pp. 261-279 (19 pages)
The distribution of data has a significant impact on the results of classification. When the distribution of one class is insignificant compared to the distribution of another class, data imbalance occurs. This will result in rising outlier values and noise. Therefore, the speed and performance of classification could be greatly affected. Given the above problems, this paper starts with the motivation and mathematical representation of classification, and puts forward a new classification method based on the relationship between different classification formulations. Combined with the vector characteristics of the actual problem and the choice of matrix characteristics, we first analyze ordered regression to introduce slack variables to solve the constraint problem of the lone point. Then we introduce fuzzy factors to solve the problem of the gap between the isolated points on the basis of the support vector machine. We introduce cost control to solve the problem of sample skew. Finally, based on the bi-boundary support vector machine, a two-step weight-setting twin classifier is constructed. This can help to identify multitasks with feature-selected patterns without the need for additional optimizers, which addresses the problem of large-scale classification that cannot deal effectively with a very low category distribution gap.
Keywords: fuzzy; ordered regression (OR); relaxing variables; twin support vector machine
7. A sparse algorithm for adaptive pruning least square support vector regression machine based on global representative point ranking (Cited by: 2)
Authors: HU Lei, YI Guoxing, HUANG Chao 《Journal of Systems Engineering and Electronics》 SCIE EI CSCD, 2021, Issue 1, pp. 151-162 (12 pages)
Least square support vector regression (LSSVR) is a method for function approximation, whose solutions are typically non-sparse, which limits its application especially in some occasions of fast prediction. In this paper, a sparse algorithm for adaptive pruning LSSVR based on global representative point ranking (GRPR-AP-LSSVR) is proposed. At first, the global representative point ranking (GRPR) algorithm is given, and a relevant data analysis experiment is implemented which depicts the importance ranking of data points. Furthermore, the pruning strategy of removing two samples in the decremental learning procedure is designed to accelerate the training speed and ensure the sparsity. The removed data points are utilized to test the temporary learning model, which ensures the regression accuracy. Finally, the proposed algorithm is verified on artificial datasets and UCI regression datasets, and experimental results indicate that, compared with several benchmark algorithms, the GRPR-AP-LSSVR algorithm has excellent sparsity and prediction speed without impairing the generalization performance.
Keywords: least square support vector regression (LSSVR); global representative point ranking (GRPR); initial training dataset; pruning strategy; sparsity; regression accuracy
8. Robust least squares projection twin SVM and its sparse solution (Cited by: 1)
Authors: ZHOU Shuisheng, ZHANG Wenmeng, CHEN Li, XU Mingliang 《Journal of Systems Engineering and Electronics》 SCIE EI CSCD, 2023, Issue 4, pp. 827-838 (12 pages)
Least squares projection twin support vector machine (LSPTSVM) has faster computing speed than the classical least squares support vector machine (LSSVM). However, LSPTSVM is sensitive to outliers and its solution lacks sparsity. Therefore, it is difficult for LSPTSVM to process large-scale datasets with outliers. In this paper, we propose a robust LSPTSVM model (called R-LSPTSVM) by applying a truncated least squares loss function. The robustness of R-LSPTSVM is proved from a weighted perspective. Furthermore, we obtain the sparse solution of R-LSPTSVM by using the pivoting Cholesky factorization method in the primal space. Finally, the sparse R-LSPTSVM algorithm (SR-LSPTSVM) is proposed. Experimental results show that SR-LSPTSVM is insensitive to outliers and can deal with large-scale datasets quickly.
Keywords: outliers; robust least squares projection twin support vector machine (R-LSPTSVM); low-rank approximation; sparse solution
9. Primal least squares twin support vector regression (Cited by: 5)
Authors: Hua-juan HUANG, Shi-fei DING, Zhong-zhi SHI 《Journal of Zhejiang University-Science C (Computers and Electronics)》 SCIE EI, 2013, Issue 9, pp. 722-732 (11 pages)
The training algorithm of classical twin support vector regression (TSVR) can be attributed to the solution of a pair of quadratic programming problems (QPPs) with inequality constraints in the dual space. However, this solution is affected by time and memory constraints when dealing with large datasets. In this paper, we present a least squares version for TSVR in the primal space, termed primal least squares TSVR (PLSTSVR). By introducing the least squares method, the inequality constraints of TSVR are transformed into equality constraints. Furthermore, we attempt to directly solve the two QPPs with equality constraints in the primal space instead of the dual space; thus, we need only to solve two systems of linear equations instead of two QPPs. Experimental results on artificial and benchmark datasets show that PLSTSVR has comparable accuracy to TSVR but with considerably less computational time. We further investigate its validity in predicting the opening price of stock.
Keywords: twin support vector regression; least squares method; primal space; stock prediction
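The computational core of PLSTSVR as summarised above is that each of the two bound regressors is obtained from a system of linear equations rather than a QPP. The following NumPy sketch shows that reduction for the linear case only; the regulariser lam and the insensitivity offsets eps1/eps2 are illustrative assumptions, and the paper's full objectives also penalise slack variables, which are omitted here.

    import numpy as np

    def plstsvr_fit(X, y, eps1=0.05, eps2=0.05, lam=1e-3):
        H = np.hstack([X, np.ones((len(y), 1))])        # augmented data matrix [A  e]
        G = H.T @ H + lam * np.eye(H.shape[1])
        u1 = np.linalg.solve(G, H.T @ (y - eps1))       # down-bound regressor f1(x)
        u2 = np.linalg.solve(G, H.T @ (y + eps2))       # up-bound regressor f2(x)
        return u1, u2

    def plstsvr_predict(X, u1, u2):
        H = np.hstack([X, np.ones((len(X), 1))])
        return 0.5 * (H @ u1 + H @ u2)                  # average of the two bounds

    rng = np.random.default_rng(1)
    X = rng.uniform(-2, 2, size=(150, 1))
    y = 2.0 * X[:, 0] + 0.1 * rng.standard_normal(150)
    u1, u2 = plstsvr_fit(X, y)
    print(np.round(plstsvr_predict(np.array([[1.0]]), u1, u2), 3))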
10. Application of Least Squares Support Vector Machine for Regression to Reliability Analysis (Cited by: 18)
Authors: 郭秩维, 白广忱 《Chinese Journal of Aeronautics》 SCIE EI CAS CSCD, 2009, Issue 2, pp. 160-166 (7 pages)
To deal well with the issue of huge computational cost in direct numerical simulation, the traditional response surface method (RSM), as a classical regression algorithm, is used to approximate a functional relationship between the state variable and basic variables in reliability design. The algorithm has successfully treated some problems of implicit performance functions in reliability analysis. However, its theoretical basis of empirical risk minimization narrows its range of applications for...
Keywords: mechanism design of spacecraft; support vector machine for regression; least squares support vector machine for regression; Monte Carlo method; reliability; implicit performance function
11. A Novel Method for Flatness Pattern Recognition via Least Squares Support Vector Regression (Cited by: 12)
Authors: ZHANG Xiu-ling, ZHANG Shao-yu, TAN Guang-zhong, ZHAO Wen-bao (Key Laboratory of Industrial Computer Control Engineering of Hebei Province, National Engineering Research Center for Equipment and Technology of Cold Strip Rolling, Yanshan University, Qinhuangdao 066004, Hebei, China) 《Journal of Iron and Steel Research International》 SCIE EI CAS CSCD, 2012, Issue 3, pp. 25-30 (6 pages)
To adapt to the new requirements of developing flatness control theory and technology, cubic patterns were introduced on the basis of the traditional linear, quadratic and quartic flatness basic patterns. Linear, quadratic, cubic and quartic Legendre orthogonal polynomials were adopted to express the flatness basic patterns. In order to overcome the defects of the existing recognition methods based on fuzzy, neural network and support vector regression (SVR) theory, a novel flatness pattern recognition method based on least squares support vector regression (LS-SVR) was proposed. On this basis, for the purpose of determining the hyper-parameters of LS-SVR effectively and enhancing the recognition accuracy and generalization performance of the model, a particle swarm optimization algorithm with the leave-one-out (LOO) error as the fitness function was adopted. To overcome the disadvantage of the high computational complexity of the naive cross-validation algorithm, a novel fast cross-validation algorithm was introduced to calculate the LOO error of LS-SVR. Results of experiments on flatness data calculated by theory and on flatness signals practically measured from a 900HC cold-rolling mill demonstrate that the proposed approach can distinguish the types and determine the magnitudes of the flatness defects effectively with high accuracy, high speed and strong generalization ability.
Keywords: flatness pattern recognition; least squares support vector regression; cross-validation
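The front end of the method described above, expressing the flatness basic patterns with Legendre orthogonal polynomials of degree one to four, can be sketched directly with NumPy; the fitted coefficients are the pattern features that would then be passed to the LS-SVR recogniser. The sampling grid and the example profile below are illustrative, not data from the paper.

    import numpy as np
    from numpy.polynomial import legendre

    x = np.linspace(-1.0, 1.0, 41)                 # normalised strip-width coordinate
    # synthetic profile: mostly quadratic pattern plus a small linear component and noise
    profile = 0.8 * legendre.legval(x, [0, 0, 1]) \
              + 0.3 * legendre.legval(x, [0, 1]) \
              + 0.02 * np.random.default_rng(0).standard_normal(x.size)

    V = legendre.legvander(x, 4)                   # columns: P0 ... P4
    coef, *_ = np.linalg.lstsq(V, profile, rcond=None)
    features = coef[1:]                            # linear, quadratic, cubic, quartic weights
    print(np.round(features, 3))                   # approximately [0.3, 0.8, 0, 0]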
12. Least squares twin support vector machine with asymmetric squared loss
Authors: Wu Qing, Li Feiyan, Zhang Hengchang, Fan Jiulun, Gao Xiaofeng 《The Journal of China Universities of Posts and Telecommunications》 EI CSCD, 2023, Issue 1, pp. 1-16 (16 pages)
For classification problems, the traditional least squares twin support vector machine (LSTSVM) generates two nonparallel hyperplanes directly by solving two systems of linear equations instead of a pair of quadratic programming problems (QPPs), which makes LSTSVM much faster than the original TSVM. But the standard LSTSVM, adopting the quadratic loss measured by the minimal distance, is sensitive to noise and unstable to re-sampling. To overcome this problem, the expectile distance is taken into consideration to measure the margin between classes and LSTSVM with asymmetric squared loss (aLSTSVM) is proposed. Compared to the original LSTSVM with the quadratic loss, the proposed aLSTSVM not only has comparable computational accuracy, but also possesses good properties such as noise insensitivity, scatter minimization and re-sampling stability. Numerical experiments on synthetic datasets, normally distributed clustered (NDC) datasets and University of California, Irvine (UCI) datasets with different noises confirm the great performance and validity of our proposed algorithm.
Keywords: classification; least squares twin support vector machine; asymmetric loss; noise insensitivity
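The asymmetric squared (expectile-style) loss mentioned above weights positive and negative residuals differently; at a weight of 0.5 it reduces to a symmetric squared loss up to a constant factor. A small sketch, with the parameter name tau chosen for illustration rather than taken from the paper:

    import numpy as np

    def asymmetric_squared_loss(residual, tau=0.7):
        r = np.asarray(residual, dtype=float)
        w = np.where(r >= 0, tau, 1.0 - tau)      # expectile-style weighting
        return w * r ** 2

    r = np.array([-2.0, -0.5, 0.5, 2.0])
    print(asymmetric_squared_loss(r, tau=0.5))    # symmetric case: [2. 0.125 0.125 2.]
    print(asymmetric_squared_loss(r, tau=0.9))    # positive residuals penalised more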
13. Edge-computing prediction of transformer average oil temperature and winding hot-spot temperature based on an LSTSVR model (Cited by: 14)
Authors: 张磊, 杨廷方, 李炜, 刘志勇, 曾程 《电力自动化设备》 EI CSCD, Peking University Core Journal, 2020, Issue 8, pp. 197-202 (6 pages)
An excessively high hot-spot temperature in a transformer winding can cause the transformer insulation to become brittle, crack, or even break down and short-circuit. Timely and accurate prediction of the winding hot-spot temperature is therefore essential for improving the safety and reliability of transformer operation. Using the least squares twin support vector regression machine (LSTSVR) as an edge-computing model, a monitoring system architecture is constructed that combines gas chromatographic analysis data of the transformer oil with operating information such as load current, ambient temperature, top-oil temperature and upper dead-corner temperature, predicts the average oil temperature of the transformer, and then calculates the winding hot-spot temperature. Comparison of the data obtained by the proposed method with measured data shows that the LSTSVR model accurately predicts the average oil temperature and winding hot-spot temperature, and that its prediction accuracy is better than that of the least squares support vector regression model, effectively improving the accuracy of winding hot-spot temperature measurement. A field example also demonstrates the effectiveness and reliability of the proposed method.
Keywords: transformer; least squares twin support vector regression machine; winding; hot-spot temperature; edge computing
14. Prediction of STATCOM thyristor valve body temperature based on an LSTSVR model (Cited by: 2)
Authors: 徐强超, 许庆超, 张敏, 李雄均, 杨廷方 《南方电网技术》 CSCD, Peking University Core Journal, 2020, Issue 6, pp. 47-52 (6 pages)
An excessively high body temperature of the STATCOM thyristor valves will cause them to fail. Timely and accurate prediction of the STATCOM thyristor valve body temperature is therefore essential for improving the operational reliability of the STATCOM. Using the least square twin support vector regression (LSTSVR) algorithm, a prediction model of the STATCOM thyristor valve body temperature was built with seven input quantities: STATCOM inlet water temperature, return water temperature, inlet water flow, thermal conductivity of the IGBT module heat-dissipation material, STATCOM output voltage, STATCOM output current, and collector current of the thyristor valves. Comparison with field measurements shows that the LSTSVR model achieves high-accuracy prediction of the valve body temperature, and that its prediction accuracy is better than that of the least square support vector regression (LSSVR) model. An application example also verifies the accuracy and effectiveness of the method.
Keywords: STATCOM thyristor valves; least squares twin support vector regression machine; temperature; prediction
15. Alumina quality prediction model based on QCSSA-LSTSVR (Cited by: 1)
Authors: 徐辰华, 陈瑞, 宋海鹰, 程若军, 何俊隆, 宋绍剑 《控制工程》 CSCD, Peking University Core Journal, 2022, Issue 10, pp. 1857-1865 (9 pages)
To address the delayed measurement of alumina quality indices in the alumina roasting process, a quality-index prediction model for the roasting process is established by combining the least squares twin support vector machine (LSTSVR) with the quantum chaotic salp swarm algorithm (QCSSA). First, an alumina quality-index prediction model is built with LSTSVR. Second, to overcome the difficulty of selecting the kernel width coefficient and penalty factor of the LSTSVR model, QCSSA is used to optimize the structural parameters of the LSTSVR model, with a logistic chaos strategy and a quantum local search strategy employed to improve the global optimization capability of the salp swarm algorithm (SSA). Finally, the proposed method is validated experimentally on actual production data. Simulation results show that the QCSSA-optimized LSTSVR method achieves good prediction performance.
Keywords: quality index prediction; salp swarm algorithm; least squares twin support vector machine; logistic chaotic mapping; quantum local search
16. Combined prediction model for subgrade settlement based on LSTSVR (Cited by: 1)
Authors: 周永阳, 张锐, 张恒煜, 丁鹏 《哈尔滨理工大学学报》 CAS, Peking University Core Journal, 2017, Issue 6, pp. 62-66 (5 pages)
Since each single prediction model for subgrade settlement has its own range of applicability, and the overall predictions fluctuate considerably and have low accuracy, a combined subgrade settlement prediction model based on least square twin support vector regression (LSTSVR) is proposed. The core of the model is to select single prediction models with S-shaped growth-curve characteristics according to the development law of subgrade settlement and the features of its settlement curve, and to use the prediction results of the single models as the input vector of the LSTSVR to construct the combined prediction model. Comparative tests show that the proposed method has better prediction accuracy and stability.
Keywords: subgrade settlement prediction; combined prediction; least squares twin support vector regression machine
17. Quadratic Kernel-Free Least Square Twin Support Vector Machine for Binary Classification Problems (Cited by: 2)
Authors: Qian-Qian Gao, Yan-Qin Bai, Ya-Ru Zhan 《Journal of the Operations Research Society of China》 EI CSCD, 2019, Issue 4, pp. 539-559 (21 pages)
In this paper, a new quadratic kernel-free least square twin support vector machine (QLSTSVM) is proposed for binary classification problems. The advantage of QLSTSVM is that there is no need to select the kernel function and related parameters for nonlinear classification problems. After using a consensus technique, we adopt the alternating direction method of multipliers to solve the reformulated consensus QLSTSVM directly. To reduce CPU time, the Karush-Kuhn-Tucker (KKT) conditions are also used to solve the QLSTSVM. The performance of QLSTSVM is tested on two artificial datasets and several University of California Irvine (UCI) benchmark datasets. Numerical results indicate that the QLSTSVM may outperform several existing methods for solving twin support vector machines with Gaussian kernels in terms of classification accuracy and operation time.
Keywords: twin support vector machine; quadratic kernel-free; least square; binary classification
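The kernel-free idea behind QLSTSVM can be illustrated by mapping each sample to explicit quadratic features and then solving the usual least squares twin-SVM linear systems, so the two separating surfaces are quadratic in the original space. The sketch below follows that route instead of the paper's consensus ADMM solver; the penalty values c1 and c2 and the synthetic data are illustrative assumptions.

    import numpy as np

    def quad_features(X):
        """Map x to [x_i * x_j (i <= j), x] so the fitted surfaces are quadratic in x."""
        n, d = X.shape
        cross = [X[:, i] * X[:, j] for i in range(d) for j in range(i, d)]
        return np.column_stack(cross + [X])

    def lstsvm_fit(Xpos, Xneg, c1=1.0, c2=1.0):
        E = np.hstack([quad_features(Xpos), np.ones((len(Xpos), 1))])
        F = np.hstack([quad_features(Xneg), np.ones((len(Xneg), 1))])
        # each surface comes from one regularised linear system (least squares twin SVM form)
        u1 = -np.linalg.solve(E.T @ E / c1 + F.T @ F, F.T @ np.ones(len(F)))
        u2 = np.linalg.solve(F.T @ F / c2 + E.T @ E, E.T @ np.ones(len(E)))
        return u1, u2

    def lstsvm_predict(X, u1, u2):
        H = np.hstack([quad_features(X), np.ones((len(X), 1))])
        d1 = np.abs(H @ u1) / np.linalg.norm(u1[:-1])   # proximity to the class +1 surface
        d2 = np.abs(H @ u2) / np.linalg.norm(u2[:-1])   # proximity to the class -1 surface
        return np.where(d1 <= d2, 1, -1)

    rng = np.random.default_rng(0)
    Xpos = rng.normal(size=(100, 2)) * 0.5                           # inner cluster
    Xneg = rng.normal(size=(100, 2))
    Xneg = Xneg / np.linalg.norm(Xneg, axis=1, keepdims=True) * 2.0  # surrounding ring
    u1, u2 = lstsvm_fit(Xpos, Xneg)
    pred = lstsvm_predict(np.vstack([Xpos, Xneg]), u1, u2)
    print((pred == np.r_[np.ones(100), -np.ones(100)]).mean())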
18. End-point dynamic control of basic oxygen furnace steelmaking based on improved unconstrained twin support vector regression (Cited by: 1)
Authors: Chuang Gao, Ming-gang Shen, Xiao-ping Liu, Nan-nan Zhao, Mao-xiang Chu 《Journal of Iron and Steel Research International》 SCIE EI CAS CSCD, 2020, Issue 1, pp. 42-54 (13 pages)
In order to improve the end-point hit rate of basic oxygen furnace steelmaking, a novel dynamic control model was proposed based on an improved twin support vector regression algorithm. The controlled objects were the end-point carbon content and temperature. The proposed control model was established using low carbon steel samples collected from a steel plant, and consists of two prediction models, a preprocess model, two regulation units, a controller and a basic oxygen furnace. The test results of 100 heats show that the prediction models can achieve a double hit rate of 90% within the error bound of 0.005 wt.% C and 15 °C. The preprocess model was used to predict an initial end-blow oxygen volume. However, the double hit rate of the carbon content and temperature only reaches 65%. Then, the oxygen volume and coolant additions were adjusted by the regulation units to improve the hit rate. Finally, the double hit rate after the regulation reaches up to 90%. The results indicate that the proposed dynamic control model is efficient for guiding real production of low carbon steel, and the modeling method is also suitable for application to other steel grades.
Keywords: steelmaking; basic oxygen furnace; end-point control; twin support vector regression; wavelet transform
19. Short Term Electric Load Prediction by Incorporation of Kernel into Features Extraction Regression Technique
Authors: Ruaa Mohamed-Rashad Ghandour, Jun Li 《Smart Grid and Renewable Energy》, 2017, Issue 1, pp. 31-45 (15 pages)
Accurate load prediction plays an important role in a smart power management system, whether for planning, facing increasing load demand, maintenance issues, or the power distribution system. In order to achieve a reasonable prediction, the authors have applied and compared two feature extraction techniques, kernel partial least squares regression and kernel principal component regression, both carried out with polynomial and Gaussian kernels to map the original features to a high-dimensional feature space and then derive new predictor variables known as scores and loadings. Kernel principal component regression constructs the new predictor variables without any consideration of the response vector, whereas kernel partial least squares regression does take the response vector into consideration. Models are simulated with electric load data from three different cities, using historical load data in addition to weekends and holidays as common predictor features for all models; temperature was used for only one dataset, as a comparative study to measure its effect. Model results, evaluated by three statistical measurements, show that Gaussian kernel partial least squares regression offers the most powerful features and can improve load prediction performance significantly more than the other presented models.
Keywords: short-term load prediction; support vector regression (SVR); kernel principal component regression (KPCR); kernel partial least squares regression (KPLSR)
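To make the contrast drawn in the abstract concrete, the sketch below builds a kernel principal component regression pipeline (components extracted from the predictors alone, then a linear fit) and compares it with a partial least squares model, which also uses the response when extracting components; the PLS step here is the plain linear version standing in for the paper's kernel PLS. The synthetic data, kernel width and component counts are illustrative assumptions.

    import numpy as np
    from sklearn.decomposition import KernelPCA
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(400, 8))          # stand-in for calendar/weather style predictors
    y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=400)

    kpcr = make_pipeline(KernelPCA(n_components=5, kernel="rbf", gamma=0.1),
                         LinearRegression())
    pls = PLSRegression(n_components=5)

    for name, model in [("KPCR", kpcr), ("PLS", pls)]:
        score = cross_val_score(model, X, y, cv=5, scoring="neg_mean_squared_error")
        print(name, -score.mean())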
20. Optimal detection position for sugar content of Yongquan honey mandarins based on hyperspectral imaging (Cited by: 1)
Authors: 李斌, 万霞, 刘爱伦, 邹吉平, 卢英俊, 姚迟, 刘燕德 《中国光学(中英文)》 EI CAS CSCD, Peking University Core Journal, 2024, Issue 1, pp. 128-139 (12 pages)
This study explores the optimal detection position and the best prediction model for the sugar content of Yongquan honey mandarins, in order to provide a theoretical basis for sugar-content detection and grading. A hyperspectral imaging system covering 390.2-981.3 nm was used to study the optimal detection position: spectral information from the calyx, stem, equator and whole fruit was combined with the sugar content of the corresponding positions to build prediction models. Four preprocessing methods, namely standard normal variate (SNV), multiplicative scatter correction (MSC), baseline correction and Savitzky-Golay (SG) smoothing, were applied to the raw spectra of each position, and partial least squares regression (PLSR) and least squares support vector machine (LSSVM) models were built from the preprocessed spectral data. After determining the best preprocessing method for each position, competitive adaptive reweighted sampling (CARS) and uninformative variable elimination (UVE) were used to select characteristic wavelengths from the optimally preprocessed spectra, and PLSR and LSSVM models were then built on the selected data and compared. The results show that the whole-fruit MSC-CARS-LSSVM model gives the best prediction, with a prediction-set correlation coefficient Rp = 0.955 and a root-mean-square error RMSEP = 0.395, followed by the SNV-PLSR model at the equator, with Rp = 0.936 and RMSEP = 0.37. Since the two prediction-set correlation coefficients are similar, the equator can be taken as the optimal detection position for sugar content. This study shows that sugar-content prediction models built from different parts of the fruit perform differently, and that identifying the optimal detection position and the best prediction model provides a theoretical basis for sugar-content detection and grading of the mandarins.
Keywords: Yongquan honey mandarin; hyperspectral imaging; sugar content; partial least squares regression; least squares support vector machine