Journal Articles
3,643 articles found
Revisiting Akaike’s Final Prediction Error and the Generalized Cross Validation Criteria in Regression from the Same Perspective: From Least Squares to Ridge Regression and Smoothing Splines
1
Authors: Jean Raphael Ndzinga Mvondo, Eugène-Patrice Ndong Nguéma. Open Journal of Statistics, 2023, No. 5, pp. 694-716 (23 pages)
In regression, despite being both aimed at estimating the Mean Squared Prediction Error (MSPE), Akaike’s Final Prediction Error (FPE) and the Generalized Cross Validation (GCV) selection criteria are usually derived from two quite different perspectives. Here, settling on the most commonly accepted definition of the MSPE as the expectation of the squared prediction error loss, we provide theoretical expressions for it, valid for any linear model (LM) fitter, be it under random or non-random designs. Specializing these MSPE expressions for each of them, we are able to derive closed formulas of the MSPE for some of the most popular LM fitters: Ordinary Least Squares (OLS), with or without a full column rank design matrix; Ordinary and Generalized Ridge regression, the latter embedding smoothing splines fitting. For each of these LM fitters, we then deduce a computable estimate of the MSPE which turns out to coincide with Akaike’s FPE. Using a slight variation, we similarly get a class of MSPE estimates coinciding with the classical GCV formula for those same LM fitters.
Keywords: linear model; mean squared prediction error; final prediction error; generalized cross validation; least squares; ridge regression
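The classical GCV formula the abstract refers to is easy to state for ridge regression: with hat matrix H(λ) = X(XᵀX + λI)⁻¹Xᵀ, GCV(λ) = (RSS/n) / (1 − tr(H)/n)². A minimal numpy sketch on synthetic data (not the paper's derivation, just the textbook criterion):

```python
import numpy as np

def gcv_ridge(X, y, lam):
    """Classical GCV score for a ridge fit with penalty lam.

    The hat matrix is H = X (X'X + lam I)^{-1} X', and
    GCV(lam) = (RSS / n) / (1 - tr(H) / n)^2.
    """
    n, p = X.shape
    H = X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)
    resid = y - H @ y
    rss = float(resid @ resid)
    edf = np.trace(H)                          # effective degrees of freedom
    return (rss / n) / (1.0 - edf / n) ** 2

# choose the penalty by minimizing GCV on a synthetic problem
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
y = X @ np.array([1.0, 0.5, 0.0, 0.0, -1.0]) + 0.1 * rng.normal(size=50)
lams = [0.01, 0.1, 1.0, 10.0]
best = min(lams, key=lambda l: gcv_ridge(X, y, l))
```

The same function covers OLS as the λ → 0 limit, where tr(H) reduces to the number of parameters p.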
Linear-regression models and algorithms based on the Total-Least-Squares principle (Cited by 1)
2
Authors: Ding Shijun, Jiang Weiping, Shen Zhijuani. Geodesy and Geodynamics, 2012, No. 2, pp. 42-46 (5 pages)
In classical regression analysis, the error of the independent variable is usually not taken into account. This paper presents two solution methods for the case in which both the independent and the dependent variables have errors. These methods are derived from the condition-adjustment and indirect-adjustment models based on the Total-Least-Squares principle. The equivalence of the two methods is also proven in theory.
Keywords: total least squares (TLS) principle; regression analysis; adjustment model; equivalence
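For the simplest errors-in-variables case, a straight line fitted when both coordinates are noisy, the total-least-squares solution can be read off an SVD of the centered data. A sketch of this special case only (the paper's adjustment-model formulation is more general):

```python
import numpy as np

def tls_line(x, y):
    """Orthogonal (total-least-squares) fit of y ≈ a*x + b, treating x
    and y as equally noisy: the line direction is the leading
    right-singular vector of the centered data matrix."""
    xm, ym = x.mean(), y.mean()
    A = np.column_stack([x - xm, y - ym])
    _, _, Vt = np.linalg.svd(A)
    dx, dy = Vt[0]              # direction of largest variance
    a = dy / dx
    b = ym - a * xm
    return a, b

x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([0.1, 0.9, 2.1, 2.9])
a, b = tls_line(x, y)
```

Unlike ordinary least squares, which minimizes vertical residuals only, this minimizes perpendicular distances to the line.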
A partial least-squares regression approach to land use studies in the Suzhou-Wuxi-Changzhou region (Cited by 1)
3
Authors: ZHANG Yang, ZHOU Chenghu, ZHANG Yongmin. Journal of Geographical Sciences (SCIE, CSCD), 2007, No. 2, pp. 234-244 (11 pages)
In several LUCC studies, statistical methods are used to analyze land use data. A problem with conventional statistical methods in land use analysis is that they assume the variables to be statistically independent, whereas in fact the variables tend to be correlated, a phenomenon known as multicollinearity, especially when there are few observations. In this paper, a Partial Least-Squares (PLS) regression approach is developed to study relationships between land use and its influencing factors through a case study of the Suzhou-Wuxi-Changzhou region in China. Multicollinearity exists in the dataset and the number of variables is high compared to the number of observations. Four PLS factors are selected through a preliminary analysis. The correlation analyses between land use and influencing factors demonstrate the land use character of rural industrialization and urbanization in the Suzhou-Wuxi-Changzhou region, and illustrate that the first PLS factor is sufficient to describe land use patterns quantitatively, with most of the statistical relations derived from it according with reality. As the explanatory capacity of successive PLS factors decreases, the reliability of the model outcome decreases correspondingly.
Keywords: land use; multivariate data analysis; partial least-squares regression; Suzhou-Wuxi-Changzhou region; multicollinearity
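The PLS idea recurring throughout this listing — extract a few factors that are both high-variance in the predictors and predictive of the response, then regress on them — can be sketched for a single response with the classical NIPALS deflation. Illustrative code on made-up data, not any of the authors' implementations:

```python
import numpy as np

def pls1(X, y, n_factors):
    """PLS1 regression via NIPALS deflation (single response).
    Returns coefficients B and intercept b0 so that y ≈ X @ B + b0."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_factors):
        w = Xc.T @ yc                  # weight: covariance direction
        w /= np.linalg.norm(w)
        t = Xc @ w                     # X scores
        tt = t @ t
        p = Xc.T @ t / tt              # X loadings
        qk = yc @ t / tt               # y loading
        Xc -= np.outer(t, p)           # deflate X
        yc -= qk * t                   # deflate y
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.solve(P.T @ W, q)
    return B, y_mean - x_mean @ B

# two PLS factors on four predictors
rng = np.random.default_rng(1)
X = rng.normal(size=(30, 4))
y = X @ np.array([1.0, 2.0, 0.0, -1.0]) + 0.01 * rng.normal(size=30)
B, b0 = pls1(X, y, n_factors=2)
```

With as many factors as predictors, the PLS solution coincides with OLS; using fewer factors is what tames multicollinearity.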
Based on Partial Least-squares Regression to Build up and Analyze the Model of Rice Evapotranspiration
4
Authors: ZHAO Chang-shan, FU Hong, HUANG Bu-hai (Northeast Agricultural University, Harbin, Heilongjiang 150030, PRC). Journal of Northeast Agricultural University (English Edition) (CAS), 2003, No. 1, pp. 1-8 (8 pages)
When calculating rice evapotranspiration from weather factors, some of the independent variables are often found to be mutually correlated. This multiple correlation distorts the traditional multivariate regression model based on the least squares method and destroys its stability. In this paper, a model is built with partial least squares regression: applying the ideas of principal component analysis and canonical correlation analysis, components are extracted from the original data, and a model of rice evapotranspiration is built that resolves the multiple correlation among the independent variables (the weather factors). Finally, the model is analyzed in parts, and satisfactory results are obtained.
Keywords: partial least squares regression; evapotranspiration
Discrimination of Transgenic Rice Based on Near Infrared Reflectance Spectroscopy and Partial Least Squares Regression Discriminant Analysis (Cited by 7)
5
Authors: ZHANG Long, WANG Shan-shan, DING Yan-fei, PAN Jia-rong, ZHU Cheng. Rice Science (SCIE, CSCD), 2015, No. 5, pp. 245-249 (5 pages)
Near infrared reflectance spectroscopy (NIRS), a non-destructive measurement technique, was combined with partial least squares regression discriminant analysis (PLS-DA) to discriminate the transgenic (TCTP and mi166) and wild type (Zhonghua 11) rice. Furthermore, rice lines transformed with a protein gene (OsTCTP) and a regulation gene (Osmi166) were also discriminated by the NIRS method. The performances of PLS-DA in the spectral ranges of 4 000-8 000 cm^(-1) and 4 000-10 000 cm^(-1) were compared to obtain the optimal spectral range. As a result, the transgenic and wild type rice were distinguished from each other in the range of 4 000-10 000 cm^(-1), and the correct classification rate was 100.0% in the validation test. The transgenic rice TCTP and mi166 were also distinguished from each other in the range of 4 000-10 000 cm^(-1), and the correct classification rate was also 100.0%. In conclusion, NIRS combined with PLS-DA can be used for the discrimination of transgenic rice.
Keywords: near infrared reflectance spectroscopy; genetically-modified food; regulation gene; protein gene; partial least squares regression discriminant analysis
Estimating Wheat Grain Protein Content Using Multi-Temporal Remote Sensing Data Based on Partial Least Squares Regression (Cited by 4)
6
Authors: LI Cun-jun, WANG Ji-hua, WANG Qian, WANG Da-cheng, SONG Xiao-yu, WANG Yan, HUANG Wen-jiang. Journal of Integrative Agriculture (SCIE, CAS, CSCD), 2012, No. 9, pp. 1445-1452 (8 pages)
Estimating wheat grain protein content by remote sensing is important for assessing wheat quality at maturity and for making grain harvest and purchase policies. However, spatial variability of soil condition, temperature, and precipitation affects grain protein content, and these factors usually cannot be monitored accurately with remote sensing data from a single image. In this research, the relationships between wheat protein content at maturity and wheat agronomic parameters at different growing stages were analyzed, and multi-temporal Landsat TM images were used to estimate grain protein content by partial least squares regression. Experiment data were acquired in the suburbs of Beijing during a 2-yr experiment in 2003-2004. The determination coefficient, average deviation of self-modeling, and deviation of cross-validation were employed to assess the estimation accuracy of wheat grain protein content. Their values were 0.88, 1.30%, 3.81% and 0.72, 5.22%, 12.36% for 2003 and 2004, respectively. The research laid an agronomic foundation for grain protein content (GPC) estimation by multi-temporal remote sensing. The results showed that it is feasible to estimate the GPC of wheat from multi-temporal remote sensing data over large areas.
Keywords: grain protein content; agronomic parameters; multi-temporal; Landsat; partial least squares regression
Improved adaptive pruning algorithm for least squares support vector regression (Cited by 4)
7
Authors: Runpeng Gao, Ye San. Journal of Systems Engineering and Electronics (SCIE, EI, CSCD), 2012, No. 3, pp. 438-444 (7 pages)
As the solutions of the least squares support vector regression machine (LS-SVRM) are not sparse, prediction is slow, which limits its applications. The defects of the existing adaptive pruning algorithm for LS-SVRM are that the training speed is slow and the generalization performance is not satisfactory, especially for large scale problems. Hence an improved algorithm is proposed. In order to accelerate the training speed, the pruned data point and the fast leave-one-out error are employed to validate the temporary model obtained after decremental learning. A novel objective function in the termination condition, which involves the whole set of constraints generated by all training data points, and three pruning strategies are employed to improve the generalization performance. The effectiveness of the proposed algorithm is tested on six benchmark datasets. The sparse LS-SVRM model has a faster training speed and better generalization performance.
Keywords: least squares support vector regression machine (LS-SVRM); pruning; leave-one-out (LOO) error; incremental learning; decremental learning
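The non-sparseness these pruning papers start from is visible directly in the LS-SVRM dual: training reduces to one dense linear system, and every sample receives a nonzero multiplier. A small numpy sketch with an RBF kernel (parameter values are arbitrary choices for the toy data, not from the paper):

```python
import numpy as np

def rbf(A, B, sigma2=1.0):
    """Gaussian (RBF) kernel matrix between row-sample matrices A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma2))

def lssvr_fit(X, y, gamma=100.0, sigma2=1.0):
    """Train an LS-SVRM by solving the single dual linear system
    [[0, 1'], [1, K + I/gamma]] [b; alpha] = [0; y].
    Every training point ends up with a nonzero alpha -- the
    non-sparseness that pruning algorithms try to remove."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf(X, X, sigma2) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    b, alpha = sol[0], sol[1:]
    return lambda Xq: rbf(Xq, X, sigma2) @ alpha + b

# toy 1-D regression
X = np.linspace(-3.0, 3.0, 40).reshape(-1, 1)
y = np.sin(X).ravel()
predict = lssvr_fit(X, y)
```

Pruning schemes shrink the set of points carried into the kernel expansion, trading a little accuracy for much faster prediction.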
Flatness intelligent control via improved least squares support vector regression algorithm (Cited by 2)
8
Authors: 张秀玲, 张少宇, 赵文保, 徐腾. Journal of Central South University (SCIE, EI, CAS), 2013, No. 3, pp. 688-695 (8 pages)
To overcome the disadvantage that the standard least squares support vector regression (LS-SVR) algorithm is not directly suitable for multiple-input multiple-output (MIMO) system modelling, an improved LS-SVR algorithm, defined as multi-output least squares support vector regression (MLSSVR), was put forward by adding the samples' absolute errors to the objective function, and was applied to flatness intelligent control. To solve the poor-precision problem of the control scheme based on the effective matrix in flatness control, predictive control was introduced into the control system, and the effective matrix-predictive flatness control method was proposed by combining the merits of the two methods. A simulation experiment was conducted on a 900HC reversible cold roll. The performances of the effective matrix method and the effective matrix-predictive control method were compared, and the results demonstrate the validity of the effective matrix-predictive control method.
Keywords: least squares support vector regression; multi-output least squares support vector regression; flatness; effective matrix; predictive control
Fault diagnosis of power-shift steering transmission based on multiple outputs least squares support vector regression (Cited by 2)
9
Authors: 张英锋, 马彪, 房京, 张海岭, 范昱珩. Journal of Beijing Institute of Technology (EI, CAS), 2011, No. 2, pp. 199-204 (6 pages)
A method of multiple outputs least squares support vector regression (LS-SVR) was developed and described in detail, with the radial basis function (RBF) as the kernel function. The method was applied to predict the future state of the power-shift steering transmission (PSST). A prediction model of the PSST was obtained with multiple outputs LS-SVR. The model performance was greatly influenced by the penalty parameter γ and the kernel parameter σ^2, which were optimized using the cross-validation method. The training and prediction of the model were done with spectrometric oil analysis data. The predicted and actual values were compared, and a fault in the second PSST was found. The research proved that this method has good accuracy in PSST fault prediction, and any possible problem in the PSST can be found through a comparative analysis.
Keywords: least squares support vector regression (LS-SVR); fault diagnosis; power-shift steering transmission (PSST)
Characterizing and estimating rice brown spot disease severity using stepwise regression, principal component regression and partial least-square regression (Cited by 13)
10
Authors: LIU Zhan-yu (1), HUANG Jing-feng (1), SHI Jing-jing (1), TAO Rong-xiang (2), ZHOU Wan (3), ZHANG Li-li (3). (1) Institute of Agriculture Remote Sensing and Information System Application, Zhejiang University, Hangzhou 310029, China; (2) Institute of Plant Protection and Microbiology, Zhejiang Academy of Agricultural Sciences, Hangzhou 310021, China; (3) Plant Inspection Station of Hangzhou City, Hangzhou 310020, China. Journal of Zhejiang University-Science B (Biomedicine & Biotechnology) (SCIE, CAS, CSCD), 2007, No. 10, pp. 738-744 (7 pages)
Detecting plant health conditions plays a key role in farm pest management and crop protection. In this study, hyperspectral leaf reflectance in rice (Oryza sativa L.) was measured on groups of healthy leaves and leaves infected by the fungus Bipolaris oryzae (Helminthosporium oryzae Breda de Haan) over the wavelength range from 350 to 2 500 nm. The percentage of leaf surface lesions was estimated and defined as the disease severity. Statistical methods, namely multiple stepwise regression, principal component analysis and partial least-square regression, were utilized to calculate and estimate the disease severity of rice brown spot at the leaf level. Our results revealed that multiple stepwise linear regression could efficiently estimate disease severity with three wavebands in seven steps. The root mean square errors (RMSEs) for the training (n=210) and testing (n=53) datasets were 6.5% and 5.8%, respectively. Principal component analysis showed that the first principal component could explain approximately 80% of the variance of the original hyperspectral reflectance. The regression model with the first two principal components predicted disease severity with RMSEs of 16.3% and 13.9% for the training and testing datasets, respectively. Partial least-square regression with seven extracted factors could most effectively predict disease severity compared with the other statistical methods, with RMSEs of 4.1% and 2.0% for the training and testing datasets, respectively. Our research demonstrates that it is feasible to estimate the disease severity of rice brown spot using hyperspectral reflectance data at the leaf level.
Keywords: hyperspectral reflectance; rice; brown spot; partial least-square (PLS) regression; stepwise regression; principal component regression (PCR)
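Principal component regression, one of the three methods compared above, replaces the correlated wavebands by a few leading principal components and runs ordinary least squares on the scores. A sketch with synthetic collinear data standing in for the hyperspectral bands of the study:

```python
import numpy as np

def pcr_fit(X, y, n_comp):
    """Principal component regression: OLS on the scores of the leading
    principal components of the centered predictor matrix."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc = X - x_mean
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:n_comp].T                         # leading loadings
    T = Xc @ V                                # component scores
    g, *_ = np.linalg.lstsq(T, y - y_mean, rcond=None)
    B = V @ g                                 # back to original predictors
    return lambda Xq: (Xq - x_mean) @ B + y_mean

# collinear toy data: the third predictor is the sum of the first two
rng = np.random.default_rng(4)
Z = rng.normal(size=(60, 2))
X = np.column_stack([Z[:, 0], Z[:, 1],
                     Z[:, 0] + Z[:, 1] + 0.01 * rng.normal(size=60)])
y = Z[:, 0] - Z[:, 1] + 0.05 * rng.normal(size=60)
predict = pcr_fit(X, y, n_comp=2)
```

Unlike PLS, the components here are chosen from the predictors alone, which is why PCR can need more of them to reach the same accuracy.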
Simultaneous Spectrophotometric Determination of Three Components Including Deoxyschizandrin by Partial Least Squares Regression (Cited by 1)
11
Authors: 张立庆. Journal of Wuhan University of Technology (Materials Science) (SCIE, EI, CAS), 2005, No. 3, pp. 119-121 (3 pages)
Computer-assisted partial least squares regression is introduced to simultaneously determine the contents of Deoxyschizandrin, Schisandrin and γ-Schisandrin in the extracted solution of wuweizi. Regression analysis of the experimental results shows that the average recovery of each component lies in the range from 98.9% to 110.3%, which means that partial least squares regression spectrophotometry can circumvent the overlapping of the absorption spectra of multiple components, so that satisfactory results can be obtained without any pre-separation.
Keywords: Deoxyschizandrin; partial least squares regression; spectrophotometry; simultaneous determination
Partial Least Squares Regression Model to Predict Water Quality in Urban Water Distribution Systems (Cited by 1)
12
Authors: 骆碧君, 赵元, 陈凯, 赵新华. Transactions of Tianjin University (EI, CAS), 2009, No. 2, pp. 140-144 (5 pages)
The water distribution system of a residential district in Tianjin is taken as an example to analyze changes in water quality. A partial least squares (PLS) regression model, in which turbidity and Fe are regarded as the control objectives, is used to establish the statistical model. The experimental results indicate that the PLS regression model predicts water quality well compared with the monitored data. The percentages of absolute relative error (below 15%, 20%, 30%) are 44.4%, 66.7%, 100% (turbidity) and 33.3%, 44.4%, 77.8% (Fe) at the 4th sampling point, and 77.8%, 88.9%, 88.9% (turbidity) and 44.4%, 55.6%, 66.7% (Fe) at the 5th sampling point.
Keywords: water distribution systems; water quality; turbidity; Fe; partial least squares regression
Application of partial least squares regression in data analysis of mining subsidence
13
Authors: FENG Zun-de (1,2), LU Xiu-shan (1), SHI Yu-feng (1), HUA Peng (1). (1) Shandong University of Science and Technology, Tai’an 271019, China; (2) Xuzhou Normal University, Xuzhou 221116, China. Transactions of Nonferrous Metals Society of China (CSCD), 2005, No. S1, pp. 156-158 (3 pages)
Based on surveying data of the strata-moving angle, an ordinary least squares regression model is first constructed relating the strata-moving parameter β to the coal bed obliquity, coal thickness, mining depth, etc. This regression is unsuccessful: none of the parameters turns out to be suitable, which does not accord with objective reality. This paper presents another method, partial least squares regression (PLS regression), to construct the statistical model of the strata-moving parameter β. The experiment shows that the resulting forecasting model is reasonable.
Keywords: strata-moving parameter; least squares regression; multicollinearity; PLS regression
Improved Scheme for Fast Approximation to Least Squares Support Vector Regression
14
Authors: 张宇宸, 赵永平, 宋成俊, 侯宽新, 脱金奎, 叶小军. Transactions of Nanjing University of Aeronautics and Astronautics (EI), 2014, No. 4, pp. 413-419 (7 pages)
The solution of normal least squares support vector regression (LSSVR) lacks sparseness, which limits real-time performance and hampers wide application to a certain degree. To overcome this obstacle, a scheme named I2FSA-LSSVR is proposed. Compared with previous approximate algorithms, it not only adopts the partial reduction strategy but also considers the influence between the previously selected support vectors and the support vector to be selected when computing the supporting weights. As a result, I2FSA-LSSVR reduces the number of support vectors and enhances real-time performance. To confirm the feasibility and effectiveness of the proposed algorithm, experiments on benchmark data sets are conducted, whose results support the presented I2FSA-LSSVR.
Keywords: support vector regression; kernel method; least squares; sparseness
Improved scheme to accelerate sparse least squares support vector regression
15
Authors: Yongping Zhao, Jianguo Sun. Journal of Systems Engineering and Electronics (SCIE, EI, CSCD), 2010, No. 2, pp. 312-317 (6 pages)
Pruning algorithms for sparse least squares support vector regression machines are common and easily comprehensible methods, but the computational burden in the training phase is heavy due to the retraining performed during the pruning process, which is not favorable for their applications. To this end, an improved scheme is proposed to accelerate sparse least squares support vector regression machines. A major advantage of this new scheme is its iterative methodology, which uses the previous training results instead of retraining, and whose feasibility is strictly verified theoretically. Finally, experiments on benchmark data sets corroborate a significant saving of training time with the same number of support vectors and predictive accuracy compared with the original pruning algorithms, and this speedup scheme is also extended to classification problems.
Keywords: least squares support vector regression machine; pruning algorithm; iterative methodology; classification
Comparison of dimension reduction-based logistic regression models for case-control genome-wide association study: principal components analysis vs. partial least squares (Cited by 2)
16
Authors: Honggang Yi, Hongmei Wo, Yang Zhao, Ruyang Zhang, Junchen Dai, Guangfu Jin, Hongxia Ma, Tangchun Wu, Zhibin Hu, Dongxin Lin, Hongbing Shen, Feng Chen. The Journal of Biomedical Research (CAS, CSCD), 2015, No. 4, pp. 298-307 (10 pages)
With recent advances in biotechnology, genome-wide association studies (GWAS) have been widely used to identify genetic variants that underlie human complex diseases and traits. In case-control GWAS, the typical statistical strategy is traditional logistic regression (LR) based on single-locus analysis. However, such single-locus analysis leads to the well-known multiplicity problem, with a risk of inflating type I error and reducing power. Dimension reduction-based techniques, such as principal component-based logistic regression (PC-LR) and partial least squares-based logistic regression (PLS-LR), have recently gained much attention in the analysis of high dimensional genomic data. However, the performance of these methods is still not clear, especially in GWAS. We conducted simulations and a real data application to compare the type I error and power of PC-LR, PLS-LR and LR applicable to GWAS within a defined single nucleotide polymorphism (SNP) set region. We found that PC-LR and PLS-LR can reasonably control type I error under the null hypothesis. In contrast, LR, corrected by the Bonferroni method, was more conservative in all simulation settings. In particular, we found that PC-LR and PLS-LR had comparable power and both outperformed LR, especially when the causal SNP was in high linkage disequilibrium with genotyped ones and had a small effect size in simulation. Based on SNP set analysis, we applied all three methods to analyze non-small cell lung cancer GWAS data.
Keywords: principal components analysis; partial least squares-based logistic regression; genome-wide association study; type I error; power
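The PC-LR strategy compared in the paper can be sketched in a few lines: compute principal component scores of the genotype matrix, then fit an ordinary logistic regression on those scores. A toy numpy version using gradient ascent instead of the usual IRLS; the data, dimensions, and parameter values are all made up for illustration:

```python
import numpy as np

def pc_logistic(X, y, n_comp, steps=2000, lr=0.5):
    """PC-LR sketch: logistic regression fitted on the leading principal
    component scores of X. Returns fitted class-1 probabilities."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    T = Xc @ Vt[:n_comp].T              # component scores
    T = T / T.std(axis=0)               # standardize for stable steps
    Z = np.column_stack([np.ones(len(y)), T])
    w = np.zeros(Z.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Z @ w))
        w += lr * Z.T @ (y - p) / len(y)    # log-likelihood gradient step
    return 1.0 / (1.0 + np.exp(-Z @ w))

# toy case-control data: cases shifted in the first three variables
rng = np.random.default_rng(5)
y = np.array([0.0] * 50 + [1.0] * 50)
X = rng.normal(size=(100, 20))
X[50:, :3] += 1.5
probs = pc_logistic(X, y, n_comp=3)
```

Fitting a handful of scores instead of every SNP is what sidesteps the multiplicity problem the abstract describes.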
Quantum partial least squares regression algorithm for multiple correlation problem
17
Authors: Yan-Yan Hou, Jian Li, Xiu-Bo Chen, Yuan Tian. Chinese Physics B (SCIE, EI, CAS, CSCD), 2022, No. 3, pp. 177-186 (10 pages)
Partial least squares (PLS) regression is an important linear regression method that efficiently addresses the multiple correlation problem by combining principal component analysis and multiple regression. In this paper, we present a quantum partial least squares (QPLS) regression algorithm. To address the high time complexity of PLS regression, we design a quantum eigenvector search method to speed up the construction of principal components and regression parameters. Meanwhile, we give a density matrix product method to avoid multiple accesses to quantum random access memory (QRAM) during the building of residual matrices. The time and space complexities of the QPLS regression are logarithmic in the independent variable dimension n, the dependent variable dimension w, and the number of variables m. This algorithm achieves exponential speed-ups over PLS regression in n, m, and w. In addition, the QPLS regression inspires us to explore more potential quantum machine learning applications in future work.
Keywords: quantum machine learning; partial least squares regression; eigenvalue decomposition
A Novel Extension of Kernel Partial Least Squares Regression
18
Authors: 贾金明, 仲伟俊. Journal of Donghua University (English Edition) (EI, CAS), 2009, No. 4, pp. 438-442 (5 pages)
Based on the continuum power regression (CPR) method, a novel derivation of kernel partial least squares regression (named CPR-KPLS) is proposed for approximating arbitrary nonlinear functions. A kernel function is used to map the input variables (input space) into a reproducing kernel Hilbert space (the so-called feature space), where a linear CPR-PLS is constructed based on the projection of explanatory variables onto latent variables (components). The linear CPR-PLS in the high-dimensional feature space corresponds to a nonlinear CPR-KPLS in the original input space. This method offers a novel extension of kernel partial least squares regression (KPLS), and some numerical simulation results are presented to illustrate the feasibility of the proposed method.
Keywords: continuum regression; partial least squares; kernel function
A sparse algorithm for adaptive pruning least square support vector regression machine based on global representative point ranking (Cited by 2)
19
Authors: HU Lei, YI Guoxing, HUANG Chao. Journal of Systems Engineering and Electronics (SCIE, EI, CSCD), 2021, No. 1, pp. 151-162 (12 pages)
Least square support vector regression (LSSVR) is a method for function approximation whose solutions are typically non-sparse, which limits its application especially where fast prediction is required. In this paper, a sparse algorithm for adaptive pruning LSSVR based on global representative point ranking (GRPR-AP-LSSVR) is proposed. First, the global representative point ranking (GRPR) algorithm is given, and a data analysis experiment is implemented to depict the importance ranking of data points. Furthermore, a pruning strategy of removing two samples in the decremental learning procedure is designed to accelerate the training speed and ensure sparsity. The removed data points are utilized to test the temporary learning model, which ensures regression accuracy. Finally, the proposed algorithm is verified on artificial datasets and UCI regression datasets, and experimental results indicate that, compared with several benchmark algorithms, the GRPR-AP-LSSVR algorithm has excellent sparsity and prediction speed without impairing the generalization performance.
Keywords: least square support vector regression (LSSVR); global representative point ranking (GRPR); initial training dataset; pruning strategy; sparsity; regression accuracy
Development of a Quantitative Prediction Support System Using the Linear Regression Method
20
Authors: Jeremie Ndikumagenge, Vercus Ntirandekura. Journal of Applied Mathematics and Physics, 2023, No. 2, pp. 421-427 (7 pages)
The development of prediction supports is a critical step in information systems engineering in this era defined by the knowledge economy, the hub of which is big data. Currently, the lack of a predictive model, whether qualitative or quantitative, suited to a company’s areas of intervention can handicap or weaken its competitive capacities, endangering its survival. For quantitative prediction, a variety of methods and tools are available depending on the efficacy criteria, and the multiple linear regression method is one of them. A linear regression model regresses an explained variable on one or more explanatory variables through a function whose parameters enter linearly. The purpose of this work is to demonstrate how to use multiple linear regression, one aspect of decisional mathematics. Applying multiple linear regression to random data, which can be replaced by real data collected by or from organizations, provides decision makers with reliable knowledge; machine learning methods can thus supply decision makers with relevant and trustworthy information. The main goal of this article is therefore to define the objective function whose influencing factors will be identified for its optimization using the linear regression method.
Keywords: prediction; linear regression; machine learning; least squares method
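The workflow this last paper describes comes down to the least squares method: minimize the sum of squared residuals, here solved via the normal equations. A minimal sketch on simulated data, standing in for the random data the authors mention:

```python
import numpy as np

def fit_linear(X, y):
    """Multiple linear regression by least squares: minimize ||y - Zb||^2
    with Z = [1, X], solved via the normal equations Z'Z b = Z'y."""
    Z = np.column_stack([np.ones(len(y)), X])
    return np.linalg.solve(Z.T @ Z, Z.T @ y)

# two explanatory variables with known coefficients and small noise
rng = np.random.default_rng(3)
X = rng.normal(size=(200, 2))
y = 3.0 + 2.0 * X[:, 0] - 1.0 * X[:, 1] + 0.1 * rng.normal(size=200)
b = fit_linear(X, y)   # b[0] is the intercept, b[1:] the slopes
```

When the explanatory variables are strongly correlated, Z'Z becomes ill-conditioned, which is exactly the situation the PLS and ridge entries above are designed to handle.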