Funding: Project supported by the Natural Science Foundation of Fujian Province.
Abstract: When multicollinearity is present in a set of regression variables, the least squares estimate of the regression coefficients tends to be unstable and may lead to erroneous inference. In this paper, the generalized ridge estimate, with ridge parameter matrix K, of the regression coefficient vec(B) is considered in the multivariate linear regression model. The MSE of this estimate is smaller than the MSE of the least squares estimate for a suitable choice of the ridge parameter matrix K. Moreover, it is pointed out that the criterion MSE for choosing the matrix K of the generalized ridge estimate has several weaknesses. To overcome these weaknesses, a new family of criteria Q(c) is adopted, which includes the criterion MSE and the criterion LS as special cases. The good properties of the criteria Q(c) are proved and discussed from a theoretical point of view. The statistical meaning of the scale c is explained, and methods for determining c are also given.
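The core construction in this abstract can be illustrated with a minimal NumPy sketch (not the authors' code): under near-collinear regressors, the generalized ridge estimate replaces (X'X)⁻¹X'y with (X'X + K)⁻¹X'y for a ridge parameter matrix K. The data and the values in K are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated ill-conditioned design: two nearly collinear columns.
n = 100
x1 = rng.normal(size=n)
X = np.column_stack([x1, x1 + 1e-3 * rng.normal(size=n)])
beta = np.array([1.0, 2.0])
y = X @ beta + rng.normal(scale=0.5, size=n)

# Ordinary least squares estimate: unstable under multicollinearity.
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Generalized ridge estimate with a ridge parameter matrix K
# (a diagonal K with hypothetical entries, chosen for illustration).
K = np.diag([0.1, 0.1])
beta_ridge = np.linalg.solve(X.T @ X + K, X.T @ y)
```

For a positive multiple of the identity, as here, the ridge estimate is a strict shrinkage of the least squares estimate; choosing K to minimize MSE, and the criteria Q(c) for doing so, is the subject of the paper.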
Abstract: Longitudinal trends of observations can be estimated using the generalized multivariate analysis of variance (GMANOVA) model proposed by [10]. In the present paper, we consider estimating the trends nonparametrically using known basis functions. Then, as in nonparametric regression, an overfitting problem occurs. [13] showed that the GMANOVA model is equivalent to the varying coefficient model with non-longitudinal covariates. Hence, as in the ordinary linear regression model, when the number of covariates becomes large, the estimator of the varying coefficients becomes unstable. In the present paper, we avoid the overfitting problem and the instability problem by applying the concepts behind penalized smoothing spline regression and multivariate generalized ridge regression. In addition, we propose two criteria to optimize the hyperparameters, namely, a smoothing parameter and ridge parameters. Finally, we compare the ordinary least squares estimator and the new estimator.
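The overfitting issue the abstract describes, and the ridge-type remedy, can be sketched in NumPy under simplifying assumptions (a single subject, a polynomial basis, and a hypothetical smoothing parameter `lam`; the paper's actual GMANOVA setting and criteria are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(1)

# Longitudinal time points and a smooth true trend plus noise.
t = np.linspace(0, 1, 40)
y = np.sin(2 * np.pi * t) + rng.normal(scale=0.2, size=t.size)

# Known basis functions: a polynomial basis of degree q.
# A rich basis makes the OLS coefficients unstable (overfitting).
q = 12
B = np.vander(t, q + 1, increasing=True)
w_ols = np.linalg.lstsq(B, y, rcond=None)[0]

# A ridge penalty stabilizes the fit (lam is an illustrative value;
# the paper proposes criteria for choosing such hyperparameters).
lam = 1e-3
w_ridge = np.linalg.solve(B.T @ B + lam * np.eye(q + 1), B.T @ y)
fit = B @ w_ridge
```

The design choice mirrors the abstract: the basis is fixed and known, and regularization, rather than basis truncation, controls the effective flexibility.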
Funding: Supported by the Key Project of the Chinese Ministry of Education (209078) and the Scientific Research Item of the Hubei Provincial Department of Education (D20092207).
Abstract: We consider the following semiparametric regression model: y_i = x_i^T β + s(t_i) + e_i (i = 1, 2, ..., n). First, the generalized ridge estimators of both the parametric and nonparametric components are given without restricting the design matrix. Second, the generalized ridge estimator is compared with the penalized least squares estimator under mean squared error, and some conditions under which the former excels the latter are given. Finally, the validity and feasibility of the method are illustrated by a simulation example.
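A rough NumPy sketch of this semiparametric setup may help fix ideas. It is not the authors' estimator: the spline basis, knot placement, and penalty weight are all hypothetical choices, and s(t) is represented by a simple truncated-power (linear spline) basis with a ridge penalty on the spline part only.

```python
import numpy as np

rng = np.random.default_rng(2)

# Semiparametric model: y_i = x_i^T beta + s(t_i) + e_i.
n = 80
X = rng.normal(size=(n, 2))
t = np.sort(rng.uniform(size=n))
beta = np.array([1.5, -0.5])
y = X @ beta + np.sin(2 * np.pi * t) + rng.normal(scale=0.1, size=n)

# Represent s(t) with a linear truncated-power basis (hypothetical knots).
knots = np.linspace(0.1, 0.9, 9)
B = np.column_stack([t] + [np.clip(t - k, 0.0, None) for k in knots])

# Generalized-ridge-style fit: penalize only the nonparametric block.
Z = np.hstack([X, B])
K = np.zeros((Z.shape[1], Z.shape[1]))
K[2:, 2:] = 0.01 * np.eye(B.shape[1])
coef = np.linalg.solve(Z.T @ Z + K, Z.T @ y)
beta_hat, s_coef = coef[:2], coef[2:]
```

Because only the spline coefficients are penalized, the parametric part β is estimated with essentially no shrinkage, which is the usual convention in penalized semiparametric regression.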
Funding: Funded by the National Natural Science Foundation of China (No. 40274005).
Abstract: The solution properties of the semiparametric model are analyzed; in particular, penalized least squares for the semiparametric model becomes invalid when the matrix B^T P B is ill-posed or singular. Following the principle of the ridge estimate for the linear parametric model, generalized penalized least squares for the semiparametric model is put forward, and some formulae and statistical properties of the estimates are derived. Finally, some helpful conclusions are drawn from simulation examples.
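The failure mode named here, and the ridge-style repair, can be shown in a few lines of NumPy. The 2×2 matrix below is a toy stand-in for a singular B^T P B, and the ridge weight 0.1 is an illustrative value:

```python
import numpy as np

# A rank-deficient stand-in for B^T P B: plain penalized least
# squares (a direct solve) fails on it.
A = np.array([[1.0, 1.0], [1.0, 1.0]])
b = np.array([1.0, 1.0])
try:
    np.linalg.solve(A, b)
    solvable = True
except np.linalg.LinAlgError:
    solvable = False  # singular matrix: the plain solve is invalid

# Adding a ridge term R (the generalized penalized least squares
# idea) restores a well-posed, solvable system.
R = 0.1 * np.eye(2)
x = np.linalg.solve(A + R, b)
```

The same mechanism underlies ridge estimation generally: A + R is positive definite whenever R is, even if A itself is singular.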
Funding: The National Natural Science Foundation of China (Nos. 60736047, 10671007, 60772036) and the Foundation of Beijing Jiaotong University (Nos. 2006XM037, 2007XM046).
Abstract: In this paper, we propose a new biased estimator of the regression parameters: the generalized ridge and principal correlation estimator. We present some of its properties and prove that it is superior to the LSE (least squares estimator), the principal correlation estimator, and the ridge and principal correlation estimator under the MSE (mean squared error) and PMC (Pitman closeness) criteria, respectively.