Funding: Supported by the Anhui Provincial Natural Science Foundation (Grant No. 11040606M04) and the National Natural Science Foundation of China (Grant Nos. 10871001 and 10971097).
Abstract: The paper considers a multivariate partially linear model under independent errors and investigates the asymptotic bias and variance-covariance of the parametric component β and the nonparametric component F(·) obtained by the GJS estimator and kernel estimation.
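A minimal numerical sketch of the kernel-smoothing step in a partially linear model follows, using a generic profile (partial-residual) estimator rather than the paper's GJS estimator; the simulated model, bandwidth, and Gaussian kernel below are illustrative assumptions.

```python
import numpy as np

def nw_smooth(t_grid, t_obs, values, h):
    """Nadaraya-Watson smoother of `values` observed at t_obs, evaluated at t_grid."""
    w = np.exp(-0.5 * ((t_grid[:, None] - t_obs[None, :]) / h) ** 2)  # Gaussian kernel weights
    w /= w.sum(axis=1, keepdims=True)
    return w @ values

# Hypothetical data from the partially linear model Y = X @ beta + F(t) + eps
rng = np.random.default_rng(0)
n = 500
t = rng.uniform(0.0, 1.0, n)
X = rng.normal(size=(n, 2))
beta_true = np.array([1.5, -2.0])

def F(u):
    return np.sin(2 * np.pi * u)

y = X @ beta_true + F(t) + rng.normal(scale=0.3, size=n)

# Profile (partial-residual) steps:
# 1) remove the kernel-smoothed trend in t from X and y,
# 2) regress the residuals to estimate beta,
# 3) smooth y - X @ beta_hat to recover F on a grid.
h = 0.05
X_tilde = X - nw_smooth(t, t, X, h)
y_tilde = y - nw_smooth(t, t, y, h)
beta_hat, *_ = np.linalg.lstsq(X_tilde, y_tilde, rcond=None)
grid = np.linspace(0.0, 1.0, 100)
F_hat = nw_smooth(grid, t, y - X @ beta_hat, h)
print("beta_hat:", beta_hat)
```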
基金supported by Natural Science Foundation of USA (Grant Nos. DMS1206464 and DMS1613338)National Institutes of Health of USA (Grant Nos. R01GM072611, R01GM100474 and R01GM120507)
Abstract: In the statistics and machine learning communities, the last fifteen years have witnessed a surge of high-dimensional models backed by penalized methods and other state-of-the-art variable selection techniques. The high-dimensional models we refer to differ from conventional models in that the total number of parameters p and the number of significant parameters s are both allowed to grow with the sample size T. When field-specific knowledge is preliminary, and in view of the recent and potential affluence of data from genetics, finance, on-line social networks, etc., such (s, T, p)-triply diverging models enjoy great flexibility in modeling and can be used as a data-guided first step of investigation. However, model selection consistency and other theoretical properties have been addressed only for independent data, leaving time series largely uncovered. For a simple linear regression model endowed with a weakly dependent sequence, this paper applies a penalized least squares (PLS) approach. Under regularity conditions, we show sign consistency, derive a finite-sample bound for the estimation error that holds with high probability, and prove that the PLS estimate is consistent in the L_2 norm with rate (s log s/T)^{1/2}.
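The PLS idea can be sketched with a lasso-type penalty fitted to simulated weakly dependent data; the AR(1) error process, dimensions, and penalty level below are illustrative assumptions rather than the paper's exact setting, and scikit-learn's Lasso stands in for a generic penalized least squares solver.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
T, p, s = 400, 200, 5                        # sample size, ambient and sparse dimensions
beta = np.zeros(p)
beta[:s] = np.array([3.0, -2.0, 1.5, -1.0, 2.5])

X = rng.normal(size=(T, p))
eps = np.zeros(T)
for t in range(1, T):                        # AR(1) errors as a simple weakly dependent sequence
    eps[t] = 0.5 * eps[t - 1] + rng.normal(scale=0.5)
y = X @ beta + eps

lam = np.sqrt(np.log(p) / T)                 # rate-motivated penalty level (illustrative choice)
fit = Lasso(alpha=lam).fit(X, y)

sign_ok = np.array_equal(np.sign(fit.coef_), np.sign(beta))  # empirical check of sign consistency
l2_err = np.linalg.norm(fit.coef_ - beta)                     # L2 estimation error
print("signs recovered:", sign_ok, " L2 error:", l2_err)
```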
Funding: Supported by the National Natural Science Foundation of China under Grant No. 61272041.
Abstract: In this paper, based on spline approximation, the authors propose a unified variable selection approach for the single-index model via an adaptive L1 penalty. Computation of the proposed estimators builds on the well-known LARS algorithm. Under some regularity conditions, the authors establish the asymptotic properties of the proposed estimators and the oracle property of adaptive LASSO (aLASSO) variable selection. Simulations are used to investigate the performance of the proposed estimator and illustrate that it is effective for simultaneous variable selection as well as estimation of single-index models.
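A minimal sketch of the adaptive L1 (aLASSO) reweighting step in a plain linear model; the spline approximation of the single-index link and the LARS-based computation from the paper are not reproduced, and the simulated data, penalty level, and OLS initial estimator below are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(2)
n, p = 300, 10
X = rng.normal(size=(n, p))
beta = np.array([2.0, 0.0, -1.5, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0])
y = X @ beta + rng.normal(scale=0.5, size=n)

# Step 1: an initial consistent estimate (ordinary least squares here).
beta_init = LinearRegression(fit_intercept=False).fit(X, y).coef_

# Step 2: adaptive weights w_j = 1/|beta_init_j|; dividing each column of X by w_j
# turns the adaptively weighted L1 problem into a plain lasso problem.
w = 1.0 / np.abs(beta_init)
X_scaled = X / w
fit = Lasso(alpha=0.05, fit_intercept=False).fit(X_scaled, y)
beta_alasso = fit.coef_ / w                  # map back to the original scale
print(np.round(beta_alasso, 2))              # small true coefficients shrink to exactly zero
```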