Funding: Supported by the National Natural Science Foundation of China (Grant No. 11571219), the Open Research Fund Program of the Key Laboratory of Mathematical Economics (SUFE), Ministry of Education (Grant No. 201309KF02), and the Changjiang Scholars and Innovative Research Team in University (Grant No. IRT13077).
Abstract: In practice, predictors often possess natural grouping structures, and incorporating such information can improve statistical modeling and inference. In addition, high dimensionality often leads to collinearity problems. The elastic net is well suited to this setting because it tends to exhibit a grouping effect. In this paper, we consider group selection and estimation in the sparse linear regression model when predictors can be grouped. We investigate a group adaptive elastic-net and derive oracle inequalities and model consistency for the case where the number of groups is larger than the sample size. The oracle property is established for the case of a fixed number of groups. We adapt the locally approximated coordinate descent algorithm for computation. Simulation and real-data studies indicate that the group adaptive elastic-net is a competitive alternative for model selection in high-dimensional problems where the number of groups exceeds the sample size.
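As an illustration of the kind of criterion described above, the following Python sketch evaluates a group adaptive elastic-net objective: a least-squares loss plus an adaptively weighted group-lasso term and a ridge term. The exact penalty form, the weights, and all function and variable names are assumptions made for illustration rather than the paper's formulation, and the paper's locally approximated coordinate descent algorithm is not reproduced here.

    # Hypothetical sketch of a group adaptive elastic-net objective for linear
    # regression; the penalty form and the adaptive weights are assumptions,
    # not the paper's exact formulation.
    import numpy as np

    def group_adaptive_enet_objective(beta, X, y, groups, weights, lam1, lam2):
        """Least-squares loss + weighted group-lasso term + ridge term.

        beta    : (p,) coefficient vector
        X, y    : design matrix (n, p) and response (n,)
        groups  : list of index arrays, one per predictor group
        weights : adaptive weights, one per group (e.g. from an initial fit)
        lam1    : group-lasso tuning parameter
        lam2    : ridge tuning parameter
        """
        n = X.shape[0]
        loss = 0.5 / n * np.sum((y - X @ beta) ** 2)
        group_pen = sum(w * np.linalg.norm(beta[g]) for g, w in zip(groups, weights))
        ridge_pen = np.sum(beta ** 2)
        return loss + lam1 * group_pen + lam2 * ridge_pen

    # Example: 3 groups of 2 predictors each; the second group is inactive.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((50, 6))
    beta_true = np.array([1.0, -1.0, 0.0, 0.0, 0.5, 0.5])
    y = X @ beta_true + 0.1 * rng.standard_normal(50)
    groups = [np.arange(0, 2), np.arange(2, 4), np.arange(4, 6)]
    weights = [1.0, 1.0, 1.0]
    print(group_adaptive_enet_objective(beta_true, X, y, groups, weights, 0.1, 0.01))

The grouped L2 norm shrinks whole groups toward zero, while the ridge term encourages correlated predictors within a selected group to receive similar coefficients, which is the grouping effect the abstract refers to.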
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 11171112, 11101114 and 11201190) and the National Statistical Science Research Major Program of China (Grant No. 2011LZ051).
Abstract: In many applications, covariates can be naturally grouped. For example, in gene expression data analysis, genes belonging to the same pathway might be viewed as a group. This paper studies the variable selection problem for censored survival data in the additive hazards model when covariates are grouped. A hierarchical regularization method is proposed to simultaneously estimate parameters and select important variables at both the group level and the within-group level. For the situation in which the number of parameters tends to infinity as the sample size increases, we establish the oracle property and asymptotic normality of the proposed estimators. Numerical results indicate that the hierarchically penalized method performs better than existing methods such as the lasso, smoothly clipped absolute deviation (SCAD), and the adaptive lasso.
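To make the idea of selection at both the group level and the within-group level concrete, here is a minimal Python sketch of a hierarchical (group-bridge-type) penalty. The exponent, the absence of group weights, and all names are assumptions for illustration and may differ from the penalty actually proposed in the paper; the additive hazards loss itself is not implemented.

    # Hypothetical sketch of a hierarchical penalty that induces sparsity at
    # both the group level and the within-group level; the paper's exact
    # penalty may differ.
    import numpy as np

    def hierarchical_penalty(beta, groups, lam, gamma=0.5):
        """Sum over groups of (L1 norm within the group) ** gamma.

        With 0 < gamma < 1, an entire group can be shrunk to zero
        (group-level selection), while the inner L1 norm can still zero
        out individual coefficients inside retained groups
        (within-group selection).
        """
        return lam * sum(np.sum(np.abs(beta[g])) ** gamma for g in groups)

    # Example: the second group is entirely inactive.
    beta = np.array([1.2, 0.0, -0.4, 0.0, 0.0, 0.0])
    groups = [np.arange(0, 3), np.arange(3, 6)]
    print(hierarchical_penalty(beta, groups, lam=0.5))

In a full procedure this penalty would be added to the model's loss (here, the additive hazards estimating-equation criterion) and minimized over beta.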