Consider the regression model y_i = x_iβ + g(t_i) + e_i for i = 1, 2, ..., n, where g(·) is an unknown function, β is a parameter to be estimated, and the e_i are random errors. For the case where the design points (x_i, t_i) are i.i.d., an adaptive estimator of β based on a kernel-type estimator of g(·) is investigated, and results on the asymptotically optimal convergence rates of the estimates are obtained. In addition, a family of nonparametric estimates of g(·), which includes the well-known kernel and nearest-neighbor estimates, is proposed. Based on this nonparametric estimate, for the case where the (x_i, t_i) are known and nonrandom, the asymptotic normality of the least squares estimator of β is proved.
Funding: Project supported by the National Natural Science Foundation of China.
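To make the construction concrete, the following is a minimal numerical sketch of a kernel-based fit for this partially linear model. It is an illustration under stated assumptions, not the paper's exact procedure: the Gaussian kernel, the bandwidth h, and the Nadaraya-Watson weights below are choices made only for the example. β is obtained by least squares after partialling out the kernel smooth over t, and g is then estimated from the residuals y_i − x_i·β̂.

```python
# Illustrative sketch only: kernel-weighted least squares for the partially
# linear model y_i = x_i*beta + g(t_i) + e_i. The Gaussian kernel, bandwidth h,
# and Nadaraya-Watson weights are assumptions made for this example.
import numpy as np

def gaussian_kernel(u):
    return np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)

def kernel_weights(t, h):
    """Row-normalized weights W[i, j] = K((t_i - t_j)/h) / sum_k K((t_i - t_k)/h)."""
    diffs = (t[:, None] - t[None, :]) / h
    K = gaussian_kernel(diffs)
    return K / K.sum(axis=1, keepdims=True)

def partially_linear_fit(x, y, t, h=0.1):
    """Estimate beta by least squares after partialling out the kernel smooth in t."""
    W = kernel_weights(t, h)
    x_tilde = x - W @ x              # x_i minus its kernel smooth over t
    y_tilde = y - W @ y              # y_i minus its kernel smooth over t
    beta_hat = (x_tilde @ y_tilde) / (x_tilde @ x_tilde)
    g_hat = W @ (y - x * beta_hat)   # kernel-type estimate of g at the design points
    return beta_hat, g_hat

# Small simulated check: beta_hat should land near the true value 2.0.
rng = np.random.default_rng(0)
n = 500
t = rng.uniform(0.0, 1.0, n)
x = rng.normal(size=n)
y = 2.0 * x + np.sin(2.0 * np.pi * t) + 0.3 * rng.normal(size=n)
beta_hat, g_hat = partially_linear_fit(x, y, t, h=0.05)
print(beta_hat)
```

The design choice here mirrors the abstract's two-step idea: the nonparametric smooth handles g(·), and β is estimated from the smoothed-out residuals; any other weight family (for example nearest-neighbor weights) could be substituted for the kernel weights in kernel_weights.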