This paper studies the asymptotic normality of the Nelson-Aalen and the Kaplan-Meier estimators in a competing risks context in the presence of independent right-censoring. To prove our results, we use Rebolledo's theorem, which makes it possible to apply the central limit theorem to certain types of martingales. From the results obtained, confidence bounds for the hazard and the survival functions are provided.
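For a single risk, the Nelson-Aalen cumulative hazard estimator discussed above can be sketched in a few lines of Python. This is an illustrative implementation, not the authors' code; the cause-specific version used in a competing risks context simply restricts the event indicator to one cause.

```python
import numpy as np

def nelson_aalen(times, events):
    """Nelson-Aalen estimate of the cumulative hazard from right-censored data.

    times  : observed times (event or censoring)
    events : indicators, 1 = event observed, 0 = censored
    Returns (distinct event times, cumulative hazard at those times).
    """
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    H = 0.0
    event_times, hazard = [], []
    for t in np.unique(times[events == 1]):
        at_risk = np.sum(times >= t)               # size of the risk set just before t
        d = np.sum((times == t) & (events == 1))   # number of events at t
        H += d / at_risk                           # hazard increment d_i / n_i
        event_times.append(t)
        hazard.append(H)
    return np.array(event_times), np.array(hazard)
```

For example, with observations 1, 2, 3, 4, 5 where the third and fifth are censored, the increments are 1/5, 1/4, and 1/2, giving a final cumulative hazard of 0.95.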
In this work, we consider statistical diagnostics for randomly right-censored data based on the K-M product-limit estimator. From the definition of the K-M product-limit estimator, we derive a relation formula between estimators. Analogously to the complete-data case, we define the likelihood displacement and the likelihood ratio statistic. Through a real-data application, we show that our proposed procedure is valid.
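As a concrete reference point for the K-M product-limit estimator used above, a minimal Python sketch (illustrative only; the data below are hypothetical):

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimate of the survival function.

    At each distinct event time t_i the survival curve is multiplied by
    (1 - d_i / n_i), where d_i is the number of events at t_i and n_i the
    number still at risk just before t_i.
    """
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    S = 1.0
    ts, surv = [], []
    for t in np.unique(times[events == 1]):
        n_risk = np.sum(times >= t)
        d = np.sum((times == t) & (events == 1))
        S *= 1.0 - d / n_risk
        ts.append(t)
        surv.append(S)
    return np.array(ts), np.array(surv)
```

With observations 1, 2, 3, 4, 5 and censoring at 3 and 5, the curve steps through 0.8, 0.6, and 0.3.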
A kernel-type estimator of the quantile function Q(p) = inf{t : F(t) ≥ p}, 0 ≤ p ≤ 1, is proposed based on the kernel smoother when the data are subject to random truncation. Bahadur-type representations of the kernel smooth estimator are established, and from these representations the authors show that the estimator is strongly consistent, asymptotically normal, and weakly convergent.
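The flavor of a kernel-type quantile estimator can be sketched in the simplest setting with no truncation: a weighted average of the order statistics with Gaussian kernel weights centered at p. The bandwidth h and the weighting scheme below are illustrative assumptions, not the authors' construction.

```python
import numpy as np

def kernel_quantile(x, p, h=0.05):
    """Kernel-smoothed sample quantile (Gaussian kernel), untruncated sketch.

    Weights the i-th order statistic by K_h(u_i - p), where u_i = (i - 0.5)/n
    is the plotting position of that order statistic.
    """
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    u = (np.arange(1, n + 1) - 0.5) / n          # plotting positions in (0, 1)
    w = np.exp(-0.5 * ((u - p) / h) ** 2)        # Gaussian kernel weights
    w /= w.sum()                                 # normalize to sum to 1
    return np.dot(w, x)                          # weighted average of order stats
```

On the sample 1, ..., 100 the smoothed median comes out at the symmetric center 50.5, while extreme p values shade toward the corresponding tail order statistics.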
Let (X_n, Y_n), n ≥ 0, be i.i.d. nonnegative random vectors with continuous survival distribution function S(s,t), and let Ŝ_n(s,t) be the product-limit estimator of S(s,t) suggested by Campbell and Foldes (1980). In this paper it is shown that, under some conditions, a sequence of Gaussian processes G_n(s,t) can be constructed such that a supremum bound on the approximation error holds almost surely over (s,t) ∈ [0,S] × [0,T], for S, T which together satisfy a certain condition.
In this paper, based on randomly left-truncated and right-censored data, the authors derive strong representations of the cumulative hazard function estimator and the product-limit estimator of the survival function, which are valid up to a given order statistic of the observations. A precise bound for the errors is obtained which depends only on the index of the last order statistic to be included.
A kernel density estimator is proposed when the data are subject to censoring in the multivariate case. The asymptotic normality, strong convergence, and asymptotically optimal bandwidth minimizing the mean square error of the estimator are studied.
Based on randomly left-truncated and right-censored data, we investigate the one-term Edgeworth expansion for the Studentized product-limit estimator, and show that the Edgeworth expansion approximates the exact distribution of the Studentized product-limit estimator with a remainder of order o(n^(-1/2)).
For left-truncated and right-censored data, based on a strong representation of the product-limit estimator of the survival function, we derive a necessary and sufficient condition for the rate of strong uniform convergence of the product-limit estimator over the whole line.
Composite quantile regression should provide an estimation efficiency gain over a single quantile regression. In this paper, we extend composite quantile regression to the nonparametric model with randomly censored data. The asymptotic normality of the proposed estimator is established. The proposed methods are applied to lung cancer data. Extensive simulations are reported, showing that the proposed method works well in practical settings.
Percutaneous vertebroplasty is a minimally invasive procedure in which a fractured vertebral body is filled with bone cement to relieve pain and restore vertebral height. It is a safe and effective treatment widely used for osteoporotic vertebral compression fractures. Despite its advantages over primary conservative management, adjacent-level vertebral compression fracture remains a challenge for surgeons. Adjacent-level vertebral compression fracture following percutaneous vertebroplasty using PMMA cement has been reported as a complication. Numerous risk factors have been reported for the occurrence of new adjacent VCFs after PVP. Multiple-level osteoporotic vertebral compression fractures and increasing patient age are directly related to the risk of developing a new symptomatic adjacent vertebral compression fracture after PVP. Moreover, low BMD and cement leakage are other factors that directly affect the incidence of new symptomatic adjacent vertebral fractures. The aim of this review is to evaluate adjacent-level vertebral compression fracture following percutaneous vertebroplasty on the basis of radiographs and Kaplan-Meier estimation, as well as the factors that lead to adjacent-level vertebral compression fractures.
Multiple myeloma (MM) is a type of cancer that remains incurable. In the last decade, most research into MM has focused on improving the therapeutic strategy. Our study assesses the survival probability of 48 patients diagnosed with MM using parametric and non-parametric techniques. We performed a parametric survival analysis and found that the survival time follows a well-defined probability distribution, the three-parameter lognormal. We then estimated the survival probability and compared it with the commonly used non-parametric Kaplan-Meier analysis of the survival times. The comparison of the survival probability estimates of the two methods revealed a better estimate by the parametric method than by Kaplan-Meier. The parametric survival analysis is more robust and efficient because it is based on a well-defined parametric probability distribution, and is hence preferred over the non-parametric Kaplan-Meier method. This study offers therapeutic significance for further enhancement of the treatment strategy for multiple myeloma.
Receiver operating characteristic (ROC) curves are often used to study the two-sample problem in medical studies. However, most data in medical studies are censored. Usually a natural estimator is based on the Kaplan-Meier estimator. In this paper we propose a smoothed estimator based on kernel techniques for the ROC curve with censored data. The large-sample properties of the smoothed estimator are established. Moreover, deficiency is considered in order to compare the proposed smoothed estimator of the ROC curve with the empirical one based on the Kaplan-Meier estimator. It is shown that the smoothed estimator outperforms the direct empirical estimator based on the Kaplan-Meier estimator under the criterion of deficiency. A simulation study is also conducted and a real data set is analyzed.
Strong limit results for the oscillation modulus of the PL-process are established in this paper for censored data when the density function is not continuous. The rates of convergence of the oscillation modulus of the PL-process are sharp under weak conditions. These results can be used to derive laws of the iterated logarithm for the random-bandwidth kernel estimator and the nearest neighbor estimator of the density, without assuming continuity of the density function.
It is of great interest to estimate the quantile residual lifetime in medical science and many other fields. In survival analysis, the Kaplan-Meier (K-M) estimator has been widely used to estimate the survival distribution. However, it is well known that the K-M estimator is not continuous, so it cannot always be used to calculate the quantile residual lifetime. In this paper, the authors propose a kernel smoothing method to obtain an estimator of the quantile residual lifetime. Using modern empirical process techniques, the consistency and asymptotic normality of the proposed estimator are established. The authors also present the empirical small-sample performance of the estimator. Deficiency is introduced to compare the performance of the proposed estimator with the naive unsmoothed estimator of the quantile residual lifetime. Further simulation studies indicate that the proposed estimator performs very well.
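The naive unsmoothed estimator that the paper compares against, q_p(t0) = inf{u ≥ 0 : S(t0 + u) ≤ (1 − p) S(t0)} with S replaced by the K-M step function, can be sketched as follows (illustrative code, not the authors' implementation):

```python
import numpy as np

def km_curve(times, events):
    """Kaplan-Meier product-limit curve: event times and survival values."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    ts, surv, S = [], [], 1.0
    for t in np.unique(times[events == 1]):
        n_risk = np.sum(times >= t)
        d = np.sum((times == t) & (events == 1))
        S *= 1.0 - d / n_risk
        ts.append(t)
        surv.append(S)
    return np.array(ts), np.array(surv)

def S_at(ts, surv, t):
    """Value of the right-continuous KM step function at time t."""
    idx = np.searchsorted(ts, t, side="right") - 1
    return 1.0 if idx < 0 else surv[idx]

def quantile_residual_life(times, events, t0, p):
    """Naive (unsmoothed) p-th quantile residual lifetime at t0:
    inf{u >= 0 : S(t0 + u) <= (1 - p) * S(t0)} using the KM curve."""
    ts, surv = km_curve(times, events)
    target = (1 - p) * S_at(ts, surv, t0)
    for t, s in zip(ts, surv):
        if t >= t0 and s <= target:
            return t - t0
    return np.inf  # quantile not reached within follow-up
```

Because the KM curve only drops at observed event times, this estimator jumps between order statistics; the kernel smoothing studied in the paper removes exactly this discontinuity.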
Effects of many medical procedures appear after a time lag, when a significant change occurs in subjects' failure rate. This paper focuses on the detection and estimation of such changes, which is important for the evaluation and comparison of treatments and the prediction of their effects. Unlike the classical change-point model, measurements may still be identically distributed, and the change point is a parameter of their common survival function. Some of the classical change-point detection techniques can still be used, but the results are different. Contrary to the classical model, the maximum likelihood estimator of a change point is consistent, even in the presence of nuisance parameters. However, a more efficient procedure can be derived from Kaplan-Meier estimation of the survival function followed by least-squares estimation of the change point. Strong consistency of these estimation schemes is proved. The finite-sample properties are examined by a Monte Carlo study. The proposed methods are applied to a recent clinical trial of a treatment program for strong drug dependence.
The analysis of survival data is a major focus of statistics. Interval-censored data reflect uncertainty as to the exact times at which units failed within an interval. This type of data frequently comes from tests or situations where the objects of interest are not constantly monitored, so events are known only to have occurred between two observation times. Interval censoring has become increasingly common in areas that produce failure time data. This paper explores the statistical analysis of interval-censored failure time data with applications. Three data sets, namely breast cancer, hemophilia, and AIDS data, are used to illustrate the methods. Both parametric and nonparametric methods of analysis are carried out. The theory and methodology of fitted models for interval-censored data are described, and the fitting of parametric and non-parametric models to the three real data sets is considered. Results derived from the different methods are presented and compared.
In cancer survival analysis, it is frequently necessary to estimate confidence intervals for survival probabilities. However, this calculation is not commonly included in the most popular computer packages, or only one method of estimation is provided. In the present paper, we describe a microcomputer program for estimating confidence intervals of survival probabilities when the survival functions are estimated using the Kaplan-Meier product-limit or the life-table method. Five methods of estimation are implemented in the program (SPCI): the classical method (based on Greenwood's formula for the variance of S(t_i)), the Rothman-Wilson method, and the arcsine, log(-log), and logit transformation methods. Two example analyses are given to test the performance of the program.
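Two of the five interval methods, the classical Greenwood-based variance and the log(-log) transformation, can be sketched as follows. This is an illustrative re-implementation under the formulas named above, not the SPCI program itself.

```python
import numpy as np

def km_with_greenwood(times, events):
    """Kaplan-Meier curve plus Greenwood variance at each event time:
    Var(S(t)) = S(t)^2 * sum_{t_i <= t} d_i / (n_i * (n_i - d_i))."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    ts, surv, var = [], [], []
    S, gsum = 1.0, 0.0
    for t in np.unique(times[events == 1]):
        n = np.sum(times >= t)
        d = np.sum((times == t) & (events == 1))
        S *= 1.0 - d / n
        gsum += d / (n * (n - d)) if n > d else np.inf
        ts.append(t)
        surv.append(S)
        var.append(S * S * gsum)
    return np.array(ts), np.array(surv), np.array(var)

def ci_loglog(S, var, z=1.96):
    """log(-log)-transformed CI for a single survival probability S in (0, 1).

    Works on theta = log(-log S), whose standard error by the delta method is
    sqrt(var) / (|log S| * S); back-transforming keeps the bounds in (0, 1).
    """
    se_theta = np.sqrt(var) / (abs(np.log(S)) * S)
    lo = S ** np.exp(z * se_theta)    # larger exponent -> lower bound
    hi = S ** np.exp(-z * se_theta)
    return lo, hi
```

Unlike the plain Greenwood interval S ± z·sqrt(var), the log(-log) interval cannot cross 0 or 1, which is why it is often preferred near the tails of the curve.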
Lung cancer is one of the leading causes of death worldwide, accounting for an estimated 2.1 million cases in 2018. To analyze the risk factors behind lung cancer survival, this paper employs two main models: the Kaplan-Meier estimator and the Cox proportional hazards model [1]. The log-rank test and the Wald test are also utilized to test whether a correlation exists, which is discussed in detail in later parts of the paper. The aim is to find the most influential factors for the survival probability of lung cancer patients. To summarize the results, stage of cancer is always a significant factor for lung cancer survival, and time has to be taken into account when analyzing the survival rate of patients in our data sample, which is from TCGA. Future study of lung cancer is also required to improve treatment, as our data sample might not represent the overall condition of patients diagnosed with lung cancer; moreover, more appropriate and advanced models should be employed to reflect in detail the factors that can affect the survival rate of patients with lung cancer.
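The two-sample log-rank statistic used in such analyses can be computed directly from the risk sets (an illustrative implementation; the function and variable names are our own):

```python
import numpy as np

def logrank(times1, events1, times2, events2):
    """Two-sample log-rank chi-square statistic (1 degree of freedom).

    At each distinct event time, compares the observed events in group 1
    with the expectation under the hypergeometric null, and accumulates
    the corresponding variance.
    """
    t1, e1 = np.asarray(times1, float), np.asarray(events1, int)
    t2, e2 = np.asarray(times2, float), np.asarray(events2, int)
    all_t = np.unique(np.concatenate([t1[e1 == 1], t2[e2 == 1]]))
    O_minus_E, V = 0.0, 0.0
    for t in all_t:
        n1 = np.sum(t1 >= t)                       # at risk in group 1
        n2 = np.sum(t2 >= t)                       # at risk in group 2
        n = n1 + n2
        if n < 2:
            continue                               # no variance contribution
        d1 = np.sum((t1 == t) & (e1 == 1))
        d2 = np.sum((t2 == t) & (e2 == 1))
        d = d1 + d2
        O_minus_E += d1 - d * n1 / n               # observed minus expected
        V += d * (n1 / n) * (n2 / n) * (n - d) / (n - 1)
    return O_minus_E ** 2 / V
```

The resulting statistic is referred to a chi-square distribution with one degree of freedom to obtain the p-value reported alongside a Kaplan-Meier plot.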
We propose a new nonparametric method for assessing the non-inferiority of an experimental therapy compared to a standard of care. The ratio μE/μR of true median survival times is the parameter of interest. This is of considerable interest in clinical trials of generic drugs. We take the ratio mE/mR of the sample medians as a point estimate of the ratio μE/μR. We use the Fieller-Hinkley distribution of the ratio of two normally distributed random variables to derive an unbiased level-α test of the inferiority null hypothesis, which is stated in terms of the ratio μE/μR and a pre-specified fixed non-inferiority margin δ. We also explain how to assess equivalence and non-inferiority using bootstrap equivalence confidence intervals on the ratio μE/μR. The proposed test does not require the censoring distributions for the two arms to be equal, nor does it require the hazard rates to be proportional. If the proportional hazards assumption holds, the proposed test is more attractive. We also discuss sample size determination. We claim that our test procedure is simple and attains adequate power for moderate sample sizes. We extend the proposed test procedure to stratified analysis, and we propose a "two one-sided tests" approach for assessing equivalence.
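The bootstrap interval for the ratio of medians can be sketched in the uncensored case. This is an illustrative percentile bootstrap, not the authors' procedure; with censored data the medians would instead be read off the Kaplan-Meier curves of each arm.

```python
import numpy as np

def bootstrap_ratio_ci(x_e, x_r, n_boot=2000, alpha=0.05, seed=0):
    """Percentile-bootstrap CI for the ratio of sample medians m_E / m_R.

    x_e : observations from the experimental arm (uncensored sketch)
    x_r : observations from the reference arm
    Returns (lower, upper) bounds of the (1 - alpha) interval.
    """
    rng = np.random.default_rng(seed)
    x_e = np.asarray(x_e, dtype=float)
    x_r = np.asarray(x_r, dtype=float)
    ratios = np.empty(n_boot)
    for b in range(n_boot):
        # resample each arm with replacement and recompute the ratio
        m_e = np.median(rng.choice(x_e, size=len(x_e), replace=True))
        m_r = np.median(rng.choice(x_r, size=len(x_r), replace=True))
        ratios[b] = m_e / m_r
    return np.quantile(ratios, [alpha / 2, 1 - alpha / 2])
```

Non-inferiority at margin δ would then be concluded when the lower bound of the interval exceeds 1 − δ; equivalence when the whole interval lies inside (1 − δ, 1 + δ).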
A family of tests for the presence of a regression effect under proportional and non-proportional hazards models is described. The non-proportional hazards model, although not completely general, is very broad and includes a large number of possibilities. In the absence of restrictions, the regression coefficient β(t) can be any real function of time. When β(t) = β, we recover the proportional hazards model, which can then be taken as a special case of the non-proportional hazards model. We study tests of the null hypothesis H0: β(t) = 0 for all t against alternatives such as H1: ∫β(t)dF(t) ≠ 0 or H1: β(t) ≠ 0 for some t. In contrast to now-classical approaches based on partial likelihood and martingale theory, the development here is based on Brownian motion, Donsker's theorem, and theorems from O'Quigley [1] and Xu and O'Quigley [2]. The usual partial likelihood score test arises as a special case. Large-sample theory follows without special arguments, such as the martingale central limit theorem, and is relatively straightforward.
Funding (kernel-type quantile function estimator paper): Zhou's research was partially supported by the NNSF of China (10471140, 10571169); Wu's research was partially supported by the NNSF of China (0571170).
Funding (strong uniform convergence of the product-limit estimator paper): supported by the Postdoctoral Programme Foundation and the National Natural Science Foundation of China (No. 10071092).
Funding (smoothed ROC curve estimator paper): partially supported by the National Natural Science Foundation of China (NSFC) (No. 70911130018, No. 71271128), the National Natural Science Funds for Distinguished Young Scholar (No. 70825004), Creative Research Groups of China (No. 10721101), and Shanghai University of Finance and Economics through Project 211 Phase III and the Shanghai Leading Academic Discipline Project (No. B803).
Funding (oscillation modulus of the PL-process paper): supported by the Fund of National Natural Science (10171103) of China.
Funding (kernel-smoothed quantile residual lifetime paper): supported by the National Natural Science Foundation of China under Grant No. 71271128; the State Key Program of the National Natural Science Foundation of China under Grant No. 71331006; NCMIS; the Key Laboratory of RCSDS, CAS; IRTSHUFE; PCSIRT (IRT13077); and the Graduate Innovation Fund of Shanghai University of Finance and Economics under Grant No. CXJJ-2011-429.