Abstract: Estimation of density and hazard rate is very important in the reliability analysis of a system. To estimate the density and hazard rate of a system with a monotonically decreasing hazard rate, a new nonparametric estimator is put forward. The estimator is based on the kernel function method and an optimization algorithm. Numerical experiments show that the method is sufficiently accurate and can be used in many cases.
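The core construction behind kernel hazard-rate estimation can be sketched as the ratio of a kernel density estimate to a survival estimate, h(t) = f(t) / S(t). A minimal sketch follows; the Gaussian kernel and the empirical survival function are illustrative stand-ins, not the paper's estimator or its optimization step.

```python
import numpy as np

def kernel_hazard(t, data, bandwidth):
    """Kernel estimate of the hazard rate h(t) = f(t) / S(t).

    f(t) is a Gaussian kernel density estimate and S(t) the empirical
    survival function; both choices are illustrative, not the
    estimator proposed in the paper.
    """
    data = np.asarray(data, dtype=float)
    n = data.size
    # Gaussian kernel density estimate of the density f at t
    u = (t - data) / bandwidth
    f_hat = np.exp(-0.5 * u ** 2).sum() / (n * bandwidth * np.sqrt(2.0 * np.pi))
    # empirical survival function S(t) = P(T > t)
    s_hat = (data > t).mean()
    return f_hat / s_hat if s_hat > 0 else float("inf")

rng = np.random.default_rng(0)
sample = rng.exponential(scale=2.0, size=5000)  # true hazard is constant 0.5
estimate = kernel_hazard(1.0, sample, bandwidth=0.3)
```

With exponential data the true hazard is constant, so the estimate at any interior point should land near 0.5.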
Abstract: In this paper, we define the Weibull kernel and use it for nonparametric estimation of the probability density function (pdf) and the hazard rate function for independent and identically distributed (iid) data. The bias, variance, and optimal bandwidth of the proposed estimator are investigated, as is its asymptotic normality. The performance of the proposed estimator is tested using a simulation study and real data.
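An asymmetric-kernel estimator of this kind averages, over the sample, a Weibull density whose parameters depend on the evaluation point x and a smoothing parameter b. The parameterization below (shape 1/b, scale x / Γ(1 + b)) follows one common convention in the asymmetric-kernel literature; the paper's exact form may differ, so treat this as an assumption-laden sketch.

```python
import numpy as np
from math import gamma

def weibull_kernel_pdf(x, data, b):
    """Nonparametric pdf estimate at x > 0 using a Weibull kernel.

    Assumed parameterization: shape 1/b, scale x / Gamma(1 + b).
    This is one common convention, not necessarily the paper's.
    """
    data = np.asarray(data, dtype=float)
    shape = 1.0 / b
    scale = x / gamma(1.0 + b)
    t = data / scale
    # Weibull(shape, scale) density evaluated at every observation
    k = (shape / scale) * t ** (shape - 1.0) * np.exp(-t ** shape)
    return k.mean()

rng = np.random.default_rng(1)
sample = rng.exponential(scale=1.0, size=5000)  # true pdf at 1.0 is e^{-1}
estimate = weibull_kernel_pdf(1.0, sample, b=0.05)
```

Because the kernel's support is the positive half-line, no probability mass leaks below zero, which is the usual motivation for Weibull (and gamma) kernels near the boundary.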
Abstract: In a survival analysis context, we suggest a new method to estimate the piecewise constant hazard rate model. The method provides an automatic procedure to find the number and location of cut points and to estimate the hazard on each cut interval. Estimation is performed through a penalized likelihood using an adaptive ridge procedure. A bootstrap procedure is proposed to derive valid statistical inference, taking into account both the variability of the estimate and the variability in the choice of the cut points. The new method is applied both to simulated data and to the Mayo Clinic trial on primary biliary cirrhosis. The algorithm implementation is seen to work well and to be of practical relevance.
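Given a fixed set of cut points, the maximum likelihood estimate of a piecewise-constant hazard on each interval is simply events observed divided by person-time at risk. The sketch below shows only that inner estimation step; the paper's contribution, choosing the cuts adaptively via a penalized likelihood with an adaptive ridge, is not reproduced.

```python
import numpy as np

def piecewise_hazard(times, events, cuts):
    """MLE of a piecewise-constant hazard on *fixed* cut points:
    on each interval, rate = events observed / person-time at risk."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=bool)
    edges = np.concatenate(([0.0], np.asarray(cuts, dtype=float), [np.inf]))
    rates = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        # person-time each subject contributes to [lo, hi)
        exposure = np.clip(np.minimum(times, hi) - lo, 0.0, None).sum()
        deaths = ((times >= lo) & (times < hi) & events).sum()
        rates.append(deaths / exposure if exposure > 0 else 0.0)
    return np.array(rates)

rng = np.random.default_rng(2)
t = rng.exponential(scale=2.0, size=5000)  # constant true hazard 0.5
rates = piecewise_hazard(t, np.ones_like(t, dtype=bool), cuts=[1.0])
```

With a constant true hazard, both interval estimates should agree, which is the behavior the adaptive ridge exploits to merge adjacent intervals.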
Funding: Supported by the National Natural Science Foundation of China (No. 10071081) and a Special Foundation of USTC.
Abstract: Let F(x) be a distribution function supported on [0, ∞), with equilibrium distribution function Fe(x). In this paper we study the function $r_e(x) = (-\ln \overline{F}_e(x))' = \overline{F}(x) / \int_x^\infty \overline{F}(u)\,du$, which is called the equilibrium hazard rate of F. Using the limiting behavior of re(x), we give a criterion for identifying F as heavy-tailed or light-tailed. Two broad classes of heavy-tailed distributions are also introduced and studied.
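The equilibrium hazard rate can be evaluated numerically for any survival function. A minimal sketch, truncating the tail integral at a point where the survival function is negligible: for the exponential law the equilibrium hazard is constant and equals the rate parameter, which gives a quick sanity check.

```python
import numpy as np

def equilibrium_hazard(sf, x, upper=200.0, n=200001):
    """Numerically evaluate r_e(x) = S(x) / integral_x^inf S(u) du
    for a survival function S, truncating the integral at `upper`
    (adequate when the tail beyond that point is negligible)."""
    u = np.linspace(x, upper, n)
    s = sf(u)
    # trapezoid rule for the tail integral
    tail = ((s[:-1] + s[1:]) / 2.0 * np.diff(u)).sum()
    return sf(x) / tail

lam = 0.5
sf = lambda t: np.exp(-lam * t)    # exponential survival function
r_e = equilibrium_hazard(sf, 1.0)  # equals lam for the exponential law
```

For heavy-tailed F the same computation shows r_e(x) tending to 0 as x grows, which is the behavior the paper's criterion is built on.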
Abstract: Objective: To compare five-year survival after surgery for the 116 breast cancer patients treated at the First Teaching Hospital (FTH) and the 866 breast cancer patients treated at Hôpital du Saint-Sacrement (HSS). Methods: A Cox regression model was used, after adjusting for confounders, to compare the five-year average hazard rates between the two hospitals and among the levels of prognostic factors. Results: A significant difference between the two hospitals was found for older patients (50 years of age or more). Conclusion: Tumor size at pathology and lymph node involvement were important prognostic factors.
Abstract: Regression models for survival time data involve estimation of the hazard rate as a function of predictor variables and associated slope parameters. An adaptive approach is formulated for such hazard regression modeling. The hazard rate is modeled using fractional polynomials, that is, linear combinations of products of power transforms of time together with other available predictors. These fractional polynomial models are restricted to generating positive-valued hazard rates and decreasing survival functions. Exponentially distributed survival times are a special case. Parameters are estimated using maximum likelihood estimation allowing for right-censored survival times. Models are evaluated and compared using likelihood cross-validation (LCV) scores. LCV scores and tolerance parameters are used to control an adaptive search through alternative fractional polynomial hazard rate models to identify effective models for the underlying survival time data. These methods are demonstrated using two different survival time data sets, covering survival times for lung cancer patients and for multiple myeloma patients. For the lung cancer data, the hazard rate depends distinctly on time. However, controlling for cell type provides a distinct improvement, after which the hazard rate depends only on cell type and no longer on time. Furthermore, Cox regression is unable to identify a cell type effect. For the multiple myeloma data, the hazard rate also depends distinctly on time. Moreover, consideration of hemoglobin at diagnosis provides a distinct improvement; the hazard rate still depends distinctly on time, and hemoglobin distinctly moderates the effect of time on the hazard rate. These results indicate that adaptive hazard rate modeling can provide unique insights into survival time data.
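The positivity restriction described above is commonly enforced by exponentiating the fractional-polynomial linear predictor. A minimal sketch of such a hazard function follows; the powers and coefficients are illustrative placeholders, not values fitted by the adaptive LCV search.

```python
import numpy as np

def fp_hazard(t, powers, betas):
    """Fractional-polynomial hazard rate, exponentiated so it is
    always positive-valued:
        h(t) = exp(b0 + sum_j b_j * t**p_j).
    The powers and coefficients are illustrative, not fitted values.
    """
    t = np.asarray(t, dtype=float)
    log_h = betas[0] + sum(b * t ** p for p, b in zip(powers, betas[1:]))
    return np.exp(log_h)

# a hazard that rises with time through sqrt(t) and t terms
h = fp_hazard([0.5, 1.0, 2.0], powers=[0.5, 1.0], betas=[-1.0, 0.3, 0.2])
```

Setting all time powers' coefficients to zero recovers a constant hazard, i.e. the exponentially distributed special case mentioned in the abstract.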
Abstract: Recurrent event time data, and more general multiple event time data, are commonly analyzed using extensions of Cox regression, or proportional hazards regression, as used with single event time data. These methods treat covariates, either time-invariant or time-varying, as having multiplicative effects, while general dependence on time is left unestimated. An adaptive approach is formulated for analyzing multiple event time data. Conditional hazard rates are modeled in terms of dependence on both time and covariates using fractional polynomials, restricted so that the conditional hazard rates are positive-valued and the excess time probability functions (which generalize survival functions for single event times) are decreasing. Maximum likelihood is used to estimate parameters, adjusting for right-censored event times. Likelihood cross-validation (LCV) scores are used to compare models. Adaptive searches through alternative conditional hazard rate models are controlled by LCV scores combined with tolerance parameters. These searches identify effective models for the underlying multiple event time data. Conditional hazard regression is demonstrated using data on times between tumor recurrences for bladder cancer patients. Analyses of theory-based models for these data using extensions of Cox regression provide conflicting results on the effects of treatment group and the initial number of tumors. On the other hand, fractional polynomial analyses of these theory-based models provide consistent results, identifying significant effects of treatment group and initial number of tumors using both model-based and robust empirical tests. Adaptive analyses further identify distinct moderation by group of the effect of tumor order, and an additive effect of group after controlling for nonlinear effects of initial number of tumors and tumor order. Results of example analyses indicate that adaptive conditional hazard rate modeling can generate useful insights into multiple event time data.
Abstract: A central limit theorem for the integrated square error (ISE) of kernel hazard rate estimators is obtained based on left-truncated and right-censored data. An asymptotic representation of the mean integrated square error (MISE) for the kernel hazard rate estimators is also presented.
Funding: This research is supported by the National Natural Science Foundation of China.
Abstract: Functional laws of the iterated logarithm are obtained for cumulative hazard processes in the neighborhood of a fixed point when the data are subject to left truncation and right censorship. On the basis of these results, the exact rates of pointwise almost sure convergence for various types of kernel hazard rate estimators are derived.
Abstract: A software reliability model is a tool for measuring software reliability quantitatively, and the hazard-rate model is one of the most popular. The purpose of our research is to propose a hazard-rate model that takes fault severity level into account for Open Source Software (OSS). Moreover, we aim to adapt our proposed model to a hazard rate that accounts for an imperfect debugging environment. We analyzed the trend of fault severity levels using fault data in a Bug Tracking System (BTS) and proposed our model based on the results of that analysis. We also present a numerical example evaluating the performance of our proposed model. Furthermore, we extended the proposed model to a hazard rate that accounts for imperfect debugging, and we present a numerical example evaluating its applicability. As a result, we found that the performance of our proposed model is better than that of typical hazard-rate models, and we verified that the proposed model can be applied to a hazard-rate model that accounts for imperfect debugging.
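For context on what "typical hazard-rate models" look like, the Jelinski-Moranda model is a classic example: the hazard during the k-th failure interval is proportional to the number of faults still remaining, and removing a fault lowers the hazard by a fixed amount. That perfect-debugging assumption is exactly what an imperfect-debugging extension relaxes. A minimal sketch:

```python
def jelinski_moranda_hazard(k, n_faults, phi):
    """Hazard rate during the k-th failure interval in the classic
    Jelinski-Moranda model: z_k = phi * (N - k + 1).
    Each removed fault lowers the hazard by phi, i.e. debugging is
    assumed perfect."""
    if not 1 <= k <= n_faults:
        raise ValueError("k must lie in 1..N")
    return phi * (n_faults - k + 1)

# hazard before any fix, and after 49 fixes, with N = 100, phi = 0.01
z1 = jelinski_moranda_hazard(1, 100, 0.01)    # 1.00
z50 = jelinski_moranda_hazard(50, 100, 0.01)  # 0.51
```

Fault-severity and imperfect-debugging variants replace the constant step phi with terms that depend on the severity level and on the probability that a fix actually removes the fault.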
Abstract: The software reliability model is a stochastic model for measuring software reliability quantitatively, and the hazard-rate model is a well-known, typical software reliability model. We propose hazard-rate models considering fault severity levels (CFSL) for Open Source Software (OSS). The purpose of this research is to adapt the hazard-rate model considering CFSL to a baseline hazard function and two kinds of fault data in a Bug Tracking System (BTS), i.e., we use covariate vectors in a Cox proportional hazard-rate model. We also show numerical examples evaluating the performance of our proposed model. As a result, we compare the performance of our model with the hazard-rate model CFSL.
Abstract: In general, simple subsystems such as series or parallel systems are integrated to produce a complex hybrid system. The reliability of a system is determined by the reliability of its constituent components. It is often extremely difficult or impossible to get specific information about the component that caused the system to fail; unknown failure causes are instances in which the actual cause of system failure is unknown. On the other hand, thanks to current advanced technology based on computers, automation, and simulation, products have become incredibly dependable and trustworthy, and as a result, obtaining failure data for testing such exceptionally reliable items has become a very costly and time-consuming procedure. Therefore, because of its capacity to produce rapid and adequate failure data in a short period of time, accelerated life testing (ALT) is the most widely used approach in the field of product reliability and life testing. Based on progressively hybrid censored (PrHC) data from a three-component parallel-series hybrid system that failed owing to unknown causes, this paper investigates a challenging problem of parameter estimation and reliability assessment under a step-stress partially accelerated life test (SSPALT). Failures of components are assumed to follow a power linear hazard rate (PLHR) distribution, which can be used when the failure rate displays linear, decreasing, increasing, or bathtub failure patterns. The tempered random variable (TRV) model is considered to reflect the effect of the high stress level used to induce early failure data. The maximum likelihood estimation (MLE) approach is used to estimate the parameters of the PLHR distribution and the acceleration factor. A variance-covariance matrix (VCM) is then obtained to construct approximate confidence intervals (ACIs). In addition, studentized bootstrap confidence intervals (ST-B CIs) are also constructed and compared with the ACIs in terms of their respective interval lengths (ILs). Moreover, a simulation study is conducted to demonstrate the performance of the estimation procedures and the methodology discussed in this paper. Finally, real failure data from the air conditioning systems of an airplane are used to further illustrate the performance of the suggested estimation technique.
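All the hazard shapes mentioned above (linear, increasing, decreasing, bathtub) are tied to the survival function through the same identity, S(t) = exp(-∫₀ᵗ h(u) du). A minimal numerical sketch of that identity follows, checked against the closed form for a linear hazard h(t) = a + b·t, one of the simple patterns the PLHR family covers; the specific PLHR parameterization from the paper is not reproduced.

```python
import numpy as np

def survival_from_hazard(hazard, t, n=20001):
    """S(t) = exp(-integral_0^t h(u) du), evaluated with the
    trapezoid rule; works for any hazard shape (linear, increasing,
    decreasing, bathtub)."""
    u = np.linspace(0.0, t, n)
    h = hazard(u)
    cum = ((h[:-1] + h[1:]) / 2.0 * np.diff(u)).sum()
    return np.exp(-cum)

# linear hazard h(t) = a + b*t has the closed form
# S(t) = exp(-(a*t + b*t**2 / 2))
a, b = 0.2, 0.1
s = survival_from_hazard(lambda u: a + b * u, 2.0)  # exp(-0.6)
```

Because the trapezoid rule is exact for linear integrands, the numerical value here matches the closed form to floating-point accuracy; for nonlinear PLHR hazards the same routine gives a controllable approximation.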