Abstract: Consider the regression model Y = Xβ + g(T) + e, where g is an unknown smooth function on [0, 1], β is an l-dimensional parameter to be estimated, and e is an unobserved error. When the data are randomly censored, estimators βn* and gn* of β and g are obtained using the class K and least-squares methods. It is shown that βn* is asymptotically normal and that gn* achieves the convergence rate O(n^(-1/3)).
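For readers unfamiliar with partially linear models, the sketch below illustrates the uncensored version of this setup with a Robinson-type profile least-squares fit: the covariates and the response are first smoothed on T, and β is estimated from the residuals. It deliberately ignores censoring (the paper's class K estimators first transform the censored responses), and the kernel smoother, bandwidth, and simulated data are illustrative assumptions, not the authors' method.

```python
import numpy as np

def nw_smooth(t_eval, t, y, h=0.05):
    """Nadaraya-Watson (Gaussian-kernel) regression of y on t, evaluated at t_eval."""
    d = (np.asarray(t_eval)[:, None] - np.asarray(t)[None, :]) / h
    w = np.exp(-0.5 * d ** 2)
    return (w @ y) / w.sum(axis=1)

rng = np.random.default_rng(0)
n = 500
t = rng.uniform(0, 1, n)
X = rng.normal(size=(n, 2)) + np.column_stack([np.sin(2 * np.pi * t), t])
beta = np.array([1.0, -0.5])
y = X @ beta + np.cos(2 * np.pi * t) + 0.3 * rng.normal(size=n)

# Partial out the nonparametric component, then run least squares on the residuals.
X_tilde = X - np.column_stack([nw_smooth(t, t, X[:, j]) for j in range(X.shape[1])])
y_tilde = y - nw_smooth(t, t, y)
beta_hat, *_ = np.linalg.lstsq(X_tilde, y_tilde, rcond=None)
g_hat = nw_smooth(t, t, y - X @ beta_hat)   # nonparametric estimate of g
print(beta_hat)
```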
Abstract: Yu et al. (2012) considered a certain dependent right censorship model. We show that this model is equivalent to the independent right censorship model, extending a result of Williams and Lagakos (1977) that required a continuity restriction. The asymptotic normality of the product-limit estimator under the dependent right censorship model then follows from existing results in the literature for the independent right censorship model, which partially solves an open problem in the literature.
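The product-limit (Kaplan-Meier) estimator referred to here can be computed directly from the observed times and censoring indicators; a minimal NumPy sketch (toy data, not from the paper) follows.

```python
import numpy as np

def kaplan_meier(time, event):
    """Product-limit (Kaplan-Meier) estimator.
    time:  observed times (minimum of lifetime and censoring time)
    event: 1 if the observation is an actual failure, 0 if right-censored
    Returns the distinct failure times and the estimated survival function there."""
    time = np.asarray(time, float)
    event = np.asarray(event, int)
    t_out, surv, s = [], [], 1.0
    for t in np.unique(time[event == 1]):
        d = np.sum((time == t) & (event == 1))   # failures at t
        at_risk = np.sum(time >= t)              # number at risk just before t
        s *= 1.0 - d / at_risk
        t_out.append(t)
        surv.append(s)
    return np.array(t_out), np.array(surv)

# toy usage
t_obs = [3, 5, 5, 8, 10, 12]
delta = [1, 0, 1, 1, 0, 1]
print(kaplan_meier(t_obs, delta))
```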
Funding: Funded by the Deanship of Scientific Research and Libraries, Princess Nourah bint Abdulrahman University, through the Program of Research Project Funding after Publication, Grant No. (RPFAP-34-1445).
Abstract: A novel inverted generalized gamma (IGG) distribution, proposed for modelling data with an upside-down bathtub hazard rate, is considered. In many real-world situations, when a researcher wants to conduct a comparative life-testing study of items based on cost and duration of testing, censoring strategies are frequently used. From this point of view, in the presence of censored data compiled from the well-known progressive Type-II censoring technique, this study examines the different parameters of the IGG distribution. From a classical point of view, the likelihood and product of spacings estimation methods are considered. The observed Fisher information and the delta method are used to obtain approximate confidence intervals for any unknown parametric function of the suggested model. In the Bayesian paradigm, the same inferential approaches are used to estimate all unknown quantities. Markov chain Monte Carlo steps are used to approximate all Bayes findings. Extensive numerical comparisons are presented to examine the performance of the proposed methodologies using various accuracy criteria. Further, using several optimality criteria, the optimum progressive censoring design is suggested. To highlight how the proposed estimators can be used in practice and to verify the flexibility of the proposed model, we analyze the failure times of twenty mechanical components of a diesel engine.
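Progressively Type-II censored samples of the kind used in this study can be simulated with the standard Balakrishnan-Sandhu inverse-transform algorithm. The sketch below is generic in the baseline quantile function; a Weibull quantile is used as a stand-in, since the IGG quantile function is not reproduced in this abstract.

```python
import numpy as np

def progressive_type2_sample(R, Finv, rng):
    """Balakrishnan-Sandhu algorithm (a hedged sketch): generate a progressively
    Type-II censored sample X_(1) < ... < X_(m) from the CDF with quantile function
    Finv, under removal scheme R = (R_1, ..., R_m); total sample size n = m + sum(R)."""
    R = np.asarray(R)
    m = len(R)
    W = rng.uniform(size=m)
    tail_sums = np.cumsum(R[::-1])               # R_m, R_m+R_{m-1}, ...
    expo = np.arange(1, m + 1) + tail_sums       # exponent i + R_m + ... + R_{m-i+1}
    V = W ** (1.0 / expo)
    U = 1.0 - np.cumprod(V[::-1])                # U_i = 1 - V_m * ... * V_{m-i+1}
    return Finv(U)

rng = np.random.default_rng(0)
R = [2, 0, 0, 1, 2]                              # removal scheme; n = 5 + 5 = 10
weibull_inv = lambda u, k=1.5, s=1.0: s * (-np.log(1.0 - u)) ** (1.0 / k)  # stand-in baseline
print(progressive_type2_sample(R, weibull_inv, rng))
```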
Funding: This research was funded by Princess Nourah bint Abdulrahman University Researchers Supporting Project Number (PNURSP2024R50), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.
Abstract: A new one-parameter Chris-Jerry distribution, created by mixing exponential and gamma distributions, is discussed in this article in the presence of incomplete lifetime data. We examine a novel generalized progressively hybrid censoring technique that ensures the experiment ends at a predefined time when the lifetimes of the test units follow a Chris-Jerry (CJ) distribution. When such censored data are present, Bayes and likelihood estimation are used to explore the CJ parameter and reliability indices, including the hazard rate and reliability functions. We obtain the asymptotic and credible confidence intervals of each unknown quantity. Additionally, under the squared-error loss, the Bayes estimators are obtained using a gamma prior. The Bayes estimators cannot be expressed in closed form since the likelihood has a complicated structure; nonetheless, Markov chain Monte Carlo techniques can be used to evaluate them. The effectiveness of the investigated estimators is assessed, and some recommendations are given using Monte Carlo results. Ultimately, an analysis of two engineering applications, namely mechanical equipment and ball bearing data sets, shows the applicability of the proposed approaches in real-world settings.
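The Markov chain Monte Carlo evaluation mentioned here is commonly implemented as a random-walk Metropolis step on the log of the positive parameter under a gamma prior. The sketch below uses a placeholder complete-sample exponential log-likelihood purely for illustration; the paper's actual target is the Chris-Jerry likelihood under generalized progressive hybrid censoring, which is not reproduced here.

```python
import numpy as np

def log_prior(theta, a=2.0, b=1.0):
    # independent Gamma(a, b) prior (shape-rate), as in the Bayesian setup described
    return (a - 1) * np.log(theta) - b * theta

def log_lik(theta, data):
    # PLACEHOLDER likelihood (complete-sample exponential), not the censored CJ likelihood
    return len(data) * np.log(theta) - theta * np.sum(data)

def metropolis(data, n_iter=20000, step=0.2, rng=None):
    rng = rng or np.random.default_rng(0)
    theta = 1.0
    draws = np.empty(n_iter)
    lp = log_lik(theta, data) + log_prior(theta)
    for i in range(n_iter):
        prop = theta * np.exp(step * rng.normal())       # random walk on the log scale
        lp_prop = log_lik(prop, data) + log_prior(prop)
        # acceptance ratio includes the log-scale proposal Jacobian log(prop) - log(theta)
        if np.log(rng.uniform()) < lp_prop - lp + np.log(prop) - np.log(theta):
            theta, lp = prop, lp_prop
        draws[i] = theta
    return draws

data = np.random.default_rng(1).exponential(0.5, 50)     # toy data
draws = metropolis(data)
print(draws[5000:].mean())   # Bayes estimate under squared-error loss after burn-in
```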
Abstract: A novel extended Lindley lifetime model that exhibits unimodal or decreasing density shapes as well as increasing, bathtub, or unimodal-then-bathtub failure rates, named the Marshall-Olkin-Lindley (MOL) model, is studied. In this research, using a progressive Type-II censored sample, various inferences for the MOL lifetime model parameters are introduced. Utilizing the maximum likelihood method as a classical approach, estimators of the model parameters and various reliability measures are investigated. Under both symmetric and asymmetric loss functions, the Bayesian estimates are obtained using the Markov chain Monte Carlo (MCMC) technique with the assumption of independent gamma priors. From the Fisher information data and the simulated Markovian chains, the approximate asymptotic interval and the highest posterior density interval, respectively, of each unknown parameter are calculated. Via an extensive simulation study, the usefulness of the various suggested strategies is assessed with respect to evaluation metrics such as mean squared errors, mean relative absolute biases, average confidence lengths, and coverage percentages. Comparing the Bayesian estimates based on the asymmetric loss function to the traditional technique and the symmetric loss function-based Bayesian estimates, the analysis demonstrates that the asymmetric loss function-based Bayesian estimates are preferred. Finally, two data sets, representing vinyl chloride and repairable mechanical equipment items, have been investigated to support the proposed approaches and to show the superiority of the proposed model compared to fourteen other lifetime models.
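Given MCMC draws, the highest posterior density (HPD) interval reported in such studies is usually approximated as the shortest interval covering the desired posterior mass. A small sketch follows, assuming a unimodal posterior and using simulated stand-in draws.

```python
import numpy as np

def hpd_interval(draws, cred=0.95):
    """Highest posterior density interval from MCMC draws (unimodal posterior assumed):
    the shortest interval containing a fraction `cred` of the sorted draws."""
    d = np.sort(np.asarray(draws))
    n = len(d)
    k = int(np.ceil(cred * n))
    widths = d[k - 1:] - d[:n - k + 1]
    j = np.argmin(widths)
    return d[j], d[j + k - 1]

draws = np.random.default_rng(0).gamma(3.0, 1.0, 20000)   # stand-in posterior draws
print(hpd_interval(draws, 0.95))
```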
Funding: This work was supported and funded by the Deanship of Scientific Research at Imam Mohammad Ibn Saud Islamic University (IMSIU) (Grant Number IMSIU-RG23142).
Abstract: This article introduces a novel variant of the generalized linear exponential (GLE) distribution, known as the sine generalized linear exponential (SGLE) distribution. The SGLE distribution uses the sine transformation to enhance the capabilities of the baseline model. The resulting distribution is very adaptable and may be used effectively to model survival data and reliability problems. The suggested model incorporates a hazard rate function (HRF) that may display an increasing, J-shaped, or bathtub form, depending on its parameters. This model includes many well-known lifetime distributions as special sub-models. The suggested model is accompanied by a range of statistical properties. The model parameters are estimated using maximum likelihood and Bayesian estimation under progressively censored data. To evaluate the effectiveness of these techniques, we provide a set of simulated data for testing purposes. The relevance of the newly presented model is shown via two real-world data set applications, highlighting its superiority over other well-regarded similar models.
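The sine transformation used to build such models is, in the usual sine-G construction, G(x) = sin((π/2)·F(x)) with density (π/2)·f(x)·cos((π/2)·F(x)). The sketch below applies it to an exponential baseline as a stand-in, since the GLE parameterization is not spelled out in this abstract.

```python
import numpy as np

def sine_g_cdf(F_base):
    """Sine-G construction: G(x) = sin((pi/2) * F(x)) for a baseline CDF F."""
    return lambda x, *args: np.sin(0.5 * np.pi * F_base(x, *args))

def sine_g_pdf(F_base, f_base):
    """Corresponding density: g(x) = (pi/2) * f(x) * cos((pi/2) * F(x))."""
    return lambda x, *args: 0.5 * np.pi * f_base(x, *args) * np.cos(0.5 * np.pi * F_base(x, *args))

# Stand-in baseline (exponential); the actual GLE baseline has its own CDF/PDF.
F_exp = lambda x, lam: 1.0 - np.exp(-lam * x)
f_exp = lambda x, lam: lam * np.exp(-lam * x)

G = sine_g_cdf(F_exp)
g = sine_g_pdf(F_exp, f_exp)
x = np.linspace(0.01, 5, 5)
print(G(x, 1.0), g(x, 1.0))
```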
Abstract: The development of defect prediction plays a significant role in improving software quality. Such predictions are used to identify defective modules before testing and to minimize time and cost. Software with defects negatively impacts operational costs and ultimately affects customer satisfaction. Numerous approaches exist to predict software defects; however, timely and accurate prediction of software bugs remains a major challenge. To improve timely and accurate software defect prediction, a novel technique called Nonparametric Statistical feature scaled QuAdratic regressive convolution Deep nEural Network (SQADEN) is introduced. The proposed SQADEN technique mainly includes two major processes, namely metric (feature) selection and classification. First, SQADEN uses the nonparametric statistical Torgerson–Gower scaling technique to identify the relevant software metrics by measuring similarity with the dice coefficient. The feature selection process is used to minimize the time complexity of software fault prediction. With the selected metrics, software faults are predicted using Quadratic Censored regressive convolution deep neural network-based classification. The deep learning classifier analyzes the training and testing samples using the contingency correlation coefficient. The softstep activation function is used to provide the final fault prediction results. To minimize the error, the Nelder–Mead method is applied to solve non-linear least-squares problems. Finally, accurate classification results with minimum error are obtained at the output layer. Experimental evaluation is carried out with different quantitative metrics such as accuracy, precision, recall, F-measure, and time complexity. The results demonstrate the superior performance of the proposed SQADEN technique, with higher accuracy, sensitivity, and specificity by 3%, 3%, 2% and 3%, and lower time and space by 13% and 15%, compared with the two state-of-the-art methods.
Abstract: This paper discusses a queueing system with a retrial orbit and batch service, in which the number of customer rooms in the queue is finite and the space of the retrial orbit is infinite. When the server starts serving, it serves all customers in the queue in a single batch, the so-called batch service. If a new customer or a retrial customer finds all the customer rooms occupied, he decides whether or not to join the retrial orbit. Using the censoring technique and the matrix analysis method, we first obtain the decay function of the stationary distribution for the number of customers in the retrial orbit and the number of customers in the queue. Then, based on the form of the decay rate function and the Karamata Tauberian theorem, we obtain the exact tail asymptotics of the stationary distribution.
Abstract: This paper explores the application of Extreme Value Theory (EVT) to estimating the conditional extreme quantile of time-to-event outcomes by examining the functional relationship between ambulatory blood pressure trajectories and clinical outcomes in stroke patients. The study uses EVT to analyze this functional connection in a sample of 297 stroke patients. The 24-hour ambulatory blood pressure curves, measured every 15 minutes, are considered, with a censoring rate of 40%. The findings reveal that the sample mean excess function exhibits a positive gradient above a specific threshold, indicating a heavy-tailed distribution of the data with a positive extreme value index. Consequently, the estimated conditional extreme quantile indicates that stroke patients with higher blood pressure measurements face an elevated risk of early recurrent stroke. This research contributes to the understanding of the relationship between ambulatory blood pressure and recurrent stroke, providing valuable insights for clinical considerations and potential interventions in stroke management.
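The sample mean excess function used here as a heavy-tail diagnostic is e(u) = mean(x_i − u | x_i > u); an upward-sloping plot above a threshold suggests a positive extreme value index. A short sketch on toy heavy-tailed data (not the stroke data) follows.

```python
import numpy as np

def mean_excess(x, thresholds):
    """Sample mean excess function e(u) = mean(x_i - u | x_i > u)."""
    x = np.asarray(x, float)
    return np.array([(x[x > u] - u).mean() if np.any(x > u) else np.nan
                     for u in thresholds])

x = np.random.default_rng(0).pareto(2.0, 2000) + 1.0   # heavy-tailed toy data
u_grid = np.quantile(x, np.linspace(0.5, 0.95, 10))
print(np.column_stack([u_grid, mean_excess(x, u_grid)]))   # roughly increasing in u
```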
Funding: This work was supported by the National Natural Science Foundation of China (Grant No. 19971006) and RFDP.
Abstract: In this paper, the estimation of the joint distribution F(y, z) of (Y, Z) and the estimation in the linear regression model Y = b′Z + ε for complete data are extended to right-censored data. The regression parameter estimates of b and the variance of ε are weighted least-squares estimates with random weights. Central limit theorems for the estimators are obtained under very weak conditions, and the derived asymptotic variance has a very simple form.
Abstract: The random weighting method is an emerging computational method in statistics. In this paper, we propose a novel estimator of the survival function for right-censored data based on the random weighting method. Under some regularity conditions, we prove the strong consistency of this estimator.
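As a rough illustration of the random weighting idea, each replicate below replaces the empirical weights 1/n with Exp(1) variables normalized to sum to one (Dirichlet(1, ..., 1) weights) and re-evaluates a survival probability. This complete-data sketch only conveys the mechanism; it is not the censored-data estimator proposed in the paper.

```python
import numpy as np

def random_weighting_survival(x, t_grid, n_rep=2000, rng=None):
    """Random weighting illustration for S(t) = P(X > t) with complete data:
    each replicate uses Dirichlet(1,...,1) weights instead of the fixed weights 1/n."""
    rng = rng or np.random.default_rng(0)
    x = np.asarray(x, float)
    n = len(x)
    reps = np.empty((n_rep, len(t_grid)))
    for b in range(n_rep):
        e = rng.exponential(1.0, n)
        w = e / e.sum()                          # random weights summing to one
        reps[b] = [(w * (x > t)).sum() for t in t_grid]
    return reps.mean(axis=0), reps.std(axis=0)   # randomly weighted estimate and spread

x = np.random.default_rng(1).weibull(1.5, 200)   # toy lifetimes
print(random_weighting_survival(x, [0.5, 1.0, 1.5]))
```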
Funding: This research was funded by Princess Nourah bint Abdulrahman University Researchers Supporting Project Number (PNURSP2023R175), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.
Abstract: A new extended exponential lifetime model called the Harris extended-exponential (HEE) distribution, for modelling data with increasing and decreasing hazard rate shapes, is considered. In the reliability context, researchers prefer to use censoring plans to collect data in order to achieve a compromise between total test time and/or test sample size. This study therefore considers both maximum likelihood and Bayesian estimates of the Harris extended-exponential distribution parameters and some of its reliability indices using a progressive Type-II censoring strategy. Under the premise of independent gamma priors, the Bayesian estimation is carried out using the squared-error and general entropy loss functions. Due to the challenging form of the joint posterior distribution, samples from the full conditional distributions are generated using Markov chain Monte Carlo techniques to evaluate the Bayes estimates. For each unknown parameter, the highest posterior density credible intervals and asymptotic confidence intervals are also determined. Through a simulation study, the usefulness of the various suggested strategies is assessed. The optimal progressive censoring plans are also shown, and various optimality criteria are investigated. Two real data sets, taken from the engineering and veterinary medicine areas, are analyzed to show how the proposed point and interval estimators can be used in practice and to verify that the proposed model furnishes a better fit than other lifetime models: the alpha power exponential, generalized-exponential, Nadarajah-Haghighi, Weibull, Lomax, gamma and exponential distributions. Numerical evaluations revealed that, in the presence of progressively Type-II censored data, the Bayes estimation method under the squared-error (symmetric) loss is advised for obtaining the point and interval estimates of the HEE distribution.
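From posterior draws, the two Bayes point estimates considered here are straightforward: the posterior mean under squared-error loss, and (E[θ^(−c)])^(−1/c) under the general entropy loss with shape c (when that expectation exists). A sketch with simulated stand-in draws:

```python
import numpy as np

def bayes_point_estimates(draws, c=1.0):
    """Point estimates from posterior draws of a positive parameter:
    - squared-error loss               -> posterior mean
    - general entropy loss, shape c    -> (E[theta**(-c)])**(-1/c)"""
    draws = np.asarray(draws, float)
    sel = draws.mean()
    gel = np.mean(draws ** (-c)) ** (-1.0 / c)
    return sel, gel

draws = np.random.default_rng(0).gamma(4.0, 0.5, 50000)   # stand-in posterior draws
print(bayes_point_estimates(draws, c=1.0))
```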
Funding: Supported by the National Natural Science Foundation of China (Grant No. 11901058).
Abstract: Firstly, the maximum likelihood estimate and the asymptotic confidence interval of the unknown parameter of the Topp-Leone distribution are obtained under Type-I left-censored samples; furthermore, the asymptotic confidence interval of the reliability function is obtained based on monotonicity. Secondly, under different loss functions, the Bayesian estimates of the unknown parameter and the reliability function are obtained, and the expected mean square errors of the Bayesian estimates are calculated. The Monte Carlo method is used to calculate the mean values and relative errors of the estimates. Finally, an example of life data is analyzed using the statistical methods of this paper.
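Under Type-I left censoring at a fixed point L, the Topp-Leone log-likelihood has a closed-form maximizer for the shape parameter: censored points contribute ν·log(2L − L²), observed points contribute log f(x_i; ν), and setting the score to zero gives ν̂ = −m / [r·log(2L − L²) + Σ log(2x_i − x_i²)]. The sketch below assumes the standard Topp-Leone forms F(x; ν) = (2x − x²)^ν and f(x; ν) = 2ν(1 − x)(2x − x²)^(ν−1) on (0, 1); it is an illustration, not the paper's code.

```python
import numpy as np

def topp_leone_left_censored_mle(obs, L, n_censored):
    """MLE of the Topp-Leone shape parameter v under Type-I left censoring at L.
    Censored points contribute log F(L; v) = v*log(2L - L^2); observed points
    contribute log f(x_i; v). Solving the score equation gives the closed form below."""
    obs = np.asarray(obs, float)
    m = len(obs)
    s = n_censored * np.log(2 * L - L**2) + np.sum(np.log(2 * obs - obs**2))
    return -m / s     # v_hat = -m / [r*log(2L - L^2) + sum log(2x_i - x_i^2)]

# toy usage with inverse-CDF draws: F^{-1}(u) = 1 - sqrt(1 - u**(1/v))
rng = np.random.default_rng(0)
v_true = 2.0
x = 1 - np.sqrt(1 - rng.uniform(size=500) ** (1 / v_true))
L = 0.2
print(topp_leone_left_censored_mle(x[x > L], L, np.sum(x <= L)))
```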
Funding: Scientific Planning Project of Tianjin Philosophy and Social Science, "Monitoring the Legal System and Political Trend: Research on the Relationship between the Duchayuan during the Tianqi Period and the Political Situation of the Late Ming Dynasty", Project No. TJFX19-002.
Abstract: The supervisory system and the examination system are two indigenous political systems of China, and the former has a longer history than the latter in terms of origin. Having inherited the essence of the supervisory system since the Qin Dynasty, the supervisory rules of the Ming Dynasty opened a new chapter in the legal thought of monitoring. This paper starts with the design of the supervisory institutions in the Ming Dynasty as recorded in historical materials such as the Memoir of Ming Dynasty and The Interpretive Supplements to "The Great Learning", to get a glimpse of the main content of the legal thought of supervision at that time, and tries to "take history as a mirror" to provide insights and lessons from the supervisory legal thought of the Ming Dynasty for later generations.
Funding: Supported by the National Natural Science Foundation of China (70471057).
Abstract: The estimation of the generalized exponential distribution based on progressive censoring with binomial removals is presented, where the number of units removed at each failure time follows a binomial distribution. Maximum likelihood estimators of the parameters and their confidence intervals are derived. The expected time required to complete the life test under this censoring scheme is investigated. Finally, numerical examples are given to illustrate some theoretical results by means of Monte Carlo simulation.
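Progressive censoring with binomial removals can be simulated directly: at the i-th observed failure, R_i surviving units are withdrawn, with R_i ~ Binomial(n − m − sum of earlier removals, p), and all remaining units are withdrawn at the m-th failure. The sketch below uses the generalized exponential baseline F(x) = (1 − e^(−λx))^α via inverse-CDF sampling; the removal-scheme details are the commonly used ones and are stated here as assumptions.

```python
import numpy as np

def gen_exponential_sample(n, alpha, lam, rng):
    # Generalized exponential: F(x) = (1 - exp(-lam*x))**alpha  =>  inverse-CDF sampling
    u = rng.uniform(size=n)
    return -np.log(1 - u ** (1 / alpha)) / lam

def progressive_binomial_removals(n, m, p, alpha, lam, rng):
    """Simulate m observed failures from n units with binomial removals (assumed scheme)."""
    lifetimes = list(gen_exponential_sample(n, alpha, lam, rng))
    obs, removals = [], []
    for i in range(m):
        x = min(lifetimes)                 # next observed failure
        lifetimes.remove(x)
        obs.append(x)
        if i < m - 1:
            max_removable = n - m - sum(removals)
            r = rng.binomial(max_removable, p) if max_removable > 0 else 0
        else:
            r = n - m - sum(removals)      # withdraw all remaining units at the last failure
        idx = rng.choice(len(lifetimes), size=r, replace=False)
        for j in sorted(idx, reverse=True):
            lifetimes.pop(j)               # remove r surviving units at random
        removals.append(r)
    return np.array(obs), np.array(removals)

rng = np.random.default_rng(0)
print(progressive_binomial_removals(n=30, m=10, p=0.3, alpha=1.5, lam=1.0, rng=rng))
```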
Abstract: In this paper, we discuss a random censoring test with incomplete information, and prove that the maximum likelihood estimator (MLE) of the parameter based on randomly censored data with incomplete information, in the case of the exponential distribution, is strongly consistent.
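For the exponential distribution with fully observed censoring indicators, the MLE of the rate is total events divided by total observed exposure, λ̂ = Σδ_i / ΣX_i; the incomplete-information case studied in the paper modifies this, so the sketch below only illustrates the classical complete-indicator baseline.

```python
import numpy as np

rng = np.random.default_rng(1)
lam_true = 0.5
n = 1000
T = rng.exponential(1 / lam_true, n)      # lifetimes, Exp(rate = lam_true)
C = rng.exponential(2.0, n)               # censoring times
X = np.minimum(T, C)                      # observed time
delta = (T <= C).astype(int)              # 1 = failure observed, 0 = censored
lam_hat = delta.sum() / X.sum()           # MLE: total events / total exposure
print(lam_hat)
```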