Funding: Supported by the Sustentation Program of National Ministries and Commissions of China (Grant Nos. 51319030302 and 9140A19030506KG0166).
Abstract: To improve reliability before mass production begins, new armament systems are treated with a method for multi-stage reliability-growth statistical problems involving diverse populations. Because development testing of complex systems is expensive and yields small samples, we study how to process the statistical information for Bayesian reliability growth across diverse populations. First, according to the characteristics of reliability growth during product development, the Bayesian method is used to integrate multi-stage test information with the order relations among the distribution parameters. A Gamma-Beta prior distribution is then proposed, based on a non-homogeneous Poisson process (NHPP) corresponding to the reliability-growth process. The posterior distributions of the reliability parameters are obtained for the different development stages, and the reliability parameters are evaluated from these posteriors. Finally, the proposed Bayesian approach for multi-stage reliability-growth testing is applied to a small-sample test process in the astronautics field. A numerical example shows that the presented model synthesizes the diverse information and paves the way for applying the Bayesian model to multi-stage reliability-growth test evaluation with small samples. The method is useful for evaluating multi-stage system reliability and for planning reliability growth rationally.
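The stage-by-stage Bayesian integration of test information described above can be sketched with a simpler conjugate Gamma-Poisson scheme (a minimal stand-in, not the paper's Gamma-Beta construction; the stage counts and the prior below are hypothetical):

```python
# A minimal sketch of multi-stage Bayesian updating: conjugate
# Gamma-Poisson updating applied stage by stage, so each later stage
# inherits the posterior of the earlier ones as its prior.

def gamma_poisson_update(alpha, beta, failures, exposure):
    """Update a Gamma(alpha, beta) prior on the failure rate after
    observing `failures` events in `exposure` hours of testing."""
    return alpha + failures, beta + exposure

# Hypothetical three-stage growth test: failures drop as the design matures.
stages = [(9, 100.0), (5, 100.0), (2, 100.0)]  # (failures, test hours)
alpha, beta = 1.0, 10.0  # weakly informative prior (assumed values)
for failures, hours in stages:
    alpha, beta = gamma_poisson_update(alpha, beta, failures, hours)

posterior_mean_rate = alpha / beta  # posterior mean failure rate per hour
```

The growth across stages shows up as a shrinking posterior mean rate; the real model additionally enforces order relations between the stage parameters.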
Funding: National CNC Special Project, China (No. 2010ZX04001-032), and the Youth Science and Technology Foundation of Gansu Province, China (No. 145RJYA307).
Abstract: To address the difficulty of solving an improved non-homogeneous Poisson process (NHPP) model in engineering applications, an immune-clone maximum likelihood estimation (MLE) method for the model parameters is proposed. Instead of solving a complex system of equations iteratively, the negative log-likelihood is minimized directly as the objective function, and the parameter-estimation problem of the improved NHPP model is solved with an immune clone algorithm. Interval estimates of the reliability indices are obtained with the Fisher information matrix and the delta method. The method is validated on failure-truncated data from multiple numerical control (NC) machine tools, and the results show that the algorithm has a higher convergence rate and computational accuracy, demonstrating its feasibility.
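Minimizing the negative log-likelihood directly, rather than solving the likelihood equations, can be illustrated on the classical power-law NHPP (a sketch only: a plain golden-section search stands in for the immune clone algorithm, and the failure times are invented):

```python
import math

def neg_log_likelihood(beta, times):
    """Negative log-likelihood of a power-law NHPP (intensity
    lam*beta*t**(beta-1)), failure-truncated at the last event, with
    the scale lam profiled out at its conditional MLE lam = n/T**beta."""
    n, T = len(times), times[-1]
    lam = n / T ** beta
    ll = (n * math.log(lam) + n * math.log(beta)
          + (beta - 1) * sum(math.log(t) for t in times) - lam * T ** beta)
    return -ll

def golden_section_min(f, lo, hi, tol=1e-9):
    """Minimize a unimodal function on [lo, hi]; a simple stand-in for
    the paper's immune-clone optimizer."""
    g = (math.sqrt(5.0) - 1.0) / 2.0
    a, b = lo, hi
    c, d = b - g * (b - a), a + g * (b - a)
    while b - a > tol:
        if f(c) < f(d):
            b, d = d, c
            c = b - g * (b - a)
        else:
            a, c = c, d
            d = a + g * (b - a)
    return (a + b) / 2.0

times = [5.2, 11.0, 22.5, 40.0, 61.3, 90.1, 140.2]  # hypothetical failure times
beta_hat = golden_section_min(lambda b: neg_log_likelihood(b, times), 0.05, 5.0)
# Closed-form MLE for the failure-truncated power-law model, as a cross-check:
beta_cf = len(times) / sum(math.log(times[-1] / t) for t in times)
```

The profiled negative log-likelihood is convex in beta here, so any robust 1-D minimizer recovers the closed-form answer; the improved NHPP model in the paper has no such closed form, which is why a global heuristic optimizer is used.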
Abstract: In this work, some non-homogeneous Poisson models are considered to study the behaviour of ozone in the city of Puebla, Mexico. Several functions are used as the rate function for the non-homogeneous Poisson process. In addition to their dependence on time, these rate functions also depend on some parameters that need to be estimated, and a Bayesian approach is taken to estimate them. Since the expressions for the distributions of the parameters involved in the models are very complex, Markov chain Monte Carlo algorithms are used for the estimation. The methodology is applied to ozone data from the city of Puebla, Mexico.
Funding: Financially supported by project PAPIIT No. IN104110-3 of the Direccion General de Apoyo al Personal Academico of the Universidad Nacional Autonoma de Mexico, Mexico, as part of JMB's Ph.D., partially funded by the Consejo Nacional de Ciencias y Tecnologia, Mexico, through Ph.D. Scholarship No. 210347. JAA was partially funded by the Conselho Nacional de Pesquisa, Brazil, Grant No. 300235/2005-4.
Abstract: We consider some non-homogeneous Poisson models to estimate the mean number of times that a given environmental threshold of interest is surpassed by a given pollutant. Seven rate functions for the underlying Poisson processes are taken into account: the Weibull, the exponentiated-Weibull, and their generalisation the Beta-Weibull rate function, as well as the Musa-Okumoto, the Goel-Okumoto, a generalised Goel-Okumoto, and the Weibull-geometric rate functions. Whenever justifiable, models allowing the presence of change-points are also considered. The models are applied to the daily maximum ozone measurements provided by the monitoring network of the Metropolitan Area of Mexico City, with the aim of comparing how well the different rate functions adjust to the data. Although some of these rate functions have been considered before, they were previously applied to different data sets, so a comparison of their adequacy was not possible; here they are all applied to the same data set. The measurements considered were obtained after a series of environmental measures had been implemented in Mexico City, so the data behave differently from those of earlier studies.
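As a sketch of how one of the seven candidates is specified, the Weibull rate function and its mean function, whose increments give the expected number of threshold exceedances over an interval, might look as follows (parameter values in the example are illustrative, not fitted to the ozone data):

```python
# Weibull rate function for an NHPP, one of the candidate rate
# functions compared in the text (one common parametrisation).

def weibull_rate(t, alpha, beta):
    """Instantaneous exceedance rate lambda(t) = (alpha/beta)*(t/beta)**(alpha-1)."""
    return (alpha / beta) * (t / beta) ** (alpha - 1)

def weibull_mean(t, alpha, beta):
    """Mean number of exceedances in (0, t]: m(t) = (t/beta)**alpha,
    the integral of the rate function."""
    return (t / beta) ** alpha

def expected_exceedances(t0, t1, alpha, beta):
    """Expected exceedances in (t0, t1] = m(t1) - m(t0)."""
    return weibull_mean(t1, alpha, beta) - weibull_mean(t0, alpha, beta)
```

With alpha = 1 the model reduces to a homogeneous Poisson process; alpha > 1 gives a rate that grows with time, alpha < 1 one that decays, which is the qualitative behaviour the model comparison discriminates between.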
Funding: Partially supported by grants from Capes, CNPq and FAPESP.
Abstract: This article discusses the Bayesian approach for count data using non-homogeneous Poisson processes, considering different prior distributions for the model parameters. A Bayesian approach using Markov chain Monte Carlo (MCMC) simulation methods for this model was first introduced by [1], taking into account software reliability data and considering non-informative prior distributions for the model parameters. With the non-informative priors presented by those authors, computational difficulties may occur when using MCMC methods. This article considers different prior distributions for the parameters of the proposed model and studies the effect of such priors on the convergence and accuracy of the results. To illustrate the proposed methodology, two examples are considered: the first uses simulated data, and the second uses a set of pollution data from a region in Mexico City.
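A minimal random-walk Metropolis sampler for a Poisson rate under a Gamma prior illustrates the kind of MCMC computation involved and why the choice of prior hyperparameters matters; the counts and the Gamma(2, 1) prior below are hypothetical, chosen so the conjugate answer is known exactly:

```python
import math
import random

def log_post(lam, counts, exposure, a, b):
    """Log posterior (up to a constant) of a Poisson rate lam with a
    Gamma(a, b) prior -- the kind of prior whose effect on MCMC
    behaviour the article studies."""
    if lam <= 0.0:
        return -math.inf
    n = sum(counts)
    return (n + a - 1.0) * math.log(lam) - (exposure + b) * lam

def metropolis(counts, exposure, a, b, n_iter=20000, step=0.5, seed=1):
    """Random-walk Metropolis: propose lam' = lam + N(0, step), accept
    with probability min(1, posterior ratio)."""
    rng = random.Random(seed)
    lam, lp = 1.0, log_post(1.0, counts, exposure, a, b)
    draws = []
    for _ in range(n_iter):
        prop = lam + rng.gauss(0.0, step)
        lp_prop = log_post(prop, counts, exposure, a, b)
        if math.log(rng.random()) < lp_prop - lp:
            lam, lp = prop, lp_prop
        draws.append(lam)
    return draws[n_iter // 2:]  # discard the first half as burn-in

counts, exposure = [3, 1, 4, 2, 5], 5.0   # hypothetical daily exceedance counts
draws = metropolis(counts, exposure, a=2.0, b=1.0)
post_mean = sum(draws) / len(draws)       # conjugate answer: (15+2)/(5+1) = 17/6
```

Because this toy posterior is conjugate, the sampler can be checked against the exact Gamma(17, 6) answer; the article's models have no such closed form, which is exactly why MCMC, and hence prior sensitivity, becomes an issue.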
Abstract: The Goel-Okumoto software reliability model, also known as the exponential non-homogeneous Poisson process, is one of the earliest software reliability models to be proposed. From the literature, it is evident that most of the work done on the Goel-Okumoto model concerns parameter estimation by the MLE method and model fit. It is widely known that predictive analysis is very useful for modifying, debugging, and determining when to terminate the software development testing process. However, there is a conspicuous absence of literature on both classical and Bayesian predictive analyses for the model. This paper presents some results on predictive analyses for the Goel-Okumoto software reliability model. Driven by the requirement for highly reliable software used in computers embedded in automotive, mechanical, and safety control systems, industrial and quality process control, real-time sensor networks, aircraft, and nuclear reactors, among others, we address four issues in single-sample prediction closely associated with the software development process. We adopt Bayesian methods based on non-informative priors to develop explicit solutions to these problems. An example with real data, in the form of times between software failures, illustrates the developed methodologies.
Abstract: The Goel-Okumoto software reliability model is one of the earliest attempts to use a non-homogeneous Poisson process to model failure times observed during a software testing interval. It is known as the exponential NHPP model because it describes an exponential software failure curve. Parameter estimation, model fit, and predictive analyses based on one sample have been conducted on the Goel-Okumoto model, but predictive analyses based on two samples have not. In two-sample prediction, the parameters and characteristics of the first sample are used to analyze and make predictions for the second sample, which saves time and resources during the software development process. This paper presents some results on predictive analyses for the Goel-Okumoto software reliability model based on two samples. We address three issues in two-sample prediction closely associated with the software development testing process, adopting Bayesian methods based on non-informative priors to develop the solutions. The developed methodologies are illustrated with two sets of software failure data simulated from the Goel-Okumoto model.
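The exponential NHPP structure referred to above rests on the Goel-Okumoto mean value function m(t) = a(1 − e^(−bt)). A small sketch with illustrative parameters (not estimates from any of the papers' data sets):

```python
import math

def go_mean(t, a, b):
    """Goel-Okumoto mean value function m(t) = a*(1 - exp(-b*t)):
    expected cumulative failures detected by test time t."""
    return a * (1.0 - math.exp(-b * t))

def expected_remaining(t, a, b):
    """Expected failures still undetected after time t; since a is the
    eventual total, this is simply a - m(t)."""
    return a - go_mean(t, a, b)

# Illustrative parameter values only:
a, b = 100.0, 0.05          # a: eventual fault content, b: detection rate
found_by_40 = go_mean(40.0, a, b)
left_after_40 = expected_remaining(40.0, a, b)
```

Quantities like `left_after_40` are exactly what single- and two-sample predictive analyses put distributions on, which is what makes them useful for deciding when to stop testing.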
Funding: National Natural Science Foundation of China (1435010, 11575145, 11922507).
Abstract: The current-mode-counting method is a new approach to observing transient processes, especially transient nuclear fusion, based on the non-homogeneous Poisson process (NHPP) model. In this paper, a new measurement-process model of the pulsed radiation field produced by transient nuclear fusion is built on the NHPP. A simulated measurement is performed with the model, and the current signal from the detector is obtained by simulation based on Poisson process thinning. The reconstructed neutron time spectrum is in good agreement with the theoretical one, with the maximum error of a characteristic parameter below 2.3%. Verification experiments were carried out on the CPNG-6 device at the China Institute of Atomic Energy, using a detection system with a nanosecond response time. The experimental charge amplitude spectra agree well with those obtained in the traditional counting mode, and the characteristic parameters of the time spectrum agree well with the theoretical values. This shows that the current-mode-counting method is effective for observing transient nuclear fusion processes.
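Poisson process thinning, used above to simulate the detector signal, can be sketched with the standard Lewis-Shedler algorithm; the Gaussian-shaped pulse intensity below is a hypothetical stand-in for the neutron time spectrum, not the paper's model:

```python
import math
import random

def nhpp_thinning(rate, t_end, rate_max, seed=0):
    """Simulate event times of an NHPP on (0, t_end] by thinning
    (Lewis-Shedler): draw candidate times from a homogeneous Poisson
    process of rate rate_max, and accept a candidate at time t with
    probability rate(t)/rate_max.  Requires rate(t) <= rate_max."""
    rng = random.Random(seed)
    t, events = 0.0, []
    while True:
        t += rng.expovariate(rate_max)  # next candidate arrival
        if t > t_end:
            return events
        if rng.random() < rate(t) / rate_max:
            events.append(t)

# Hypothetical Gaussian-shaped pulse intensity (arbitrary units, peak 500 at t=5):
pulse = lambda t: 500.0 * math.exp(-((t - 5.0) ** 2) / 2.0)
events = nhpp_thinning(pulse, 10.0, rate_max=500.0)
```

The expected event count is the integral of the intensity (here about 500·√(2π) ≈ 1253), and a histogram of `events` reproduces the pulse shape, which is the principle behind reconstructing the time spectrum from the current signal.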
Funding: Supported by the National Science Foundation of China (12372120, 12172075), the Liaoning Revitalization Talents Program (XLYC2007027), and the Fundamental Research Funds for the Central Universities (DUT21RC(3)067).
Abstract: Negative Poisson's ratio (NPR) metamaterials are attractive for their unique mechanical behaviors and potential applications in deformation control and energy absorption. However, when subjected to significant stretching, NPR metamaterials designed under the small-strain assumption may experience a rapid degradation in NPR performance. To address this issue, this study aims to design metamaterials that maintain a targeted NPR under large deformation by exploiting the geometric nonlinearity mechanism. A representative periodic unit cell is modeled considering geometric nonlinearity, and its topology is designed using a gradient-free method. The unit-cell microstructural topologies are described with the material-field series-expansion (MFSE) method, which assumes spatial correlation of the material distribution and thereby greatly reduces the number of required design variables. To conveniently design metamaterials with a desired NPR under large deformation, we propose a two-stage gradient-free metamaterial topology optimization method that takes full advantage of the dimension-reduction benefits of the MFSE method and the Kriging surrogate-model technique. Initially, homogenization is used to find a preliminary NPR design under the small-deformation assumption. In the second stage, starting from this preliminary design, deviations of the NPR from a targeted value under large deformation are minimized. Using this strategy and solution technique, we successfully obtain a group of NPR metamaterials that can sustain different desired NPRs in the range of [−0.8, −0.1] under uniaxial stretching up to 20% strain. Furthermore, typical microstructure designs are fabricated and tested through experiments. The experimental results show good consistency with our numerical results, demonstrating the effectiveness of the present gradient-free NPR metamaterial design strategy.
Funding: The authors thank the Yayasan Universiti Teknologi PETRONAS (YUTP FRG Grant No. 015LC0-428) at Universiti Teknologi PETRONAS for supporting this study.
Abstract: Static Poisson's ratio (vs) is crucial for determining geomechanical properties in petroleum applications, namely sand production. Some models have been used to predict vs; however, the published models were limited to specific data ranges, with an average absolute percentage relative error (AAPRE) of more than 10%. The published gated recurrent unit (GRU) models do not consider trend analysis to show physical behaviors. In this study, we aim to develop a GRU model using trend analysis and three inputs for predicting vs over a broad range of data: vs (0.1627-0.4492), bulk formation density (RHOB) (0.315-2.994 g/mL), compressional time (DTc) (44.43-186.9 μs/ft), and shear time (DTs) (72.9-341.2 μs/ft). The GRU model was evaluated using different approaches, including statistical error analyses. The GRU model showed the proper trends, and its data ranges were wider than those of previous models. The GRU model has the largest correlation coefficient (R) of 0.967 and the lowest AAPRE, average percent relative error (APRE), root mean square error (RMSE), and standard deviation (SD) of 3.228%, 1.054%, 4.389, and 0.013, respectively, compared to other models. The GRU model is highly accurate across the training, validation, testing, and whole datasets, with R and AAPRE values of 0.981 and 2.601%, 0.966 and 3.274%, 0.967 and 3.228%, and 0.977 and 2.861%, respectively. The group error analyses of all inputs show that the GRU model has less than 5% AAPRE over all input ranges, which is superior to other models, whose AAPRE exceeds 10% at various ranges of inputs.
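The AAPRE, APRE, and RMSE figures quoted above follow from standard definitions, sketched here on hypothetical values rather than the study's dataset (the sign convention for APRE varies between papers; one common choice is used below):

```python
import math

def error_metrics(actual, predicted):
    """Statistical error measures of the kind used to rank the models:
    APRE (mean signed percent relative error), AAPRE (mean absolute
    percent relative error), and RMSE."""
    pre = [100.0 * (a - p) / a for a, p in zip(actual, predicted)]
    apre = sum(pre) / len(pre)
    aapre = sum(abs(e) for e in pre) / len(pre)
    rmse = math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted))
                     / len(actual))
    return apre, aapre, rmse

# Hypothetical static Poisson's ratio values, not the paper's data:
actual = [0.20, 0.25, 0.30, 0.35]
predicted = [0.21, 0.24, 0.31, 0.34]
apre, aapre, rmse = error_metrics(actual, predicted)
```

Because APRE averages signed errors, it can be much smaller in magnitude than AAPRE when over- and under-predictions cancel, which is why both are reported alongside RMSE.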