Abundant test data are required to assess weapon performance. When weapon test data are insufficient, Bayesian analysis in the small-sample setting should be considered, with additional test data provided by simulations. Several Bayesian approaches are discussed and some of their limitations are identified. An improvement is put forward after the limitations of the available Bayesian approaches are analyzed, and the improved approach is applied to the performance assessment of a new weapon.
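As a hedged illustration (not taken from the paper), the simplest small-sample Bayesian analysis of a pass/fail weapon test is a conjugate Beta-Binomial update, where the prior pseudo-counts can encode simulation evidence; all parameter values below are assumed:

```python
# Hypothetical sketch: conjugate Beta-Binomial update for a pass/fail test.
# The Beta(7, 1) prior is an assumed stand-in for simulation-based evidence.
def beta_binomial_posterior(alpha_prior, beta_prior, successes, failures):
    """Posterior Beta parameters after observing live test outcomes."""
    return alpha_prior + successes, beta_prior + failures

def posterior_mean(alpha, beta):
    """Posterior mean of the success probability."""
    return alpha / (alpha + beta)

# Three live tests, all successful, combined with the simulation-based prior.
a, b = beta_binomial_posterior(7.0, 1.0, successes=3, failures=0)
print(round(posterior_mean(a, b), 3))  # → 0.909
```

The small live sample only nudges the prior, which is exactly the small-sample behaviour the abstract is concerned with.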
In this work, we perform a Bayesian inference of the crust-core transition density ρ_(t) of neutron stars based on neutron-star radius and neutron-skin thickness data using a thermodynamical method. Uniform and Gaussian distributions were adopted as priors for ρ_(t) in the Bayesian approach. ρ_(t) had a larger probability of taking values above 0.1 fm^(−3) when the uniform prior and the neutron-star radius data were used. This was found to be controlled by the curvature K_(sym) of the nuclear symmetry energy: the phenomenon did not occur if K_(sym) was not extremely negative, namely, K_(sym) > −200 MeV. The value of ρ_(t) obtained was 0.075_(−0.01)^(+0.005) fm^(−3) at a confidence level of 68% when both the neutron-star radius and neutron-skin thickness data were considered. Strong anti-correlations were observed between ρ_(t), the slope L, and the curvature of the nuclear symmetry energy. The dependence of the three L-K_(sym) correlations predicted in the literature on the crust-core density and pressure was quantitatively investigated. The most probable value of 0.08 fm^(−3) for ρ_(t) was obtained from the L-K_(sym) relationship proposed by Holt et al., while larger values were preferred for the other two relationships.
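The effect of the two prior choices can be sketched with a generic 1-D grid posterior; the Gaussian "likelihood" and all numbers below are assumed stand-ins, not the paper's actual data analysis:

```python
import math

# Illustrative sketch only: how a uniform vs. a Gaussian prior for rho_t
# shifts the posterior. The likelihood is an assumed Gaussian placeholder.
def grid_posterior(prior, likelihood):
    """Normalized posterior over a parameter grid."""
    unnorm = [p * l for p, l in zip(prior, likelihood)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

grid = [0.05 + 0.001 * i for i in range(51)]  # rho_t grid (fm^-3)
uniform_prior = [1.0] * len(grid)
gaussian_prior = [math.exp(-0.5 * ((r - 0.08) / 0.01) ** 2) for r in grid]
likelihood = [math.exp(-0.5 * ((r - 0.075) / 0.005) ** 2) for r in grid]

post_u = grid_posterior(uniform_prior, likelihood)
post_g = grid_posterior(gaussian_prior, likelihood)
# The Gaussian prior pulls the posterior mode from 0.075 toward 0.08 fm^-3.
```

With the uniform prior the mode sits at the likelihood peak; the informative prior shifts it, which is why the prior choice matters in the abstract's comparison.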
New armament systems require methods for dealing with multi-stage system reliability-growth statistical problems across diverse populations in order to improve reliability before starting mass production. Aiming at test processes that involve high expense and small sample sizes in the development of complex systems, specific methods are studied for processing the statistical information of Bayesian reliability growth across diverse populations. Firstly, according to the characteristics of reliability growth during product development, the Bayesian method is used to integrate multi-stage testing information and the order relations of the distribution parameters. A Gamma-Beta prior distribution is then proposed based on the non-homogeneous Poisson process (NHPP) corresponding to the reliability-growth process. The posterior distribution of the reliability parameters is obtained for the different stages of the product, and the reliability parameters are evaluated based on the posterior distribution. Finally, the Bayesian approach proposed in this paper for multi-stage reliability-growth testing is applied to a small-sample test process in the astronautics field. The results of a numerical example show that the presented model can synthesize the diverse information and pave the way for applying the Bayesian model to multi-stage reliability-growth test evaluation with small sample sizes. The method is useful for evaluating multi-stage system reliability and for making reliability-growth plans rationally.
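The paper's Gamma-Beta/NHPP model is more elaborate, but the core idea of carrying information across stages can be sketched with a minimal conjugate Gamma-Poisson update, where each stage's posterior becomes the next stage's prior; the counts and hours are made up:

```python
# Minimal sketch (the paper's Gamma-Beta/NHPP model is more elaborate):
# a conjugate Gamma-Poisson update of the failure intensity, applied
# stage by stage so each posterior becomes the next stage's prior.
def gamma_poisson_update(a, b, n_failures, exposure):
    """Posterior Gamma(a, b) parameters for the failure rate."""
    return a + n_failures, b + exposure

# Three assumed test stages: (failures, test hours). Reliability growth
# shows up as a decreasing posterior-mean failure rate a/b across stages.
a, b = 1.0, 50.0  # assumed prior
for n, t in [(6, 200.0), (3, 200.0), (1, 200.0)]:
    a, b = gamma_poisson_update(a, b, n, t)
    print(round(a / b, 4))
```

The printed posterior-mean rate falls stage by stage, which is the qualitative signature of reliability growth the multi-stage synthesis is meant to capture.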
Failure prediction plays an important role in many tasks, such as optimal resource management in large-scale systems. However, accurately predicting the failure number of repairable large-scale long-running computing (RLLC) systems is a challenge because of their repairability and large scale. To address this challenge, a general Bayesian serial revision prediction method based on the bootstrap approach and the moving-average approach is put forward, which can accurately predict the failure number. To demonstrate the performance gains of our method, extensive experiments were carried out on data from the Los Alamos National Laboratory (LANL) cluster, a typical RLLC system. The experimental results show that the prediction accuracy of our method is 80.2%, an improvement of 4% over some typical methods. Finally, the managerial implications of the models are discussed.
A Bayesian method for estimating human error probability (HEP) is presented. The main idea of the method is to incorporate human performance data into the HEP estimation process. By integrating human performance data with prior information about human performance, a more accurate and specific HEP estimate can be achieved. For time-unrelated tasks without a rigorous time restriction, the HEP estimated by commonly used human reliability analysis (HRA) methods or by expert judgment is collected as the source of prior information. For time-related tasks with a rigorous time restriction, human error is expressed as a failure to respond, so the HEP is the time curve of the non-response probability (NRP). The prior information is collected from system safety and reliability specifications or by expert judgment. The (joint) posterior distribution of the HEP or NRP-related parameter(s) is constructed after the prior information has been collected. Based on the posterior distribution, point or interval estimates of the HEP/NRP are obtained. Two illustrative examples demonstrate the practicality of the approach.
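One simple way such an NRP time curve can arise, offered only as an assumed illustration and not as the paper's exact model: if the response time is exponential with rate λ and the posterior on λ is Gamma(a, b), the posterior-predictive non-response probability has a closed form.

```python
# Illustrative assumption: exponential response time with a Gamma(a, b)
# posterior on the rate gives posterior-predictive P(no response by t)
# = (b / (b + t))^a, a decreasing curve in the deadline t.
def predictive_nrp(t, a, b):
    """Posterior-predictive non-response probability at deadline t."""
    return (b / (b + t)) ** a

# NRP falls from 1 at t = 0 toward 0 as the allowed response time grows.
for t in (0.0, 10.0, 30.0):
    print(round(predictive_nrp(t, a=2.0, b=10.0), 4))
```

The curve's shape (certain non-response at t = 0, decaying with the time allowed) matches the qualitative description of time-related tasks above.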
In order to accurately predict and control the aging process of dams, new information should be collected continuously to renew the quantitative evaluation of dam safety levels. Owing to the complex structural characteristics of dams, it is quite difficult to predict the time-varying factors affecting their safety levels, and it is not feasible to employ dynamic reliability indices to evaluate the actual safety levels of dams. Based on the relevant regulations for dam safety classification in China, a dynamic probability description of dam safety levels was developed. Using the Bayesian approach with effective information mining and real-time information, this study achieved a more rational evaluation and prediction of dam safety levels. With the Bayesian expression for discrete stochastic variables, the a priori probabilities of the dam safety levels determined by experts were combined with the likelihood probability of the real-time check information, and the probability information for the evaluation of dam safety levels was renewed. The probability index was then applied to dam rehabilitation decision-making. This method helps reduce the difficulty and uncertainty of evaluating dam safety levels and complies with the current safety decision-making regulations for dams in China. It also enhances the application of current risk analysis methods to dam safety levels.
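The renewal step described above, combining expert priors over discrete safety levels with the likelihood of check information, is a plain discrete Bayes update. A minimal sketch with made-up numbers (the level labels and probabilities are assumptions, not the paper's data):

```python
# Discrete Bayes update: expert prior over safety levels combined with the
# likelihood of the real-time check information. All numbers are assumed.
def bayes_update(prior, likelihood):
    """Renewed (posterior) probabilities over the discrete safety levels."""
    unnorm = [p * l for p, l in zip(prior, likelihood)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

prior = [0.7, 0.2, 0.1]        # expert priors for levels A, B, C (assumed)
likelihood = [0.5, 0.3, 0.2]   # likelihood of the observed check info
print([round(p, 3) for p in bayes_update(prior, likelihood)])
```

Each new batch of check information can be folded in by calling the update again with the previous posterior as the prior.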
The delayed S-shaped software reliability growth model (SRGM) is one of the non-homogeneous Poisson process (NHPP) models proposed for software reliability assessment. The model is distinctive because its mean value function reflects the delay in failure reporting: there is a delay between failure detection and reporting time. The model captures the error detection, isolation, and removal processes, and is thus appropriate for software reliability analysis. Predictive analysis in software testing is useful for modifying, debugging, and determining when to terminate the software development testing process. However, Bayesian predictive analyses of the delayed S-shaped model have not been extensively explored. This paper uses the delayed S-shaped SRGM to address four issues in one-sample prediction associated with the software development testing process. A Bayesian approach based on non-informative priors was used to derive explicit solutions for the four issues, and the developed methodologies are illustrated using real data.
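The delayed S-shaped SRGM's mean value function is commonly written as m(t) = a(1 − (1 + bt)e^(−bt)), with a the expected total number of faults and b the detection rate; the parameter values below are assumed for illustration:

```python
import math

# Mean value function of the delayed S-shaped SRGM, commonly written as
# m(t) = a * (1 - (1 + b*t) * exp(-b*t)). Parameter values are assumed.
def delayed_s_shaped_mean(t, a, b):
    """Expected cumulative number of reported failures by time t."""
    return a * (1.0 - (1.0 + b * t) * math.exp(-b * t))

# The curve starts flat (reflecting the reporting delay), then saturates at a.
for t in (0.0, 10.0, 100.0):
    print(round(delayed_s_shaped_mean(t, a=100.0, b=0.1), 2))
```

The initial flat portion is exactly the detection-to-reporting delay the abstract calls distinctive; an exponential-growth NHPP model has no such inflection.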
The soil-water characteristic curve (SWCC) is essential for estimating site-specific unsaturated soil properties (such as unsaturated shear strength and the coefficient of permeability) in geotechnical analyses involving unsaturated soils. An SWCC can be determined by fitting data points obtained according to a prescribed experimental scheme, which is specified by the number of measuring points and their corresponding values of the control variable. The number of measuring points is limited since direct measurement of the SWCC is often costly and time-consuming. Based on the limited number of measuring points, the estimated SWCC is unavoidably associated with uncertainty, which depends on the measurement data obtained from the prescribed experimental scheme. Therefore, it is essential to plan the experimental scheme so as to reduce the uncertainty in the estimated SWCC. This study presents a Bayesian approach, called OBEDO, for probabilistic experimental design optimization of SWCC measurement based on prior knowledge and information about the testing apparatus. The uncertainty in the estimated SWCC is quantified, and the optimal experimental scheme with the maximum expected utility is determined by Subset Simulation optimization (SSO) in the candidate experimental scheme space. The proposed approach is illustrated using an experimental design example given prior knowledge and information about the testing apparatus, and is verified on a set of real loess SWCC data, which were used to generate random experimental schemes mimicking the arbitrary arrangement of measuring points in SWCC testing in practice. Results show that an arbitrary arrangement of measuring points in SWCC testing is hardly superior to the optimal scheme obtained from OBEDO in terms of expected utility. The proposed OBEDO approach provides a rational tool to optimize the arrangement of measuring points of an SWCC test so as to obtain measurement data with relatively high expected utility for uncertainty reduction.
In this study, we adopt an improved Bayesian approach based on free-knot B-spline bases to study the spatial and temporal distribution of the b-value. Synthetic tests show that the improved Bayesian approach performs better than the original Bayesian approach as well as the widely used maximum likelihood estimation (MLE) method in fitting the real variation of b-values. We then apply the improved Bayesian approach to North China and find that the b-value has a clear relevance to seismicity. Temporal changes of b-values are also investigated in two specific areas of North China. We interpret sharp decreases in the b-values as useful messages in earthquake hazard analysis.
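The MLE baseline the Bayesian approaches are compared against is the standard estimator often attributed to Aki (1965), b = log10(e) / (M̄ − Mc), for magnitudes at or above the completeness magnitude Mc; the catalog values below are synthetic:

```python
import math

# Widely used maximum-likelihood b-value estimate (Aki, 1965):
# b = log10(e) / (mean(M) - Mc), for magnitudes M >= completeness Mc.
def b_value_mle(magnitudes, mc):
    """MLE of the Gutenberg-Richter b-value from a magnitude catalog."""
    mags = [m for m in magnitudes if m >= mc]
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - mc)

# Synthetic catalog whose mean exceeds Mc by log10(e): b comes out as 1.
print(round(b_value_mle([2.0 + math.log10(math.e)] * 10, mc=2.0), 3))
```

A sharp rise in the mean magnitude above Mc maps directly to a drop in the estimated b-value, which is the kind of temporal change the study tracks.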
This study proposed a new analytical approach to identify the excessive comovement of two markets as contagion. This goal is achieved by linking latent-factor and single-equation error correction models and evaluating the breaks in the short- and long-term relationships and correlatedness in the linked model. The results demonstrated that the short-term relationship representing the market speed ratio between two markets plays a key role in contagion dynamics. When a long-term relationship or correlatedness is broken (a comovement change) due to a break in the short-term relationship (the market speed ratio), contagion is highly likely and should be formally declared. Bayesian posterior probabilities were calculated to determine the cause. Furthermore, this study applied this analytical Bayesian approach to empirically test the contagion effects of the U.S. stock market during the global financial crisis between 2007 and 2009 using 22 developed equity markets.
Quantifying tool-tissue interaction forces in surgery can be used in the training of inexperienced surgeons, helping them use surgical tools better and avoid applying excessive pressure. The voltages read from strain gauges are used to approximate the unknown values of the applied forces. To this end, the force-voltage relationship must be quantified in order to evaluate the interaction forces during surgery. A key problem is the development of appropriate statistical learning approaches to describe the link between the true force applied to the tissue and the numerous outputs obtained from sensors installed on surgical equipment. In this study, different probabilistic approaches are used to evaluate the force realized on tissue using voltages read from strain gauges, including bootstrapping, Bayesian regression, weighted least squares regression, and multi-level modelling. Estimates from the proposed models are more precise than those of the maximum likelihood and restricted maximum likelihood techniques. The suggested methodologies are capable of assessing tool-tissue interface forces with an adequate level of accuracy.
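One of the listed approaches, weighted least squares, can be sketched for a linear force-voltage calibration; the readings and weights below are made-up illustrations, not the study's data:

```python
# Sketch of one listed approach: weighted least squares for a linear
# force-voltage calibration, force ≈ b0 + b1 * voltage. The readings and
# weights (e.g. inverse noise variances) are made-up illustrations.
def weighted_least_squares(x, y, w):
    """Closed-form weighted fit of y = b0 + b1*x; returns (b0, b1)."""
    sw = sum(w)
    sx = sum(wi * xi for wi, xi in zip(w, x))
    sy = sum(wi * yi for wi, yi in zip(w, y))
    sxx = sum(wi * xi * xi for wi, xi in zip(w, x))
    sxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    b1 = (sxy - sx * sy / sw) / (sxx - sx * sx / sw)
    b0 = (sy - b1 * sx) / sw
    return b0, b1

voltages = [0.0, 1.0, 2.0]
forces = [1.0, 3.0, 5.0]      # lies exactly on force = 1 + 2 * voltage
weights = [1.0, 2.0, 1.0]     # e.g. inverse variance of each reading
print(weighted_least_squares(voltages, forces, weights))  # → (1.0, 2.0)
```

Down-weighting noisy gauge readings is what distinguishes this fit from ordinary least squares, which weights every reading equally.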
By analyzing the shortcomings of reliability test design and considering the producer's risk and the consumer's risk, information fusion technology is used to set up a reliability test design model (RTDM). By analyzing the demands and constraint conditions of the RTDM, and with applications of the Bayesian approach and the Monte Carlo method (MCM), this paper puts forward exponentially distributed subsystems and the information fusion technology among them. According to the posterior risk criteria, formulas for the producer's risk and the consumer's risk are also derived, and with the help of Matlab software, selection of the optimum test plan is solved. Finally, the validity of the model is demonstrated by a test of a series-parallel system.
Coptis chinensis (Huanglian) is a commonly used traditional Chinese medicine (TCM) herb, and alkaloids are its most important chemical constituents. In the present study, an isocratic reverse-phase high performance liquid chromatography (RP-HPLC) method allowing the separation of six alkaloids in Huanglian was developed for the first time under quality by design (QbD) principles. First, five chromatographic parameters were identified to construct a Plackett-Burman experimental design. The critical resolution, analysis time, and peak width were the responses modeled by multivariate linear regression. The results showed that the percentage of acetonitrile, the concentration of sodium dodecyl sulfate, and the concentration of potassium phosphate monobasic were statistically significant parameters (P < 0.05). Then, a Box-Behnken experimental design was applied to further evaluate the interactions between the three parameters on the selected responses. Full quadratic models were built and used to establish the analytical design space. Moreover, the reliability of the design space was estimated by the Bayesian posterior predictive distribution. The optimal separation was predicted at 40% acetonitrile, 1.7 g·mL^(-1) of sodium dodecyl sulfate, and 0.03 mol·mL^(-1) of potassium phosphate monobasic. Finally, the accuracy profile methodology was used to validate the established HPLC method. The results demonstrated that the QbD concept can be efficiently used to develop a robust RP-HPLC analytical method for Huanglian.
This paper introduces Bayesian optimal design methods for step-stress accelerated life test planning with one accelerating variable, when the acceleration model is linear in the accelerating variable or a function of it, based on censored data from a log-location-scale distribution. In order to find the optimal plan, we propose different Monte Carlo simulation algorithms for different Bayesian optimality criteria. We present an example using the lognormal life distribution with Type-I censoring to illustrate the different Bayesian methods and to examine the effects of the prior distribution and sample size. By comparing the different Bayesian methods, we suggest that the B1(τ) method be adopted when the sample size is large and the B2(τ) method when it is small. Finally, the Bayesian optimal plans are compared with the plan obtained by the maximum likelihood method.
Bayesian adaptive randomization has attracted increasing attention in the literature and has been implemented in many phase II clinical trials. The doubly adaptive biased coin design (DBCD) is a superior choice among response-adaptive designs owing to its promising properties. In this paper, we propose a randomized design that combines Bayesian adaptive randomization with the doubly adaptive biased coin design. By selecting a fixed tuning parameter, the proposed randomization procedure can target an explicit allocation proportion while assigning more patients to the better treatment. Moreover, the proposed randomization is efficient at detecting treatment differences. We illustrate the proposed design through applications to both discrete and continuous responses, and evaluate its operating characteristics through simulation studies.
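The DBCD component can be sketched with the commonly used Hu-Zhang allocation function; this is a generic illustration of how a DBCD steers allocation toward a target, not the paper's combined procedure, and the tuning value γ = 2 is an assumption:

```python
# Hu-Zhang DBCD allocation function g(x, y): probability of assigning the
# next patient to treatment 1, given the current observed proportion x on
# treatment 1 and the target proportion y. gamma is the tuning parameter.
def dbcd_allocation(x, y, gamma=2.0):
    if x <= 0.0:
        return 1.0  # no one on treatment 1 yet: assign there
    if x >= 1.0:
        return 0.0  # everyone on treatment 1: assign to the other arm
    num = y * (y / x) ** gamma
    den = num + (1.0 - y) * ((1.0 - y) / (1.0 - x)) ** gamma
    return num / den

# Under-allocated arm (x < y) gets a boosted assignment probability.
print(round(dbcd_allocation(x=0.5, y=0.6), 3))
```

When the observed proportion equals the target, the function returns the target itself; deviations in either direction are pushed back, which is how the design tracks an explicit allocation proportion.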
The Bayesian neural network approach has been employed to improve the nuclear magnetic moment predictions of odd-A nuclei. The Schmidt magnetic moment obtained from the extreme single-particle shell model yields large root-mean-square (rms) deviations from the data, i.e., 0.949 μN for odd-neutron nuclei and 1.272 μN for odd-proton nuclei. By including the dependence on the nuclear spin and the Schmidt magnetic moment, the machine-learning approach precisely describes the magnetic moments of odd-A nuclei, with rms deviations of 0.036 μN for odd-neutron nuclei and 0.061 μN for odd-proton nuclei. Furthermore, the evolution of magnetic moments along isotopic chains, including the staggering and sudden-jump trends that are difficult to describe using nuclear models, has been well reproduced by the Bayesian neural network (BNN) approach. The magnetic moments of doubly closed-shell ±1 nuclei, for example the isoscalar and isovector magnetic moments, have been well studied and compared with the corresponding non-relativistic and relativistic calculations.
Background: Leprosy is an infectious disease caused by Mycobacterium leprae and remains a source of preventable disability if left undetected. Case detection delay is an important epidemiological indicator of progress in interrupting transmission and preventing disability in a community. However, no standard method exists to effectively analyse and interpret this type of data. In this study, we aim to evaluate the characteristics of leprosy case detection delay data and to select an appropriate model for the variability of detection delays based on the best-fitting distribution type. Methods: Two sets of leprosy case detection delay data were evaluated: a cohort of 181 patients from the post-exposure prophylaxis for leprosy (PEP4LEP) study in high-endemic districts of Ethiopia, Mozambique, and Tanzania; and self-reported delays from 87 individuals in 8 low-endemic countries collected as part of a systematic literature review. Bayesian models were fit to each dataset to assess which probability distribution (log-normal, gamma, or Weibull) best describes the variation in observed case detection delays, using leave-one-out cross-validation, and to estimate the effects of individual factors. Results: For both datasets, detection delays were best described by a log-normal distribution combined with the covariates age, sex, and leprosy subtype [expected log predictive density (ELPD) for the joint model: -1123.9]. Patients with multibacillary (MB) leprosy experienced longer delays compared to paucibacillary (PB) leprosy, with a relative difference of 1.57 [95% Bayesian credible interval (BCI): 1.14-2.15]. Those in the PEP4LEP cohort had 1.51 (95% BCI: 1.08-2.13) times longer case detection delays than the self-reported patient delays in the systematic review. Conclusions: The log-normal model presented here could be used to compare leprosy case detection delay datasets, including PEP4LEP, where the primary outcome measure is the reduction in case detection delay. We recommend applying this modelling approach to test different probability distributions and covariate effects in studies with similar outcomes in the field of leprosy and other skin-NTDs.
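A simplified, non-Bayesian sketch of the comparison (not the paper's full model with covariates): fit a log-normal to each group's delays by maximum likelihood and compare groups by the ratio of medians, exp(μ₁ − μ₂), which is the kind of relative difference reported above. The delay values are made up:

```python
import math

# Sketch (not the paper's full Bayesian model): log-normal MLE per group,
# then the ratio of medians exp(mu_a - mu_b) as the relative difference.
def lognormal_mle(delays):
    """MLE of (mu, sigma) for a log-normal fit to positive delays."""
    logs = [math.log(d) for d in delays]
    mu = sum(logs) / len(logs)
    var = sum((l - mu) ** 2 for l in logs) / len(logs)
    return mu, math.sqrt(var)

def median_ratio(group_a, group_b):
    """Relative difference in delay between two groups (ratio of medians)."""
    mu_a, _ = lognormal_mle(group_a)
    mu_b, _ = lognormal_mle(group_b)
    return math.exp(mu_a - mu_b)

# Made-up delays (months): group_a's median is twice group_b's.
print(round(median_ratio([2.0, 4.0, 8.0], [1.0, 2.0, 4.0]), 3))
```

On the log scale a difference in means becomes a multiplicative factor on the original scale, which is why log-normal results are naturally reported as ratios such as 1.57.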
The problem of online pricing with offline data, among other similar online decision-making problems with offline data, aims at designing and evaluating online pricing policies in the presence of a certain amount of existing offline data. To evaluate pricing policies when offline data are available, the decision maker can position herself either at the time point when the offline data have already been observed and are viewed as deterministic, or at the time point when the offline data have not yet been generated and are viewed as stochastic. We develop a framework to discuss how and why these two positions are relevant to online policy evaluation, from a worst-case perspective and from a Bayesian perspective. We then use a simple online pricing setting with offline data to illustrate the construction of optimal policies under the two approaches and discuss their differences, in particular whether the search for the optimal policy can be decomposed into independent subproblems optimized separately, and whether a deterministic optimal policy exists.
Longitudinal data with ordinal outcomes commonly arise in clinical and social studies, where the interest is usually in quantile curves rather than a simple reference range. In this paper we consider Bayesian nonlinear quantile regression for longitudinal ordinal data through a latent variable. An efficient Metropolis-Hastings within Gibbs algorithm was developed for model fitting. Simulation studies and a real data example are conducted to assess the performance of the proposed method. Results show that the proposed approach performs well.
The state of the art in vehicle automation will lead to a mixed traffic environment in the coming years, where connected and automated vehicles have to interact with human-driven vehicles. In this context, it is necessary to have intention prediction models capable of forecasting how a traffic scenario will evolve with respect to the physical state of the vehicles, the possible maneuvers, and the interactions between traffic participants within the seconds to come. This article presents a Bayesian approach to vehicle intention forecasting, utilizing a game-theoretic framework in the form of a Mixed Strategy Nash Equilibrium (MSNE) as a prior estimate to model the reciprocal influence between traffic participants. The likelihood is then computed based on the Kullback-Leibler divergence. The game is modeled as a static nonzero-sum polymatrix game with individual preferences, a well-known class of strategic games. Finding an MSNE for these games is in the PPAD∩PLS complexity class, with polynomial-time tractability. The approach shows good results in simulations over a long-term horizon (10 s), with a computational complexity that allows for online applications.
Funding: Supported by the National Science and Technology Support Program of China (Program for the Eleventh Five-Year Plan, Grant Nos. 2006BAC14B03 and 2006BAC05B03) and the National Natural Science Foundation of China (Grant No. 50679043).
Abstract: In order to accurately predict and control the aging process of dams, new information should be collected continuously to renew the quantitative evaluation of dam safety levels. Owing to the complex structural characteristics of dams, it is quite difficult to predict the time-varying factors affecting their safety levels, and it is not feasible to employ dynamic reliability indices to evaluate the actual safety levels of dams. Based on the relevant regulations for dam safety classification in China, a dynamic probability description of dam safety levels was developed. Using the Bayesian approach, effective information mining, and real-time information, this study achieved a more rational evaluation and prediction of dam safety levels. With the Bayesian expression for discrete stochastic variables, the a priori probabilities of the dam safety levels determined by experts were combined with the likelihood probability of the real-time check information, and the probability information for the evaluation of dam safety levels was renewed. The probability index was then applied to dam rehabilitation decision-making. This method helps reduce the difficulty and uncertainty of the evaluation of dam safety levels and complies with the current safety decision-making regulations for dams in China. It also enhances the application of current risk analysis methods to dam safety levels.
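The discrete Bayesian update described above (expert priors over safety levels combined with the likelihood of real-time check information) is standard Bayes' rule over a finite set of states. A minimal sketch, with the three safety levels and all probabilities invented for illustration:

```python
def bayes_update(prior, likelihood):
    """Discrete Bayes rule: posterior is proportional to prior * likelihood, normalized."""
    unnorm = [p * l for p, l in zip(prior, likelihood)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

# Hypothetical three safety levels (normal, deficient, dangerous):
prior = [0.70, 0.25, 0.05]        # expert a priori probabilities
likelihood = [0.30, 0.55, 0.15]   # likelihood of the real-time check result under each level
posterior = bayes_update(prior, likelihood)
print([round(p, 3) for p in posterior])
```

Here the check result is more likely under the "deficient" level, so its posterior probability rises relative to the prior, which is the renewal step the abstract describes.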
Abstract: The delayed S-shaped software reliability growth model (SRGM) is one of the non-homogeneous Poisson process (NHPP) models proposed for software reliability assessment. The model is distinctive because its mean value function reflects the delay in failure reporting: there is a delay between failure detection and reporting time. The model captures the error detection, isolation, and removal processes, and is thus appropriate for software reliability analysis. Predictive analysis in software testing is useful in modifying, debugging, and determining when to terminate the software development testing process. However, Bayesian predictive analyses of the delayed S-shaped model have not been extensively explored. This paper uses the delayed S-shaped SRGM to address four issues in one-sample prediction associated with the software development testing process. A Bayesian approach based on non-informative priors was used to derive explicit solutions for the four issues, and the developed methodologies are illustrated using real data.
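The delayed S-shaped SRGM has the well-known mean value function m(t) = a(1 - (1 + bt)e^(-bt)), where a is the expected total number of faults and b the fault detection rate. A small sketch with hypothetical parameter values (not from the paper's data):

```python
import math

def mean_failures(t, a, b):
    """Delayed S-shaped SRGM mean value function m(t) = a*(1 - (1 + b*t)*exp(-b*t))."""
    return a * (1.0 - (1.0 + b * t) * math.exp(-b * t))

# Hypothetical parameters: a = 100 expected total faults, b = 0.1 per day.
for t in (0, 10, 50, 200):
    print(t, round(mean_failures(t, 100, 0.1), 2))
```

The curve starts flat (the reporting delay), rises steeply, and saturates at a, which is the S shape the abstract refers to.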
Abstract: The soil-water characteristic curve (SWCC) is essential for estimating site-specific unsaturated soil properties (such as unsaturated shear strength and the coefficient of permeability) in geotechnical analyses involving unsaturated soils. An SWCC can be determined by fitting data points obtained according to a prescribed experimental scheme, which is specified by the number of measuring points and the corresponding values of the control variable. The number of measuring points is limited because direct measurement of the SWCC is often costly and time-consuming. Based on a limited number of measuring points, the estimated SWCC is unavoidably associated with uncertainties, which depend on the measurement data obtained from the prescribed experimental scheme. Therefore, it is essential to plan the experimental scheme so as to reduce the uncertainty in the estimated SWCC. This study presents a Bayesian approach, called OBEDO, for probabilistic experimental design optimization of SWCC measurement based on prior knowledge and information about the testing apparatus. The uncertainty in the estimated SWCC is quantified, and the optimal experimental scheme with the maximum expected utility is determined by Subset Simulation optimization (SSO) in the candidate experimental scheme space. The proposed approach is illustrated using an experimental design example given prior knowledge and apparatus information, and is verified using a set of real loess SWCC data, which were used to generate random experimental schemes to mimic the arbitrary arrangement of measuring points during SWCC testing in practice. Results show that an arbitrary arrangement of measuring points is hardly superior to the optimal scheme obtained from OBEDO in terms of expected utility. The proposed OBEDO approach provides a rational tool for optimizing the arrangement of measuring points of an SWCC test so as to obtain measurement data with relatively high expected utility for uncertainty reduction.
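The abstract does not name the SWCC model being fitted; a common choice in the unsaturated-soils literature is the van Genuchten (1980) form, sketched here with loess-like parameter values that are purely illustrative.

```python
def van_genuchten(psi, theta_r, theta_s, alpha, n):
    """van Genuchten SWCC: volumetric water content at matric suction psi.
    theta_r/theta_s are residual/saturated water contents; alpha, n are fitting
    parameters; m = 1 - 1/n (the Mualem constraint)."""
    m = 1.0 - 1.0 / n
    return theta_r + (theta_s - theta_r) / (1.0 + (alpha * psi) ** n) ** m

# Hypothetical parameters for a loess-like soil:
for psi in (0.0, 10.0, 100.0, 1000.0):
    print(psi, round(van_genuchten(psi, 0.05, 0.45, 0.02, 1.6), 3))
```

An experimental design question in this setting is at which suction values psi to place the limited measuring points so that the fitted parameters are least uncertain.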
Funding: Jointly funded by the National Natural Science Foundation of China (Grant No. 41274052) and the Seismological Research Project of China (Grant No. 201208009); financially supported by Peking University President's Research Funding for undergraduate students (2012-2013).
Abstract: In this study, we adopt an improved Bayesian approach based on free-knot B-spline bases to study the spatial and temporal distribution of the b-value. Synthetic tests show that the improved Bayesian approach outperforms both the standard Bayesian approach and the widely used maximum likelihood estimation (MLE) method in fitting the real variation of b-values. We then apply the improved Bayesian approach to North China and find that the b-value is clearly related to seismicity. Temporal changes of b-values are also investigated in two specific areas of North China. We interpret sharp decreases in the b-value as useful signals for earthquake hazard analysis.
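The MLE baseline the abstract compares against is usually Aki's (1965) estimator, b = log10(e) / (mean(M) - Mc), for events above the completeness magnitude Mc. A minimal sketch with a hypothetical catalogue:

```python
import math

def b_value_mle(magnitudes, m_c):
    """Aki's maximum-likelihood b-value for events at or above completeness m_c."""
    m = [x for x in magnitudes if x >= m_c]
    return math.log10(math.e) / (sum(m) / len(m) - m_c)

# Hypothetical small catalogue with completeness magnitude 2.0:
mags = [2.0, 2.2, 2.9, 2.4, 2.6, 2.1, 3.3, 2.3, 2.5, 2.8]
print(round(b_value_mle(mags, 2.0), 3))
```

The improved Bayesian approach of the paper replaces this single global estimate with a b-value surface that varies smoothly in space and time via the free-knot B-splines.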
Funding: HS Lee's work was supported by Sejong University. TY Kim's work was supported by a grant from the National Research Foundation of Korea (NRF-2019R1F1A1060152).
Abstract: This study proposes a new analytical approach to identify the excessive comovement of two markets as contagion. This goal is achieved by linking latent-factor and single-equation error correction models and evaluating the breaks in the short- and long-term relationships and correlatedness in the linked model. The results demonstrate that the short-term relationship, representing the market speed ratio between two markets, plays a key role in contagion dynamics. When a long-term relationship or correlatedness is broken (comovement change) due to a break in the short-term relationship (market speed ratio), contagion is highly likely and should be formally declared. Bayesian posterior probabilities were calculated to determine the cause. Furthermore, this study applies the analytical Bayesian approach to empirically test the contagion effects of the U.S. stock market during the global financial crisis between 2007 and 2009 using 22 developed equity markets.
Abstract: Quantifying tool–tissue interaction forces in surgery can be used in the training of inexperienced surgeons, helping them use surgical tools properly and avoid applying excessive pressure. The voltages read from strain gauges are used to approximate the unknown values of the applied forces. To this end, the force-voltage relationship must be quantified in order to evaluate the interaction forces during surgery. A key problem is the development of appropriate statistical learning approaches to describe the link between the true force applied to the tissue and the various outputs obtained from sensors installed on surgical equipment. In this study, different probabilistic approaches are used to estimate the realized force on tissue from the voltages read from strain gauges, including bootstrapping, Bayesian regression, weighted least squares regression, and multi-level modelling. Estimates from the proposed models are more precise than those of the maximum likelihood and restricted maximum likelihood techniques. The suggested methodologies are capable of assessing tool–tissue interaction forces with an adequate level of accuracy.
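One of the listed approaches, weighted least squares, can be sketched for a simple linear force-voltage calibration. The voltage/force/weight values below are hypothetical and stand in for strain-gauge readings against a reference force sensor.

```python
def wls_line(voltages, forces, weights):
    """Weighted least-squares fit of force = slope * voltage + intercept."""
    sw = sum(weights)
    mx = sum(w * x for w, x in zip(weights, voltages)) / sw
    my = sum(w * y for w, y in zip(weights, forces)) / sw
    sxx = sum(w * (x - mx) ** 2 for w, x in zip(weights, voltages))
    sxy = sum(w * (x - mx) * (y - my) for w, x, y in zip(weights, voltages, forces))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical calibration: gauge voltage (V) vs. reference force (N),
# with the higher-force readings weighted more heavily.
v = [0.1, 0.2, 0.3, 0.4, 0.5]
f = [0.52, 1.03, 1.49, 2.01, 2.55]
w = [1.0, 1.0, 1.0, 2.0, 2.0]
slope, intercept = wls_line(v, f, w)
print(round(slope, 3), round(intercept, 3))
```

Down-weighting noisier observations is the point of WLS here: readings known to be less reliable pull less on the fitted calibration line.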
Funding: National Natural Science Foundation of China (No. 70971133).
Abstract: By analyzing the shortcomings of reliability test design and considering both the producer's risk and the consumer's risk, information fusion technology is used to set up a reliability test design model (RTDM). After analyzing the demands and constraint conditions of the RTDM, and applying the Bayesian approach and the Monte Carlo method (MCM), this paper addresses exponentially distributed subsystems and the information fusion among them. According to the posterior risk criteria, formulas for the producer's risk and the consumer's risk are derived, and, with the help of Matlab, the selection of the optimum test plan is solved. Finally, the validity of the model is demonstrated by a test of a series-parallel system.
Funding: Supported by the National Natural Science Foundation of China (No. 81403112), the Beijing Natural Science Foundation (No. 7154217), the Scientific Research Program of Beijing University of Chinese Medicine (No. 2015-JYB-XS104), and the Special Program for the Beijing Key Laboratory of Chinese Medicine Manufacturing Process Control and Quality Evaluation (No. Z151100001615065).
Abstract: Coptis chinensis (Huanglian) is a commonly used traditional Chinese medicine (TCM) herb, and alkaloids are its most important chemical constituents. In the present study, an isocratic reversed-phase high performance liquid chromatography (RP-HPLC) method allowing the separation of six alkaloids in Huanglian was developed for the first time under quality by design (QbD) principles. First, five chromatographic parameters were identified to construct a Plackett-Burman experimental design. The critical resolution, analysis time, and peak width were the responses modeled by multivariate linear regression. The results showed that the percentage of acetonitrile, the concentration of sodium dodecyl sulfate, and the concentration of potassium phosphate monobasic were statistically significant parameters (P < 0.05). Then, a Box-Behnken experimental design was applied to further evaluate the interactions between the three parameters on the selected responses. Full quadratic models were built and used to establish the analytical design space, and the reliability of the design space was estimated by the Bayesian posterior predictive distribution. The optimal separation was predicted at 40% acetonitrile, 1.7 g·mL^(-1) of sodium dodecyl sulfate, and 0.03 mol·mL^(-1) of potassium phosphate monobasic. Finally, the accuracy profile methodology was used to validate the established HPLC method. The results demonstrate that the QbD concept can be efficiently used to develop a robust RP-HPLC analytical method for Huanglian.
基金Supported by the Natural Science Foundation of China(11401341,11271136 and 81530086)111 Project(B14019)+2 种基金Natural Science Foundation of Fujian Province,China(2015J05014,2016J01681 and 2017N0029)Scientific Research Training Program of Fujian Province University for Distinguished Young Scholar(2015)New Century Excellent Talents Support Project of Fujian Province University([2016]23)
Abstract: This paper introduces Bayesian optimal design methods for step-stress accelerated life test planning with one accelerating variable, when the acceleration model is linear in the accelerating variable or a function of it, based on censored data from a log-location-scale distribution. In order to find the optimal plan, we propose different Monte Carlo simulation algorithms for different Bayesian optimality criteria. We present an example using the lognormal life distribution with Type-I censoring to illustrate the different Bayesian methods and to examine the effects of the prior distribution and sample size. By comparing the different Bayesian methods, we suggest that the B1(τ) method be adopted when the sample size is large and the B2(τ) method when it is small. Finally, the Bayesian optimal plans are compared with the plan obtained by the maximum likelihood method.
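The Monte Carlo evaluation of a candidate plan rests on repeatedly simulating Type-I censored lognormal data under that plan. A minimal sketch of that simulation step, with sample size, censoring time, and lognormal parameters chosen for illustration rather than taken from the paper:

```python
import math
import random

def simulate_type1_censored(mu, sigma, n, tau, seed=1):
    """Draw n lognormal lifetimes and apply Type-I censoring at time tau.
    Returns (observed failure times, number censored)."""
    rng = random.Random(seed)
    times = [math.exp(rng.gauss(mu, sigma)) for _ in range(n)]
    failures = [t for t in times if t <= tau]
    return failures, n - len(failures)

# Hypothetical plan: n = 50 units, censoring time tau = 100 h,
# lognormal parameters mu = 4.0, sigma = 0.8.
fails, censored = simulate_type1_censored(4.0, 0.8, 50, 100.0)
print(len(fails), censored)
```

A Bayesian criterion would then be approximated by averaging a utility (e.g., posterior precision of a quantile of interest) over many such simulated datasets drawn with parameters sampled from the prior.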
Funding: Supported by the National Natural Science Foundation of China (Grant No. 11371366) and the Doctoral Research Fund of Henan Polytechnic University (Grant No. 672103/001/147).
Abstract: Bayesian adaptive randomization has attracted increasing attention in the literature and has been implemented in many phase II clinical trials. The doubly adaptive biased coin design (DBCD) is a superior choice among response-adaptive designs owing to its promising properties. In this paper, we propose a randomized design that combines Bayesian adaptive randomization with the doubly adaptive biased coin design. By selecting a fixed tuning parameter, the proposed randomization procedure can target an explicit allocation proportion and simultaneously assign more patients to the better treatment. Moreover, the proposed randomization is efficient in detecting treatment differences. We illustrate the proposed design through applications to both discrete and continuous responses, and evaluate its operating characteristics through simulation studies.
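The DBCD component is usually implemented with the Hu-Zhang allocation function, which biases the next assignment toward a target proportion rho; gamma is the tuning parameter controlling how aggressively imbalance is corrected. This is a sketch of the standard function, not of the paper's full Bayesian combination, and the numbers below are illustrative.

```python
def dbcd_allocation(x, rho, gamma=2.0):
    """Hu-Zhang DBCD allocation function: probability of assigning the next
    patient to treatment 1, given the current proportion x on treatment 1 and
    the target allocation rho (with 0 < x < 1 and 0 < rho < 1)."""
    num = rho * (rho / x) ** gamma
    den = num + (1 - rho) * ((1 - rho) / (1 - x)) ** gamma
    return num / den

# When the current proportion equals the target, the rule is unbiased:
print(dbcd_allocation(0.6, 0.6))   # 0.6
# An under-represented arm is favoured:
print(dbcd_allocation(0.5, 0.6))
```

In the paper's proposal, rho itself would be updated from the accruing data (via Bayesian adaptive randomization) rather than fixed in advance.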
Funding: Supported by the National Natural Science Foundation of China (11675063, 11875070, 11205068) and the Open Fund for Discipline Construction, Institute of Physical Science and Information Technology, Anhui University.
Abstract: A Bayesian neural network approach has been employed to improve the nuclear magnetic moment predictions of odd-A nuclei. The Schmidt magnetic moment obtained from the extreme single-particle shell model gives large root-mean-square (rms) deviations from data, i.e., 0.949 μN and 1.272 μN for odd-neutron and odd-proton nuclei, respectively. By including the dependence on the nuclear spin and the Schmidt magnetic moment, the machine-learning approach precisely describes the magnetic moments of odd-A nuclei, with rms deviations of 0.036 μN for odd-neutron nuclei and 0.061 μN for odd-proton nuclei. Furthermore, the evolution of magnetic moments along isotopic chains, including the staggering and sudden-jump trends that are difficult to describe using nuclear models, has been well reproduced by the Bayesian neural network (BNN) approach. The magnetic moments of doubly closed-shell ±1 nuclei, for example the isoscalar and isovector magnetic moments, have been studied in detail and compared with the corresponding non-relativistic and relativistic calculations.
Funding: Supported by the European Union, awarded to NLR/LM (grant number RIA2017NIM-1839-PEP-4LEP), and the Leprosy Research Initiative (LRI, www.leprosyresearch.org), awarded to NLR/LM (grant number 707.19.58).
Abstract: Background: Leprosy is an infectious disease caused by Mycobacterium leprae and remains a source of preventable disability if left undetected. Case detection delay is an important epidemiological indicator of progress in interrupting transmission and preventing disability in a community. However, no standard method exists to effectively analyse and interpret this type of data. In this study, we aim to evaluate the characteristics of leprosy case detection delay data and select an appropriate model for the variability of detection delays based on the best-fitting distribution type. Methods: Two sets of leprosy case detection delay data were evaluated: a cohort of 181 patients from the post-exposure prophylaxis for leprosy (PEP4LEP) study in high-endemic districts of Ethiopia, Mozambique, and Tanzania; and self-reported delays from 87 individuals in 8 low-endemic countries collected as part of a systematic literature review. Bayesian models were fit to each dataset to assess which probability distribution (log-normal, gamma, or Weibull) best describes the variation in observed case detection delays using leave-one-out cross-validation, and to estimate the effects of individual factors. Results: For both datasets, detection delays were best described by a log-normal distribution combined with the covariates age, sex, and leprosy subtype [expected log predictive density (ELPD) for the joint model: -1123.9]. Patients with multibacillary (MB) leprosy experienced longer delays compared to those with paucibacillary (PB) leprosy, with a relative difference of 1.57 [95% Bayesian credible interval (BCI): 1.14-2.15]. Those in the PEP4LEP cohort had 1.51 (95% BCI: 1.08-2.13) times longer case detection delays compared to the self-reported patient delays in the systematic review. Conclusions: The log-normal model presented here can be used to compare leprosy case detection delay datasets, including PEP4LEP, where the primary outcome measure is reduction in case detection delay. We recommend applying this modelling approach to test different probability distributions and covariate effects in studies with similar outcomes in the field of leprosy and other skin NTDs.
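The log-normal description of detection delays can be illustrated with a simple maximum-likelihood fit: the log-normal parameters are just the mean and standard deviation of the log delays. The delay values below are hypothetical, and this sketch omits the paper's covariates and Bayesian machinery.

```python
import math
import statistics

def lognormal_mle(delays):
    """MLE of log-normal parameters from observed detection delays:
    mu and sigma are the mean and (population) SD of the log delays;
    exp(mu) is the implied median delay."""
    logs = [math.log(d) for d in delays]
    mu = statistics.fmean(logs)
    sigma = statistics.pstdev(logs)
    return mu, sigma, math.exp(mu)

# Hypothetical detection delays in months:
delays = [3, 6, 6, 12, 24, 2, 9, 18, 4, 36]
mu, sigma, median = lognormal_mle(delays)
print(round(mu, 3), round(sigma, 3), round(median, 2))
```

Working on the log scale also makes multiplicative covariate effects (such as the 1.57-fold MB vs. PB difference reported above) additive and easy to estimate.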
Abstract: The problem of online pricing with offline data, among other similar problems of online decision making with offline data, aims at designing and evaluating online pricing policies in the presence of a certain amount of existing offline data. To evaluate pricing policies when offline data are available, the decision maker can position herself either at the time point when the offline data have already been observed and can be viewed as deterministic, or at the time point when the offline data have not yet been generated and are viewed as stochastic. We develop a framework to discuss how and why these two different positions are relevant to online policy evaluation, from a worst-case perspective and from a Bayesian perspective. We then use a simple online pricing setting with offline data to illustrate the construction of optimal policies for the two approaches and to discuss their differences, especially whether the search for the optimal policy can be decomposed into independent subproblems and optimized separately, and whether a deterministic optimal policy exists.
Funding: Supported in part by the National Key Research and Development Plan (No. 2016YFC0800100) and the National Natural Science Foundation of China (Grants 11671374 and 71631006).
Abstract: Longitudinal data with ordinal outcomes commonly arise in clinical and social studies, where the quantity of interest is usually a set of quantile curves rather than a simple reference range. In this paper, we consider Bayesian nonlinear quantile regression for longitudinal ordinal data through a latent variable. An efficient Metropolis-Hastings within Gibbs algorithm was developed for model fitting. Simulation studies and a real data example are conducted to assess the performance of the proposed method. Results show that the proposed approach performs well.
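Quantile regression of any flavour is built on the check (pinball) loss: minimizing it over a constant recovers the tau-quantile of the data. A minimal sketch of that building block, using hypothetical latent scores and a crude minimization over the data points themselves rather than the paper's latent-variable MCMC:

```python
def check_loss(u, tau):
    """Quantile check loss: rho_tau(u) = u * (tau - 1{u < 0})."""
    return u * (tau - (1.0 if u < 0 else 0.0))

def sample_quantile(data, tau):
    """Estimate the tau-quantile by minimizing the summed check loss
    over candidate values taken from the data itself."""
    return min(data, key=lambda q: sum(check_loss(y - q, tau) for y in data))

# Hypothetical latent continuous scores underlying ordinal responses:
ys = [1.2, 0.4, 2.5, 3.1, 0.9, 1.8, 2.2]
print(sample_quantile(ys, 0.5), sample_quantile(ys, 0.9))
```

In the Bayesian version, the check loss corresponds to an asymmetric Laplace working likelihood, which is what makes quantile regression amenable to Gibbs-type samplers.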
Abstract: The state of the art in vehicle automation will lead to a mixed traffic environment in the coming years, where connected and automated vehicles have to interact with human-driven vehicles. In this context, it is necessary to have intention prediction models capable of forecasting how a traffic scenario will evolve with respect to the physical state of the vehicles, the possible maneuvers, and the interactions between traffic participants within the seconds to come. This article presents a Bayesian approach for vehicle intention forecasting that utilizes a game-theoretic framework in the form of a mixed-strategy Nash equilibrium (MSNE) as a prior estimate to model the reciprocal influence between traffic participants. The likelihood is then computed based on the Kullback-Leibler divergence. The game is modeled as a static nonzero-sum polymatrix game with individual preferences, a well-known class of strategic games. Finding the MSNE for these games is in the PPAD∩PLS complexity class, with polynomial-time tractability. The approach shows good results in simulations over a long-term horizon (10 s), and its computational complexity allows for online applications.
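For a pairwise interaction in such a polymatrix game, the fully mixed equilibrium of a 2x2 bimatrix game is found from the indifference conditions: each player mixes so the other is indifferent between both actions. The "merge vs. yield" payoffs below are hypothetical, chosen only to illustrate the computation; they are not from the article.

```python
def msne_2x2(A, B):
    """Fully mixed-strategy Nash equilibrium of a 2x2 bimatrix game
    (A: row player's payoffs, B: column player's payoffs), assuming an
    interior equilibrium exists. Each player's mix makes the opponent
    indifferent between their two actions."""
    # p = P(row plays action 0): chosen to make the COLUMN player indifferent.
    p = (B[1][1] - B[1][0]) / (B[0][0] - B[0][1] - B[1][0] + B[1][1])
    # q = P(column plays action 0): chosen to make the ROW player indifferent.
    q = (A[1][1] - A[0][1]) / (A[0][0] - A[0][1] - A[1][0] + A[1][1])
    return p, q

# Hypothetical "merge vs. yield" encounter between two vehicles
# (action 0 = merge, action 1 = yield; simultaneous merging is costly):
A = [[-5.0, 2.0], [0.0, 1.0]]   # row driver's payoffs
B = [[-5.0, 0.0], [2.0, 1.0]]   # column driver's payoffs
print(msne_2x2(A, B))
```

Here both drivers merge with probability 1/6 at equilibrium; such an MSNE serves as the prior over maneuvers that the Bayesian forecaster then updates with observed motion.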