Traditional global sensitivity analysis (GSA) neglects the epistemic uncertainties associated with the probabilistic characteristics (i.e., the distribution type and its parameters) of input rock properties that arise from small datasets while mapping the relative importance of properties to the model response. This paper proposes an augmented Bayesian multi-model inference (BMMI) coupled with GSA methodology (BMMI-GSA) to address this issue by estimating the imprecision in the moment-independent sensitivity indices of rock structures arising from the small size of input data. The methodology employs BMMI to quantify the epistemic uncertainties associated with the model type and parameters of input properties. The estimated uncertainties are propagated into the imprecision of the moment-independent Borgonovo's indices by employing a reweighting approach on candidate probabilistic models. The proposed methodology is showcased for a rock slope prone to stress-controlled failure in the Himalayan region of India. It proved superior to conventional GSA (which neglects all epistemic uncertainties) and Bayesian-coupled GSA (B-GSA, which neglects model uncertainty) owing to its capability to incorporate the uncertainties in both the model type and the parameters of properties. The imprecise Borgonovo's indices estimated via the proposed methodology provide confidence intervals for the sensitivity indices instead of fixed-point estimates, which better informs data collection efforts. Analyses performed with varying sample sizes suggested that the uncertainties in sensitivity indices reduce significantly with increasing sample size, and that an accurate importance ranking of properties is only possible with large samples. Further, the impact of the prior knowledge in terms of prior ranges and distributions was significant; hence, any related assumption should be made carefully.
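The moment-independent index at the heart of this methodology, Borgonovo's delta_i = (1/2) E_{X_i}[ integral |f_Y(y) - f_{Y|X_i}(y)| dy ], can be estimated directly from Monte Carlo samples. The sketch below is a minimal histogram-based illustration of the index itself, not of the BMMI reweighting step; the bin counts and the toy linear model are arbitrary choices for demonstration.

```python
import random

def borgonovo_delta(xs, ys, n_bins=10, n_ybins=20):
    """Histogram-based estimate of Borgonovo's moment-independent index:
    delta = 0.5 * E_X[ integral |f_Y - f_{Y|X}| dy ]."""
    n = len(xs)
    ylo, yhi = min(ys), max(ys)
    width = (yhi - ylo) / n_ybins or 1.0

    def hist(vals):
        # Density histogram of vals over the common y-grid.
        counts = [0] * n_ybins
        for v in vals:
            k = min(int((v - ylo) / width), n_ybins - 1)
            counts[k] += 1
        return [c / (len(vals) * width) for c in counts]

    f_y = hist(ys)  # unconditional density of the model output
    # Equal-probability conditioning bins on X via sorting.
    order = sorted(range(n), key=lambda i: xs[i])
    per = n // n_bins
    delta = 0.0
    for b in range(n_bins):
        idx = order[b * per:(b + 1) * per]
        f_cond = hist([ys[i] for i in idx])
        # Half total-variation distance between conditional and unconditional.
        tv = 0.5 * sum(abs(fc - fu) for fc, fu in zip(f_cond, f_y)) * width
        delta += tv / n_bins
    return delta

random.seed(0)
x1 = [random.gauss(0, 1) for _ in range(20000)]
x2 = [random.gauss(0, 1) for _ in range(20000)]
y = [a + 0.1 * b for a, b in zip(x1, x2)]  # output depends mostly on x1
d1 = borgonovo_delta(x1, y)
d2 = borgonovo_delta(x2, y)
```

As expected for this toy model, the influential input x1 receives a much larger index than x2; rerunning the estimate under each candidate distribution of an input (and reweighting by model probability) is what yields the imprecise indices of the paper.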
An accurate plasma current profile has irreplaceable value for the steady-state operation of the plasma. In this study, plasma current tomography based on Bayesian inference is applied to the HL-2A device and used to reconstruct the plasma current profile. Two different Bayesian priors are tried, namely the Conditional Auto-Regressive (CAR) prior and the Advanced Squared Exponential (ASE) kernel prior. Compared to the CAR prior, the ASE kernel prior adopts nonstationary hyperparameters and introduces the current profile of a reference discharge into the hyperparameters, which makes the shape of the current profile more flexible in space. The results indicate that the ASE prior couples more information, reduces the probability of unreasonable solutions, and achieves higher reconstruction accuracy.
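The squared-exponential kernel prior at the core of this approach can be illustrated with a toy one-dimensional reconstruction. The sketch below is a minimal stationary Gaussian-process regression on a synthetic peaked "current profile"; the kernel amplitude, length scale, noise level, and profile shape are all illustrative assumptions (the paper's ASE prior additionally uses nonstationary hyperparameters informed by a reference discharge).

```python
import math

def se_kernel(x1, x2, amp=1.0, ell=0.3):
    # Squared-exponential (RBF) covariance between two radial positions.
    return amp * math.exp(-0.5 * ((x1 - x2) / ell) ** 2)

def solve(A, b):
    # Gaussian elimination with partial pivoting on the augmented matrix.
    n = len(A)
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def gp_posterior_mean(xs_obs, ys_obs, xs_new, noise=0.05):
    # Posterior mean m(x) = k(x, X) @ (K + noise^2 I)^(-1) y.
    K = [[se_kernel(a, b) + (noise ** 2 if i == j else 0.0)
          for j, b in enumerate(xs_obs)] for i, a in enumerate(xs_obs)]
    alpha = solve(K, ys_obs)
    return [sum(se_kernel(x, xo) * al for xo, al in zip(xs_obs, alpha))
            for x in xs_new]

# Toy "current profile": peaked at the magnetic axis (r = 0).
profile = lambda r: math.exp(-(r / 0.4) ** 2)
obs_r = [i / 9 for i in range(10)]
obs_j = [profile(r) + 0.01 * ((-1) ** i) for i, r in enumerate(obs_r)]
recon = gp_posterior_mean(obs_r, obs_j, [0.0, 0.5, 1.0])
```

The posterior mean tracks the peaked profile and decays toward the edge, showing how the kernel prior regularizes the inversion; in the tomography setting the observations would instead be magnetic measurements linked to the current through a forward operator.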
In this work, we perform a Bayesian inference of the crust-core transition density ρ_t of neutron stars based on neutron-star radius and neutron-skin thickness data using a thermodynamical method. Uniform and Gaussian distributions were adopted as priors for ρ_t in the Bayesian approach. When the uniform prior and neutron-star radius data were used, ρ_t had a larger probability of taking values higher than 0.1 fm^−3. This was found to be controlled by the curvature K_sym of the nuclear symmetry energy: the phenomenon did not occur if K_sym was not extremely negative, namely K_sym > −200 MeV. The value of ρ_t obtained was 0.075_(−0.01)^(+0.005) fm^−3 at a confidence level of 68% when both the neutron-star radius and neutron-skin thickness data were considered. Strong anti-correlations were observed between ρ_t, the slope L, and the curvature of the nuclear symmetry energy. The dependence of the crust-core transition density and pressure on the three L–K_sym correlations predicted in the literature was quantitatively investigated. The most probable value of 0.08 fm^−3 for ρ_t was obtained from the L–K_sym relationship proposed by Holt et al., while larger values were preferred by the other two relationships.
Knowledge graph technology has distinct advantages for fault diagnosis. In this study, the control rod drive mechanism (CRDM) of the liquid-fuel thorium molten salt reactor (TMSR-LF1) was taken as the research object, and a fault diagnosis system based on a knowledge graph was proposed. Subject–relation–object triples were defined based on CRDM unstructured data, including the design specification, operation and maintenance manual, alarm list, and other forms of expert experience. We constructed a fault event ontology model to label the entities and relationships involved in the corpus of CRDM fault events. A three-layer robustly optimized bidirectional encoder representation from transformers (RBT3) pre-training approach combined with a text convolutional neural network (TextCNN) was introduced to facilitate the application of the constructed CRDM fault diagnosis graph database for fault query. The RBT3-TextCNN model, along with the Jieba tool, is proposed for extracting entities and recognizing the fault query intent simultaneously. Experiments on the dataset collected from TMSR-LF1 CRDM fault diagnosis unstructured data demonstrate that this model can improve intent recognition and entity extraction. Additionally, a fault alarm monitoring module was developed based on the WebSocket protocol to automatically deliver detailed information about a detected fault to the operator. Furthermore, the Bayesian inference method combined with the variable elimination algorithm was used to enable a relatively intelligent and reliable fault diagnosis system. Finally, a CRDM fault diagnosis Web interface integrated with graph data visualization was constructed, making the CRDM fault diagnosis process intuitive and effective.
The estimation of model parameters is an important subject in engineering. In this area of work, the prevailing approach is to estimate or calculate these as deterministic parameters. In this study, we consider model parameters from the perspective of random variables and describe the general form of the parameter distribution inference problem. Under this framework, we propose an ensemble Bayesian method that combines Bayesian inference with the Markov chain Monte Carlo (MCMC) method. Experiments on a finite cylindrical reactor and a 2D IAEA benchmark problem show that the proposed method converges quickly and can estimate parameters effectively, even for several correlated parameters simultaneously. Our experiments include cases involving engineering software calls, demonstrating that the method can be applied in engineering practice, such as nuclear reactor engineering.
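The core of such an approach, treating a model parameter as a random variable and sampling its posterior with MCMC, can be sketched with a random-walk Metropolis sampler. The forward model, prior range, noise level, and step size below are all toy assumptions, not the paper's reactor models.

```python
import math, random

def log_posterior(theta, data, forward, sigma=0.2):
    # Flat prior on [0, 10]; Gaussian likelihood around the forward model.
    if not 0.0 <= theta <= 10.0:
        return -math.inf
    pred = forward(theta)
    return -sum((d - pred) ** 2 for d in data) / (2 * sigma ** 2)

def metropolis(data, forward, n_steps=20000, step=0.1, seed=1):
    """Random-walk Metropolis sampling of the parameter posterior."""
    random.seed(seed)
    theta = 5.0  # deliberately poor starting point
    lp = log_posterior(theta, data, forward)
    chain = []
    for _ in range(n_steps):
        prop = theta + random.gauss(0, step)
        lp_prop = log_posterior(prop, data, forward)
        if math.log(random.random()) < lp_prop - lp:  # accept/reject
            theta, lp = prop, lp_prop
        chain.append(theta)
    return chain[n_steps // 2:]  # discard burn-in

# Toy nonlinear response: forward(k) = k^2 / (1 + k); true k = 2.0.
forward = lambda k: k * k / (1.0 + k)
random.seed(0)
data = [forward(2.0) + random.gauss(0, 0.2) for _ in range(50)]
chain = metropolis(data, forward)
k_hat = sum(chain) / len(chain)
```

The retained chain is a sample from the parameter's posterior distribution, so its spread directly quantifies the parameter uncertainty rather than producing a single deterministic estimate.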
In this proceeding, some highlighted results on the constraints of the nuclear matter equation of state (EOS) from data on nucleus resonances and neutron-skin thickness, obtained using the Bayesian approach based on the Skyrme-Hartree-Fock model and its extensions, are presented. In particular, the anti-correlation and positive correlation between the slope parameter and the value of the symmetry energy at the saturation density under the constraints of the neutron-skin thickness and the isovector giant dipole resonance, respectively, are discussed. It is shown that the Bayesian analysis can help to find a compromise for the "PREX-II puzzle" and the "soft Tin puzzle". The possible modifications of the constraints on lower-order EOS parameters, as well as the relevant correlations when higher-order EOS parameters are incorporated as independent variables, are further illustrated. For a given model and parameter space, the Bayesian approach serves as a good analysis tool suited to multi-messenger versus multi-variable problems and is helpful for quantitatively constraining the model parameters and their correlations.
Decision-theoretic interval estimation requires the use of loss functions that typically take into account the size and the coverage of the sets. We here consider the class of monotone loss functions that, under quite general conditions, guarantee Bayesian optimality of highest posterior probability sets. We focus on three specific families of monotone losses, namely the linear, exponential, and rational losses, which differ in the way the sizes of the sets are penalized. Within the standard yet important setup of a normal model we propose: 1) an optimality analysis, to compare the solutions yielded by the alternative classes of losses; 2) a regret analysis, to evaluate the additional loss of standard non-optimal intervals of fixed credibility. The article uses an application to a clinical trial as an illustrative example.
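For the normal model, the highest-posterior-density set of a given credibility is the central interval by symmetry and unimodality, and the regret of a non-optimal interval of the same length can be computed directly. A minimal sketch follows; the linear loss k·size + miscoverage and the constant k = 0.05 are illustrative assumptions, not the paper's calibrated losses.

```python
from statistics import NormalDist

def hpd_normal(mu, sigma, cred=0.95):
    """Highest-posterior-density set of a N(mu, sigma^2) posterior.
    By symmetry and unimodality it is the central interval."""
    z = NormalDist().inv_cdf(0.5 + cred / 2)
    return mu - z * sigma, mu + z * sigma

def expected_linear_loss(lo_, hi_, mu=0.0, sigma=1.0, k=0.05):
    # Linear monotone loss: k * size + probability of non-coverage.
    nd = NormalDist(mu, sigma)
    cover = nd.cdf(hi_) - nd.cdf(lo_)
    return k * (hi_ - lo_) + (1.0 - cover)

lo, hi = hpd_normal(0.0, 1.0, 0.95)
# Regret of a shifted (hence non-HPD) interval of the same length:
regret = expected_linear_loss(lo + 0.5, hi + 0.5) - expected_linear_loss(lo, hi)
```

Shifting the interval keeps its size but moves probability mass outside it, so the expected linear loss rises; the positive regret quantifies the cost of using a non-optimal set of the same length.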
The genus Silurus, an important group of catfishes, exhibits a heterogeneous distribution in Eurasian freshwater systems. This group includes economically important and endangered species, thereby attracting considerable scientific interest. Despite this interest, the lack of a comprehensive phylogenetic framework impedes our understanding of the mechanisms underlying the extensive diversity found within this genus. Herein, we analyzed 89 newly sequenced and 20 previously published mitochondrial genomes (mitogenomes) from 13 morphological species to reconstruct the phylogenetic relationships, biogeographic history, and species diversity of Silurus. Our phylogenetic reconstructions identified eight clades, supported by both maximum-likelihood and Bayesian inference. Sequence-based species delimitation analyses yielded multiple molecular operational taxonomic units (MOTUs) in several taxa, including the Silurus asotus complex (four MOTUs) and Silurus microdorsalis (two MOTUs), suggesting that species diversity is underestimated in the genus. A reconstructed time-calibrated tree of Silurus species provided an age estimate for the most recent common ancestor of approximately 37.61 million years ago (Ma), with divergences among clades within the genus occurring between 11.56 Ma and 29.44 Ma, and divergences among MOTUs within species occurring between 3.71 Ma and 11.56 Ma. Biogeographic reconstructions suggested that the ancestral area for the genus likely encompassed China and the Korean Peninsula, with multiple inferred dispersal events to Europe and Central and Western Asia between 21.78 Ma and 26.67 Ma, and to Japan between 2.51 Ma and 18.42 Ma. Key factors such as the Eocene-Oligocene extinction event, the onset and intensification of the monsoon system, and glacial cycles associated with sea-level fluctuations have likely played significant roles in shaping the evolutionary history of the genus Silurus.
A novel extended Lindley lifetime model that exhibits unimodal or decreasing density shapes as well as increasing, bathtub, or unimodal-then-bathtub failure rates, named the Marshall-Olkin-Lindley (MOL) model, is studied. In this research, various inferences for the MOL model parameters are introduced using a progressively Type-II censored sample. Utilizing the maximum likelihood method as a classical approach, estimators of the model parameters and various reliability measures are investigated. Under both symmetric and asymmetric loss functions, Bayesian estimates are obtained using the Markov chain Monte Carlo (MCMC) technique with the assumption of independent gamma priors. From the Fisher information matrix and the simulated Markovian chains, the approximate asymptotic interval and the highest posterior density interval, respectively, of each unknown parameter are calculated. Via an extensive simulation study, the usefulness of the various suggested strategies is assessed with respect to evaluation metrics such as mean squared errors, mean relative absolute biases, average confidence lengths, and coverage percentages. Comparing the Bayesian estimates based on the asymmetric loss function to the traditional technique and the symmetric-loss-function-based Bayesian estimates, the analysis demonstrates that the asymmetric-loss-function-based Bayesian estimates are preferred. Finally, two data sets, representing vinyl chloride and repairable mechanical equipment items, are investigated to support the proposed approaches and show the superiority of the proposed model over fourteen other lifetime models.
A nonparametric Bayesian method is presented to classify MPSK (M-ary phase shift keying) signals. MPSK signals with unknown signal-to-noise ratios (SNRs) are modeled as a Gaussian mixture model with unknown means and covariances in the constellation plane, and a clustering method is proposed to estimate the probability density of the MPSK signals. The method is based on nonparametric Bayesian inference, which introduces the Dirichlet process as the prior probability of the mixture coefficients and applies a normal-inverse-Wishart (NIW) distribution as the prior probability of the unknown means and covariances. Then, according to the received signals, the parameters are adjusted by a Markov chain Monte Carlo (MCMC) random sampling algorithm. Through iterations, the density estimate of the MPSK signals is obtained. Simulation results show that the correct recognition ratio for 2/4/8PSK is greater than 95% when SNR > 5 dB and 1600 symbols are used.
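The constellation-clustering idea can be illustrated with a deliberately simplified stand-in: a fixed-K hard-clustering (k-means) pass over a noisy QPSK constellation, rather than the paper's Dirichlet-process mixture with NIW priors and MCMC sampling. The symbol count, noise level, and coarse initial centers below are illustrative assumptions.

```python
import cmath, math, random

def kmeans(points, centers, iters=20):
    """Fixed-K hard clustering in the complex (I/Q) plane."""
    for _ in range(iters):
        groups = [[] for _ in centers]
        for p in points:
            j = min(range(len(centers)), key=lambda j: abs(p - centers[j]))
            groups[j].append(p)
        # Update each center to its cluster mean (keep it if cluster is empty).
        centers = [sum(g) / len(g) if g else c
                   for g, c in zip(groups, centers)]
    return centers

# Noisy QPSK constellation (per-axis noise sigma 0.15, i.e. SNR well above 5 dB).
random.seed(7)
true_pts = [cmath.exp(1j * (math.pi / 4 + k * math.pi / 2)) for k in range(4)]
rx = [random.choice(true_pts)
      + complex(random.gauss(0, 0.15), random.gauss(0, 0.15))
      for _ in range(1600)]
init = [complex(x, y) for x in (-0.7, 0.7) for y in (-0.7, 0.7)]
centers = kmeans(rx, init)
# Each estimated center should sit near one true constellation point.
err = max(min(abs(c - t) for t in true_pts) for c in centers)
```

Unlike this sketch, the Dirichlet-process formulation does not need the number of clusters (and hence the modulation order M) to be fixed in advance; that is precisely what makes it suitable for blind classification.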
In the present work, we are interested in studying the joint distributions of pairs of the monthly maxima of the pollutants used by the environmental authorities in Mexico City to classify air quality in the metropolitan area. In order to obtain the joint distributions, a copula is considered. Since we are analyzing monthly maxima, the extreme value distributions of Weibull and Fréchet are taken into account. Using these two distributions as marginal distributions in the copula, a Bayesian inference was made in order to estimate the parameters of both distributions and also the association parameters appearing in the copula model. The pollutants taken into account are ozone, nitrogen dioxide, sulphur dioxide, carbon monoxide, and particulate matter with diameters smaller than 10 and 2.5 microns, obtained from the Mexico City monitoring network. The estimation was performed by taking samples of the parameters generated through a Markov chain Monte Carlo algorithm implemented using the software OpenBUGS. Once implemented, the algorithm is applied to pairs of pollutants where one coordinate of the pair is ozone and the other varies over the set of remaining pollutants. Depending on the pollutant and the region where the measurements were collected, different results were obtained: in some cases the best model has a Fréchet distribution as the marginal distribution for both pollutants, and in others the most suitable model assumes a Fréchet for ozone and a Weibull for the other pollutant. Results show that, in the present case, the estimated association parameter is a good representation of the correlation between the pair of pollutants analyzed. Additionally, it is a straightforward task to obtain these correlation parameters from the corresponding association parameters.
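The construction, extreme-value marginals tied together by a copula, can be sketched by forward sampling. The example below uses a Gaussian copula purely for illustration (the abstract does not name the copula family), with Fréchet and Weibull marginals whose shape and scale parameters are arbitrary choices.

```python
import math, random
from statistics import NormalDist

nd = NormalDist()

def frechet_inv(u, alpha=2.5, s=1.0):
    # Inverse CDF of Frechet(alpha, scale s): F(x) = exp(-(x/s)^(-alpha)).
    return s * (-math.log(u)) ** (-1.0 / alpha)

def weibull_inv(u, k=1.8, lam=1.0):
    # Inverse CDF of Weibull(k, lam): F(x) = 1 - exp(-(x/lam)^k).
    return lam * (-math.log(1.0 - u)) ** (1.0 / k)

def sample_pair(rho, n, seed=11):
    """Draw (ozone, other-pollutant) maxima from a Gaussian copula with
    Frechet and Weibull marginals."""
    random.seed(seed)
    out = []
    for _ in range(n):
        # Correlated standard normals -> uniforms -> marginal quantiles.
        z1 = random.gauss(0, 1)
        z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * random.gauss(0, 1)
        u1, u2 = nd.cdf(z1), nd.cdf(z2)
        out.append((frechet_inv(u1), weibull_inv(u2)))
    return out

pairs = sample_pair(0.7, 5000)
```

The association parameter rho of the copula induces a positive correlation between the two transformed maxima, mirroring the paper's point that the correlation between pollutants can be read off from the estimated association parameter.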
To reach a higher level of autonomy for unmanned combat aerial vehicles (UCAVs) in air combat games, this paper builds an autonomous maneuver decision system. In this system, the air combat game is regarded as a Markov process, so that the air combat situation can be effectively calculated via Bayesian inference theory. According to the situation assessment result, the system adaptively adjusts the weights of the maneuver decision factors, which makes the objective function more reasonable and ensures a superior situation for the UCAV. As the air combat game is highly dynamic and carries a significant amount of uncertainty, fuzzy logic is used to build the functions of the four maneuver decision factors to enhance the robustness and effectiveness of maneuver decision results. Accurate prediction of the opponent aircraft is also essential for making good decisions; therefore, a prediction model of the opponent aircraft is designed based on the elementary maneuver method. Finally, the moving horizon optimization strategy is used to model the whole air combat maneuver decision process effectively. Various simulations are performed on a typical scenario test and a close-in dogfight; the results sufficiently demonstrate the superiority of the designed maneuver decision method.
To improve the accuracy and speed of cycle-accurate power estimation, this paper uses multi-dimensional coefficients to build a Bayesian inference dynamic power model. By analyzing the power distribution and internal node states, we find the deficiency of using port information alone. We then define the gate-level-number computing method and the concept of a slice, and propose using slice analysis to distill switching density as coefficients for a specific circuit stage, which participate in Bayesian inference together with the port information. Experiments show that this method can reduce the power-per-cycle estimation error by 21.9% and the root mean square error by 25.0% compared with the original model, while maintaining a 700+ speedup compared with the existing gate-level power analysis technique.
Traditional approaches to developing 3D geological models employ a mix of quantitative and qualitative scientific techniques, which do not fully quantify the uncertainty in the constructed models and fail to optimally weight geological field observations against constraints from geophysical data. Here, using the Bayesian Obsidian software package, we develop a methodology to fuse lithostratigraphic field observations with aeromagnetic and gravity data to build a 3D model of a small (13.5 km × 13.5 km) region of the Gascoyne Province, Western Australia. Our approach is validated by comparing the 3D model results to independently constrained geological maps and cross-sections produced by the Geological Survey of Western Australia. By fusing geological field data with aeromagnetic and gravity surveys, we show that 89% of the modelled region has >95% certainty of a particular geological unit for the given model and data. The boundaries between geological units are characterized by narrow regions with <95% certainty, which are typically 400-1000 m wide at the Earth's surface and 500-2000 m wide at depth. Beyond ~4 km depth, the model requires geophysical survey data with longer wavelengths (e.g., active seismic) to constrain the deeper subsurface. Although Obsidian was originally built for sedimentary basin problems, it is reasonably applicable to deformed terranes such as the Gascoyne Province. Ultimately, modifying the Bayesian engine to incorporate structural data will aid in developing more robust 3D models. Nevertheless, our results show that surface geological observations fused with geophysical survey data can yield reasonable 3D geological models with narrow uncertainty regions at the surface and shallow subsurface, which will be especially valuable for mineral exploration and the development of 3D geological models under cover.
In order to carry out genetic improvement of turbot upper thermal tolerance, it is necessary to estimate the genetic parameters of UTT (upper thermal tolerance) and growth-related traits. The objective of this study was to estimate genetic parameters for BW (body weight) and UTT in a two-generation turbot (Scophthalmus maximus L.) pedigree derived from four imported turbot stocks (England, France, Denmark and Norway). A total of 42 families, including 20 families from the G1 generation and 22 families from the G2 generation, were used to test upper thermal tolerance (40-50 animals per family), and the body weight of individuals was measured. The heritabilities of BW and UTT and the correlation between the two traits were estimated using a Bayesian method based on two types of individual animal models, with and without maternal effects. The heritabilities for BW and UTT and the phenotypic and genetic correlations between the two traits estimated from the model without maternal effects were 0.239±0.141, 0.111±0.080, 0.075±0.026 and −0.019±0.011, respectively. The corresponding values from the model with maternal effects were 0.203±0.115, 0.055±0.026, 0.047±0.034 and −0.024±0.028, respectively. The maternal effects on BW and UTT were 0.050±0.017 and 0.013±0.004, respectively, and had a certain influence on the genetic evaluation of the two traits. The findings of this paper provide the necessary background for determining the best selection strategy in the genetic improvement program.
Aiming at the problem that consumption data for new ammunition are scarce and demand is difficult to predict, and combining the law of ammunition consumption under different damage grades, a Bayesian inference method for ammunition demand based on the Gompertz distribution is proposed. A Bayesian inference model based on the Gompertz distribution is constructed, and the system contribution degree is introduced to determine the weights of the multi-source information. In the case where the prior distribution is known and the distribution of the field data is unknown, a consistency test is performed on the prior information, transforming the consistency test problem into a goodness-of-fit test problem. The Bayesian inference is then solved by the Markov chain Monte Carlo (MCMC) method, and the ammunition demand under different damage grades is obtained. An example verifies the accuracy of this method, which solves the problem of ammunition demand prediction with insufficient samples.
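The Bayesian step can be sketched in miniature with a grid approximation: given a small sample assumed to follow a Gompertz distribution with known shape b, the posterior over the scale parameter η is tabulated on a grid under a flat prior. The parametrization, parameter values, sample size, and grid below are illustrative assumptions (the paper uses MCMC and multi-source weighting, which this sketch omits).

```python
import math, random

def gompertz_logpdf(t, b, eta):
    # Gompertz with F(t) = 1 - exp(-eta * (e^{bt} - 1)).
    return math.log(b * eta) + b * t - eta * (math.exp(b * t) - 1.0)

def gompertz_sample(b, eta, rng):
    # Inverse-CDF sampling from the same parametrization.
    u = rng.random()
    return math.log(1.0 - math.log(1.0 - u) / eta) / b

def grid_posterior(data, b, etas):
    """Discrete posterior over a grid of eta values, flat prior on the grid."""
    logs = [sum(gompertz_logpdf(t, b, e) for t in data) for e in etas]
    m = max(logs)                        # stabilize before exponentiating
    w = [math.exp(l - m) for l in logs]
    z = sum(w)
    return [wi / z for wi in w]

rng = random.Random(5)
true_b, true_eta = 1.0, 0.5
data = [gompertz_sample(true_b, true_eta, rng) for _ in range(100)]
etas = [0.05 * i for i in range(1, 41)]   # grid over 0.05 .. 2.0
post = grid_posterior(data, true_b, etas)
eta_mean = sum(e * p for e, p in zip(etas, post))
```

Even with a modest sample, the posterior concentrates near the true scale parameter; the posterior over η would then drive the demand prediction for each damage grade.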
Mining the penetration testing semantic knowledge hidden in vast amounts of raw penetration testing data is of vital importance for automated penetration testing. Associative rule mining, a data mining technique, has been studied and explored for a long time; however, few studies have focused on knowledge discovery in the penetration testing area. Our experimental results reveal that the long-tail distribution of penetration testing data nullifies the effectiveness of associative rule mining algorithms based on frequent patterns. To address this problem, a Bayesian inference based penetration semantic knowledge mining algorithm is proposed. First, a directed bipartite graph model, a kind of Bayesian network, is constructed to formalize penetration testing data. Then, we adopt the maximum likelihood estimation method to optimize the model parameters and decompose the large Bayesian network into smaller networks based on the conditional independence of variables for improved solution efficiency. Finally, irrelevant variable elimination is adopted to extract penetration semantic knowledge from the conditional probability distribution of the model. The experimental results show that the proposed method can discover penetration semantic knowledge from raw penetration testing data effectively and efficiently.
Rock mechanical parameters and their uncertainties are critical to rock stability analysis, engineering design, and safe construction in rock mechanics and engineering. Back analysis is widely adopted in rock engineering to determine the mechanical parameters of the surrounding rock mass, but it does not usually account for uncertainty. This problem is addressed here by developing a system of Bayesian inferences for updating mechanical parameters and their statistical properties from monitored field data, integrating the monitored data, prior knowledge of geotechnical parameters, and a mechanical model of a rock tunnel using Markov chain Monte Carlo (MCMC) simulation. The proposed approach is illustrated on a circular tunnel with an analytical solution and then applied to an experimental tunnel in the Goupitan Hydropower Station, China. The mechanical properties and strength parameters of the surrounding rock mass were modeled as random variables. The displacement predicted with the updated parameters agreed closely with the monitored displacements, indicating that the Bayesian inferences dynamically assimilate the monitored data into the tunnel model. Further study indicated that the performance of the Bayesian inferences improves greatly when field monitoring data are supplemented regularly. Bayesian inference is thus a significant new approach for determining the mechanical parameters of the surrounding rock mass in a tunnel model and contributes to safe construction in rock engineering.
We present two approaches to system identification, i.e., the identification of partial differential equations (PDEs) from measurement data. The first is a regression-based variational system identification procedure that is advantageous in not requiring repeated forward model solves and scales well to a large number of differential operators. However, it has strict data type requirements, needing the ability to directly represent the operators through the available data. The second is a Bayesian inference framework that is highly valuable for providing uncertainty quantification and flexible in accommodating sparse and noisy data that may also be indirect quantities of interest. However, it requires repeated forward solutions of the PDE models, which is expensive and hinders scalability. We provide illustrations of results on a model problem for pattern formation dynamics and discuss the merits of the presented methods.
Due to its simplicity and flexibility, the power law process is widely used to model the failures of repairable systems. Although statistical inference on the parameters of the power law process has been well developed, most studies depend heavily on complete failure data. A few methods are reported for processing incomplete data, but they are limited to specific cases, especially those where missing data occur at the early stage of the failures; no framework is available to handle generic scenarios. To overcome this problem, the statistical inference of the power law process with incomplete data is established in this paper from the point of view of order statistics. The theoretical derivation is carried out, and case studies demonstrate and verify the proposed method. Order statistics offer an alternative for the statistical inference of the power law process with incomplete data, as they can reformulate current studies on left-censored and interval-censored failure data in a unified framework. The results show that the proposed method has greater flexibility and wider applicability.
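For the complete-data case that most existing studies assume, the failure-truncated maximum-likelihood estimates of the power law process have a closed form, which the sketch below demonstrates on simulated data. The parametrization Λ(t) = (t/θ)^β and the simulation settings are illustrative assumptions; the paper's order-statistics treatment of incomplete data is not reproduced here.

```python
import math, random

def simulate_plp(beta, theta, n, seed=2):
    """Event times of a power law process with Lambda(t) = (t/theta)^beta:
    t_i = theta * S_i^(1/beta), where S_i are unit-Poisson arrival times."""
    rng = random.Random(seed)
    s, times = 0.0, []
    for _ in range(n):
        s += rng.expovariate(1.0)
        times.append(theta * s ** (1.0 / beta))
    return times

def plp_mle_failure_truncated(times):
    # Closed-form MLEs when observation stops at the n-th failure.
    n, tn = len(times), times[-1]
    beta = n / sum(math.log(tn / t) for t in times[:-1])
    theta = tn / n ** (1.0 / beta)
    return beta, theta

times = simulate_plp(beta=2.0, theta=10.0, n=200)
b_hat, th_hat = plp_mle_failure_truncated(times)
```

With beta > 1 the process models wear-out (intensity increasing in time); left-censoring or interval-censoring of the early failure times breaks these closed forms, which is exactly the gap the order-statistics framework addresses.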
Funding: Supported by the National MCF Energy R&D Program of China (Nos. 2018YFE0301105, 2022YFE03010002 and 2018YFE0302100), the National Key R&D Program of China (Nos. 2022YFE03070004 and 2022YFE03070000), and the National Natural Science Foundation of China (Nos. 12205195, 12075155 and 11975277).
Abstract: An accurate plasma current profile has irreplaceable value for the steady-state operation of the plasma. In this study, plasma current tomography based on Bayesian inference is applied to the HL-2A device and used to reconstruct the plasma current profile. Two different Bayesian priors are tested, namely the Conditional Auto-Regressive (CAR) prior and the Advanced Squared Exponential (ASE) kernel prior. Compared to the CAR prior, the ASE kernel prior adopts nonstationary hyperparameters and introduces the current profile of the reference discharge into the hyperparameters, which makes the shape of the current profile more flexible in space. The results indicate that the ASE prior couples more information, reduces the probability of unreasonable solutions, and achieves higher reconstruction accuracy.
Funding: Supported by the Shanxi Provincial Foundation for Returned Overseas Scholars (No. 20220037), the Natural Science Foundation of Shanxi Province (No. 20210302123085), and the Discipline Construction Project of Yuncheng University.
Abstract: In this work, we perform a Bayesian inference of the crust-core transition density ρ_(t) of neutron stars based on neutron-star radius and neutron-skin thickness data using a thermodynamical method. Uniform and Gaussian distributions were adopted as priors for ρ_(t) in the Bayesian approach. When the uniform prior and neutron-star radius data were used, ρ_(t) had a larger probability of taking values higher than 0.1 fm^(−3). This was found to be controlled by the curvature K_(sym) of the nuclear symmetry energy: the phenomenon did not occur if K_(sym) was not extremely negative, namely K_(sym)>−200 MeV. The value of ρ_(t) obtained was 0.075_(−0.01)^(+0.005) fm^(−3) at a confidence level of 68% when both the neutron-star radius and neutron-skin thickness data were considered. Strong anti-correlations were observed between ρ_(t), the slope L, and the curvature of the nuclear symmetry energy. The dependence on crust-core density and pressure of the three L-K_(sym) correlations predicted in the literature was quantitatively investigated. The most probable value of 0.08 fm^(−3) for ρ_(t) was obtained from the L-K_(sym) relationship proposed by Holt et al., while larger values were preferred for the other two relationships.
Funding: Supported by the Young Potential Program of Shanghai Institute of Applied Physics, Chinese Academy of Sciences (No. E0553101).
Abstract: Knowledge graph technology has distinct advantages in fault diagnosis. In this study, the control rod drive mechanism (CRDM) of the liquid fuel thorium molten salt reactor (TMSR-LF1) was taken as the research object, and a fault diagnosis system based on a knowledge graph was proposed. Subject–relation–object triples were defined based on CRDM unstructured data, including the design specification, operation and maintenance manual, alarm list, and other forms of expert experience. We constructed a fault event ontology model to label the entities and relationships involved in the corpus of CRDM fault events. A three-layer robustly optimized bidirectional encoder representation from transformers (RBT3) pre-training approach combined with a text convolutional neural network (TextCNN) was introduced to facilitate the application of the constructed CRDM fault diagnosis graph database for fault queries. The RBT3-TextCNN model, along with the Jieba tool, is proposed for extracting entities and recognizing the fault query intent simultaneously. Experiments on the dataset collected from TMSR-LF1 CRDM fault diagnosis unstructured data demonstrate that this model has the potential to improve intent recognition and entity extraction. Additionally, a fault alarm monitoring module was developed based on the WebSocket protocol to automatically deliver detailed information about a detected fault to the operator. Furthermore, the Bayesian inference method combined with the variable elimination algorithm was proposed to enable the development of a relatively intelligent and reliable fault diagnosis system. Finally, a CRDM fault diagnosis Web interface integrated with graph data visualization was constructed, making the CRDM fault diagnosis process intuitive and effective.
Funding: Partially sponsored by the Natural Science Foundation of Shanghai (No. 23ZR1429300) and the Innovation Fund of CNNC (Lingchuang Fund).
Abstract: The estimation of model parameters is an important subject in engineering. In this area of work, the prevailing approach is to estimate or calculate these as deterministic parameters. In this study, we consider the model parameters from the perspective of random variables and describe the general form of the parameter distribution inference problem. Under this framework, we propose an ensemble Bayesian method by introducing Bayesian inference and the Markov chain Monte Carlo (MCMC) method. Experiments on a finite cylindrical reactor and a 2D IAEA benchmark problem show that the proposed method converges quickly and can estimate parameters effectively, even for several correlated parameters simultaneously. Our experiments include cases of engineering software calls, demonstrating that the method can be applied to engineering fields such as nuclear reactor engineering.
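The core machinery this abstract relies on, sampling a parameter posterior with MCMC, can be sketched with a random-walk Metropolis sampler on a toy problem. The flat prior, Gaussian data model, and all names here are illustrative assumptions, not the paper's reactor models.

```python
import numpy as np

def metropolis(log_post, theta0, n_steps=20_000, step=0.5, seed=0):
    """Random-walk Metropolis: draw a chain from an unnormalized log-posterior."""
    rng = np.random.default_rng(seed)
    theta, lp = theta0, log_post(theta0)
    chain = []
    for _ in range(n_steps):
        prop = theta + step * rng.normal()
        lp_prop = log_post(prop)
        # accept with probability min(1, post(prop)/post(theta))
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain.append(theta)
    return np.array(chain)

# toy inference of a single "model parameter": data ~ N(theta, 1), flat prior
rng = np.random.default_rng(1)
data = rng.normal(2.0, 1.0, size=200)
log_post = lambda t: -0.5 * np.sum((data - t) ** 2)
chain = metropolis(log_post, theta0=0.0)
post_mean = chain[5_000:].mean()   # discard burn-in
assert abs(post_mean - data.mean()) < 0.1
```

With a flat prior the posterior mean should match the sample mean, which the final check confirms; correlated multi-parameter problems, as in the paper, use the same scheme with a vector-valued proposal.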
Funding: Supported by the National Natural Science Foundation of China (11922514).
Abstract: In this proceeding, some highlight results on the constraints of the nuclear matter equation of state (EOS) from data on nucleus resonances and neutron-skin thickness, obtained using a Bayesian approach based on the Skyrme-Hartree-Fock model and its extension, are presented. Typically, the anti-correlation and positive correlation between the slope parameter and the value of the symmetry energy at the saturation density, under the constraints of the neutron-skin thickness and the isovector giant dipole resonance respectively, are discussed. It is shown that the Bayesian analysis can help to find a compromise for the “PREX-II puzzle” and the “soft Tin puzzle”. The possible modifications of the constraints on lower-order EOS parameters, as well as the relevant correlations when higher-order EOS parameters are incorporated as independent variables, are further illustrated. For a given model and parameter space, the Bayesian approach serves as a good analysis tool suitable for multi-messenger versus multi-variable problems, and is helpful for quantitatively constraining the model parameters as well as their correlations.
Abstract: Decision-theoretic interval estimation requires the use of loss functions that, typically, take into account the size and the coverage of the sets. We here consider the class of monotone loss functions that, under quite general conditions, guarantee Bayesian optimality of highest posterior probability sets. We focus on three specific families of monotone losses, namely the linear, the exponential and the rational losses, which differ in how the sizes of the sets are penalized. Within the standard yet important set-up of a normal model we propose: 1) an optimality analysis, to compare the solutions yielded by the alternative classes of losses; and 2) a regret analysis, to evaluate the additional loss of standard non-optimal intervals of fixed credibility. The article uses an application to a clinical trial as an illustrative example.
Funding: National Natural Science Foundation of China (32000306); Project of Innovation Team of Survey and Assessment of the Pearl River Fishery Resources (2023TD-10); Natural Science Foundation of Shaanxi Province (2023-JC-YB-325).
Abstract: The genus Silurus, an important group of catfish, exhibits a heterogeneous distribution in Eurasian freshwater systems. This group includes economically important and endangered species, thereby attracting considerable scientific interest. Despite this interest, the lack of a comprehensive phylogenetic framework impedes our understanding of the mechanisms underlying the extensive diversity found within this genus. Herein, we analyzed 89 newly sequenced and 20 previously published mitochondrial genomes (mitogenomes) from 13 morphological species to reconstruct the phylogenetic relationships, biogeographic history, and species diversity of Silurus. Our phylogenetic reconstructions identified eight clades, supported by both maximum-likelihood and Bayesian inference. Sequence-based species delimitation analyses yielded multiple molecular operational taxonomic units (MOTUs) in several taxa, including the Silurus asotus complex (four MOTUs) and Silurus microdorsalis (two MOTUs), suggesting that species diversity is underestimated in the genus. A reconstructed time-calibrated tree of Silurus species provided an age estimate for the most recent common ancestor of approximately 37.61 million years ago (Ma), with divergences among clades within the genus occurring between 11.56 Ma and 29.44 Ma, and divergences among MOTUs within species occurring between 3.71 Ma and 11.56 Ma. Biogeographic reconstructions suggested that the ancestral area for the genus likely encompassed China and the Korean Peninsula, with multiple inferred dispersal events to Europe and Central and Western Asia between 21.78 Ma and 26.67 Ma and to Japan between 2.51 Ma and 18.42 Ma. Key factors such as the Eocene-Oligocene extinction event, the onset and intensification of the monsoon system, and glacial cycles associated with sea-level fluctuations have likely played significant roles in shaping the evolutionary history of the genus Silurus.
Abstract: A novel extended Lindley lifetime model that exhibits unimodal or decreasing density shapes as well as increasing, bathtub or unimodal-then-bathtub failure rates, named the Marshall-Olkin-Lindley (MOL) model, is studied. In this research, various inferences on the MOL model parameters are introduced using a progressively Type-II censored sample. Utilizing the maximum likelihood method as a classical approach, the estimators of the model parameters and various reliability measures are investigated. Under both symmetric and asymmetric loss functions, the Bayesian estimates are obtained using the Markov chain Monte Carlo (MCMC) technique with the assumption of independent gamma priors. From the Fisher information data and the simulated Markovian chains, the approximate asymptotic interval and the highest posterior density interval, respectively, of each unknown parameter are calculated. Via an extensive simulation study, the usefulness of the various suggested strategies is assessed with respect to evaluation metrics such as mean squared errors, mean relative absolute biases, average confidence lengths, and coverage percentages. Comparing the Bayesian estimations based on the asymmetric loss function to the traditional technique or the symmetric loss function-based Bayesian estimations, the analysis demonstrates that asymmetric loss function-based Bayesian estimations are preferred. Finally, two data sets, representing vinyl chloride and repairable mechanical equipment items, have been investigated to support the proposed approaches and show the superiority of the proposed model compared to fourteen other lifetime models.
Funding: Cultivation Fund of the Key Scientific and Technical Innovation Project of the Ministry of Education of China (No. 3104001014).
Abstract: A nonparametric Bayesian method is presented to classify MPSK (M-ary phase shift keying) signals. The MPSK signals with unknown signal-to-noise ratios (SNRs) are modeled as a Gaussian mixture model with unknown means and covariances in the constellation plane, and a clustering method is proposed to estimate the probability density of the MPSK signals. The method is based on nonparametric Bayesian inference, which introduces the Dirichlet process as the prior probability of the mixture coefficients and applies a normal-inverse-Wishart (NIW) distribution as the prior probability of the unknown mean and covariance. Then, according to the received signals, the parameters are adjusted by a Markov chain Monte Carlo (MCMC) random sampling algorithm. Through iteration, the density of the MPSK signals can be estimated. Simulation results show that the correct recognition ratio of 2/4/8PSK is greater than 95% under the condition that SNR > 5 dB and 1600 symbols are used in this method.
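The Dirichlet-process Gaussian mixture idea in this abstract can be sketched with scikit-learn's `BayesianGaussianMixture`, which fits a truncated DP mixture by variational inference rather than the MCMC sampler the paper describes. The synthetic QPSK constellation, noise level, and weight threshold below are illustrative assumptions.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# synthetic QPSK: 4 constellation points in the I/Q plane plus Gaussian noise
centers = np.array([[1, 1], [1, -1], [-1, 1], [-1, -1]]) / np.sqrt(2)
symbols = centers[rng.integers(0, 4, size=2000)]
received = symbols + 0.1 * rng.normal(size=symbols.shape)

# truncated Dirichlet-process mixture: the number of effective clusters
# (hence the modulation order M) is inferred from the data
dpgmm = BayesianGaussianMixture(
    n_components=10,  # truncation level, deliberately larger than M
    weight_concentration_prior_type="dirichlet_process",
    covariance_type="full",
    max_iter=500,
    random_state=0,
).fit(received)

# components the DP prior left with non-negligible mass (typically 4 here)
active = int(np.sum(dpgmm.weights_ > 0.05))
assert active < 10   # surplus components are pruned to near-zero weight
```

Counting the surviving components recovers the modulation order, which is the essence of the classification step; the paper's NIW-prior Gibbs sampler plays the same role with full posterior samples.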
Abstract: In the present work, we are interested in studying the joint distributions of pairs of the monthly maxima of the pollutants used by the environmental authorities in Mexico City to classify the air quality in the metropolitan area. In order to obtain the joint distributions, a copula is considered. Since we are analyzing monthly maxima, the extreme value distributions of Weibull and Fréchet are taken into account. Using these two distributions as marginal distributions in the copula, a Bayesian inference was made in order to estimate the parameters of both distributions as well as the association parameters appearing in the copula model. The pollutants taken into account are ozone, nitrogen dioxide, sulphur dioxide, carbon monoxide, and particulate matter with diameters smaller than 10 and 2.5 microns, obtained from the Mexico City monitoring network. The estimation was performed by taking samples of the parameters generated through a Markov chain Monte Carlo algorithm implemented using the software OpenBUGS. Once implemented, the algorithm is applied to pairs of pollutants where one coordinate of the pair is ozone and the other varies over the set of the remaining pollutants. Depending on the pollutant and the region where the measurements were collected, different results were obtained. Hence, in some cases the best model is that where a Fréchet distribution is the marginal distribution for the measurements of both pollutants, and in others the most suitable model assumes a Fréchet for ozone and a Weibull for the other pollutant. Results show that, in the present case, the estimated association parameter is a good representation of the correlation between the pairs of pollutants analyzed. Additionally, it is a straightforward task to obtain these correlation parameters from the corresponding association parameters.
Funding: Supported by the National Natural Science Foundation of China (61601505), the Aeronautical Science Foundation of China (20155196022), and the Shaanxi Natural Science Foundation of China (2016JQ6050).
Abstract: To reach a higher level of autonomy for the unmanned combat aerial vehicle (UCAV) in air combat games, this paper builds an autonomous maneuver decision system. In this system, the air combat game is regarded as a Markov process, so that the air combat situation can be effectively calculated via Bayesian inference theory. According to the situation assessment result, the system adaptively adjusts the weights of the maneuver decision factors, which makes the objective function more reasonable and helps the UCAV maintain a superior situation. As the air combat game is characterized by high dynamics and a significant amount of uncertainty, fuzzy logic is used to build the functions of the four maneuver decision factors to enhance the robustness and effectiveness of the maneuver decision results. Accurate prediction of the opponent aircraft is also essential to making a good decision; therefore, a prediction model of the opponent aircraft is designed based on the elementary maneuver method. Finally, a moving-horizon optimization strategy is used to effectively model the whole air combat maneuver decision process. Various simulations are performed on a typical scenario test and a close-in dogfight; the results sufficiently demonstrate the superiority of the designed maneuver decision method.
Abstract: To improve the accuracy and speed of cycle-accurate power estimation, this paper uses multi-dimensional coefficients to build a Bayesian inference dynamic power model. By analyzing the power distribution and internal node states, we find the deficiency of using only port information. We then define the gate level number computing method and the concept of a slice, and propose using slice analysis to distill switching density as coefficients for a specific circuit stage, which participate in Bayesian inference together with port information. Experiments show that this method can reduce the power-per-cycle estimation error by 21.9% and the root mean square error by 25.0% compared with the original model, while maintaining a 700+ speedup compared with the existing gate-level power analysis technique.
Funding: Funded by the Science and Industry Endowment Fund as part of The Distal Footprints of Giant Ore Systems: UNCOVER Australia Project (RP04-063), Capricorn Distal Footprints.
Abstract: Traditional approaches to developing 3D geological models employ a mix of quantitative and qualitative scientific techniques, which do not fully quantify uncertainty in the constructed models and fail to optimally weight geological field observations against constraints from geophysical data. Here, using the Bayesian Obsidian software package, we develop a methodology to fuse lithostratigraphic field observations with aeromagnetic and gravity data to build a 3D model of a small (13.5 km × 13.5 km) region of the Gascoyne Province, Western Australia. Our approach is validated by comparing the 3D model results to independently constrained geological maps and cross-sections produced by the Geological Survey of Western Australia. By fusing geological field data with aeromagnetic and gravity surveys, we show that 89% of the modelled region has >95% certainty for a particular geological unit for the given model and data. The boundaries between geological units are characterized by narrow regions with <95% certainty, which are typically 400-1000 m wide at the Earth's surface and 500-2000 m wide at depth. Beyond ~4 km depth, the model requires geophysical survey data with longer wavelengths (e.g., active seismic) to constrain the deeper subsurface. Although Obsidian was originally built for sedimentary basin problems, it is reasonably applicable to deformed terranes such as the Gascoyne Province. Ultimately, modification of the Bayesian engine to incorporate structural data will aid in developing more robust 3D models. Nevertheless, our results show that surface geological observations fused with geophysical survey data can yield reasonable 3D geological models with narrow uncertainty regions at the surface and shallow subsurface, which will be especially valuable for mineral exploration and the development of 3D geological models under cover.
Funding: The Earmarked Fund for Modern Agro-Industry Technology Research System under contract No. CARS-47-G01; the Ao Shan Talents Cultivation Program supported by Qingdao National Laboratory for Marine Science and Technology under contract No. 2017ASTCP-OS04; the Key Research and Development Plan of Shandong under contract No. 2016GSF115019; the Agricultural Fine Breed Project of Shandong under contract No. 2016LZGC031; the Chinese Academy of Fishery Sciences Basal Research Fund under contract No. 2016HY-JC0301; the Special Financial Grant from the China Postdoctoral Science Foundation under contract No. 2016T90661.
Abstract: In order to carry out genetic improvement of turbot upper thermal tolerance, it is necessary to estimate the genetic parameters of UTT (upper thermal tolerance) and growth-related traits. The objective of this study was to estimate genetic parameters for BW (body weight) and UTT in a two-generation turbot (Scophthalmus maximus L.) pedigree derived from four imported turbot stocks (England, France, Denmark and Norway). A total of 42 families, including 20 families from the G1 generation and 22 families from the G2 generation, were used to test upper thermal tolerance (40–50 animals per family), and the body weight of each individual was measured. The heritabilities of BW and UTT and the correlations between the two traits were estimated with a Bayesian method based on two individual animal models, with and without maternal effects. The results showed that the heritabilities for BW and UTT and the phenotypic and genetic correlations between the two traits estimated from the model without maternal effects were 0.239±0.141, 0.111±0.080, 0.075±0.026 and –0.019±0.011, respectively. The corresponding values from the model with maternal effects were 0.203±0.115, 0.055±0.026, 0.047±0.034 and –0.024±0.028, respectively. The maternal effects on BW and UTT were 0.050±0.017 and 0.013±0.004, respectively, and had a certain influence on the genetic evaluation of the two traits. The findings of this paper provide the necessary background to determine the best selection strategy to be adopted in the genetic improvement program.
Funding: Army Scientific Research (KYSZJWJK1744, 012016012600B11403).
Abstract: Aiming at the problem that consumption data for new ammunition are scarce and demand is difficult to predict, and drawing on the law of ammunition consumption under different damage grades, a Bayesian inference method for ammunition demand based on the Gompertz distribution is proposed. A Bayesian inference model based on the Gompertz distribution is constructed, and the system contribution degree is introduced to determine the weights of the multi-source information. In the case where the prior distribution is known and the distribution of the field data is unknown, a consistency test is performed on the prior information, and the consistency test problem is transformed into a goodness-of-fit test problem. The Bayesian inference is then solved by the Markov chain Monte Carlo (MCMC) method, and the ammunition demand under different damage grades is obtained. An example verifies the accuracy of this method and solves the problem of ammunition demand prediction in the case of insufficient samples.
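As a small sanity check on the distributional assumption in this abstract, the sketch below simulates Gompertz-distributed data and recovers its parameters with SciPy's built-in Gompertz model via maximum likelihood, not the paper's MCMC pipeline. The parameter values and tolerances are illustrative assumptions.

```python
import numpy as np
from scipy.stats import gompertz

rng = np.random.default_rng(0)
c_true, scale_true = 1.5, 2.0   # hypothetical shape and scale, not from the paper
demand = gompertz.rvs(c_true, scale=scale_true, size=5000, random_state=rng)

# maximum-likelihood fit with the location parameter pinned at zero
c_hat, loc_hat, scale_hat = gompertz.fit(demand, floc=0)
assert loc_hat == 0.0
```

In the paper's setting the same likelihood would instead be combined with weighted multi-source priors and sampled by MCMC to yield a posterior over demand.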
Funding: The National Natural Science Foundation of China (No. 61502528).
Abstract: Mining penetration testing semantic knowledge hidden in vast amounts of raw penetration testing data is of vital importance for automated penetration testing. Association rule mining, a data mining technique, has been studied and explored for a long time. However, few studies have focused on knowledge discovery in the penetration testing area. Our experimental results reveal that the long-tail distribution of penetration testing data nullifies the effectiveness of association rule mining algorithms based on frequent patterns. To address this problem, a Bayesian inference based penetration semantic knowledge mining algorithm is proposed. First, a directed bipartite graph model, a kind of Bayesian network, is constructed to formalize penetration testing data. Then, we adopt the maximum likelihood estimation method to optimize the model parameters and decompose the large Bayesian network into smaller networks based on conditional independence of variables for improved solution efficiency. Finally, irrelevant variable elimination is adopted to extract penetration semantic knowledge from the conditional probability distribution of the model. The experimental results show that the proposed method can discover penetration semantic knowledge from raw penetration testing data effectively and efficiently.
Funding: Support from the Open Research Fund of the State Key Laboratory of Geomechanics and Geotechnical Engineering, Institute of Rock and Soil Mechanics, Chinese Academy of Sciences (Grant No. Z020006), and the National Natural Science Foundation of China (Grant Nos. U1765206 and 51874119).
Abstract: Rock mechanical parameters and their uncertainties are critical to rock stability analysis, engineering design, and safe construction in rock mechanics and engineering. Back analysis is widely adopted in rock engineering to determine the mechanical parameters of the surrounding rock mass, but it does not consider uncertainty. This problem is addressed here by the proposed approach, which develops a system of Bayesian inferences for updating mechanical parameters and their statistical properties using monitored field data, integrating the monitored data, prior knowledge of geotechnical parameters, and a mechanical model of a rock tunnel via Markov chain Monte Carlo (MCMC) simulation. The proposed approach is illustrated on a circular tunnel with an analytical solution and then applied to an experimental tunnel in the Goupitan Hydropower Station, China. The mechanical properties and strength parameters of the surrounding rock mass were modeled as random variables. The displacement predicted with the aid of the parameters updated by Bayesian inference agreed closely with the monitored displacements, indicating that the Bayesian inferences incorporated the monitored data into the tunnel model to update its parameters dynamically. Further study indicated that the performance of the Bayesian inferences is improved greatly by regularly supplementing field monitoring data. Bayesian inference is thus a significant new approach for determining the mechanical parameters of the surrounding rock mass in a tunnel model and contributes to safe construction in rock engineering.
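The "regularly supplementing field monitoring data" effect described in this abstract can be shown with the simplest possible Bayesian update, a conjugate normal model, in place of the paper's MCMC over a tunnel mechanics model. The parameter, prior, and noise level below are illustrative assumptions.

```python
import numpy as np

def update_normal(mu0, var0, data, noise_var):
    """Conjugate update of a normal prior N(mu0, var0) on a scalar rock-mass
    parameter, given noisy measurements with known noise variance."""
    n = len(data)
    post_var = 1.0 / (1.0 / var0 + n / noise_var)
    post_mu = post_var * (mu0 / var0 + np.sum(data) / noise_var)
    return post_mu, post_var

rng = np.random.default_rng(0)
true_value = 12.0            # hypothetical "true" parameter (e.g. a modulus in GPa)
mu, var = 10.0, 4.0          # vague prior knowledge
prior_var = var
# feed monitoring data in three successive batches, updating each time
for batch in np.split(rng.normal(true_value, 1.0, size=90), 3):
    mu, var = update_normal(mu, var, batch, noise_var=1.0)

assert var < prior_var       # uncertainty shrinks as monitoring data accumulate
```

Sequential batches give the same posterior as one big update, which is why regular supplements of field data steadily tighten the parameter estimates; non-conjugate tunnel models need MCMC for the same step.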
Funding: We acknowledge the support of the Defense Advanced Research Projects Agency (Grant HR00111990S2) and the Toyota Research Institute (Award #849910).
Abstract: We present two approaches to system identification, i.e. the identification of partial differential equations (PDEs) from measurement data. The first is a regression-based variational system identification procedure that is advantageous in not requiring repeated forward model solves and scales well to large numbers of differential operators. However, it has strict data type requirements, needing the ability to directly represent the operators through the available data. The second is a Bayesian inference framework, highly valuable for providing uncertainty quantification and flexible enough to accommodate sparse and noisy data that may also be indirect quantities of interest. However, it requires repeated forward solutions of the PDE models, which is expensive and hinders scalability. We provide illustrations of results on a model problem for pattern formation dynamics and discuss the merits of the presented methods.
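The regression-based identification idea in this abstract, fitting observed time derivatives against a library of candidate operators, can be sketched on an ODE toy problem. The logistic trajectory and the cubic library below are illustrative assumptions, not the paper's variational formulation or its pattern-formation PDE.

```python
import numpy as np

# recover du/dt = r*u - r*u**2 (logistic growth) by least-squares regression
r = 2.0
t = np.linspace(0, 3, 3001)
u = 0.1 * np.exp(r * t) / (1 + 0.1 * (np.exp(r * t) - 1))  # exact logistic solution
dudt = np.gradient(u, t)                                    # "measured" derivative

# candidate operator library: [u, u^2, u^3]
library = np.column_stack([u, u**2, u**3])
coef, *_ = np.linalg.lstsq(library, dudt, rcond=None)

assert abs(coef[0] - r) < 0.05   # u term recovered near +r
assert abs(coef[1] + r) < 0.05   # u^2 term recovered near -r
assert abs(coef[2]) < 0.05       # spurious cubic term near zero
```

For PDEs the library would hold spatial operators (gradients, Laplacians) evaluated from field data, which is exactly where the "directly represent the operators through the available data" requirement bites.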
Funding: Supported by the National Natural Science Foundation of China (51775090).
Abstract: Due to the simplicity and flexibility of the power law process, it is widely used to model the failures of repairable systems. Although statistical inference on the parameters of the power law process has been well developed, numerous studies largely depend on complete failure data. A few methods have been reported to process incomplete data, but they are limited to specific cases, especially the case where missing data occur at the early stage of the failures; no framework is available to handle generic scenarios. To overcome this problem, the statistical inference of the power law process with incomplete data is established in this paper from the point of view of order statistics. The theoretical derivation is carried out, and case studies demonstrate and verify the proposed method. Order statistics offer an alternative to the statistical inference of the power law process with incomplete data, as they can reformulate current studies on left-censored failure data and interval-censored data in a unified framework. The results show that the proposed method has greater flexibility and wider applicability.
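As background to this abstract, the classical complete-data inference it builds on can be sketched directly: for a power law (Crow-AMSAA) process with intensity λ(t) = αβt^(β−1) observed on a time-truncated window, the maximum likelihood estimates have the closed forms β̂ = n / Σ ln(T/tᵢ) and α̂ = n / T^β̂. The simulation parameters below are illustrative assumptions, not the paper's case-study data.

```python
import numpy as np

def plp_mle(failure_times, t_end):
    """Time-truncated power law process MLE.
    Intensity lambda(t) = alpha * beta * t**(beta - 1); E[N(t)] = alpha * t**beta."""
    t = np.asarray(failure_times, dtype=float)
    n = len(t)
    beta_hat = n / np.sum(np.log(t_end / t))
    alpha_hat = n / t_end ** beta_hat
    return alpha_hat, beta_hat

# simulate a power law process by time-transforming a unit-rate Poisson process:
# if s_i are unit-rate arrivals, then t_i = (s_i / alpha)**(1/beta) has
# cumulative intensity alpha * t**beta
rng = np.random.default_rng(0)
alpha, beta, t_end = 0.5, 1.5, 100.0
s = np.cumsum(rng.exponential(size=2000))
times = (s / alpha) ** (1.0 / beta)
times = times[times <= t_end]

a_hat, b_hat = plp_mle(times, t_end)
assert abs(b_hat - beta) / beta < 0.1   # shape recovered from complete data
```

The paper's contribution is to recover this kind of inference when some of the early failure times in `times` are censored or only interval-observed, via the joint distribution of the order statistics.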