In the investigation of disease dynamics, the effect of covariates on the hazard function is a major topic. Some recent smoothed estimation methods have been proposed, both frequentist and Bayesian, based on the relationship between penalized splines and mixed model theory. These approaches are also motivated by the possibility of using automatic procedures for determining the optimal amount of smoothing. However, estimation algorithms involve an analytically intractable hazard function, and thus require ad hoc software routines. We propose a more user-friendly alternative, consisting of regularized estimation of piecewise exponential (PE) models by Bayesian P-splines. A further advantage is that widespread Bayesian software, such as WinBUGS, can be used. The aim is to assess the robustness of this approach with respect to different prior functions and penalties. A large dataset from breast cancer patients, for which results from validated clinical studies are available, is used as a benchmark to evaluate the reliability of the estimates. A second dataset, from a small case series of sarcoma patients, is used to evaluate the performance of the PE model as a tool for exploratory analysis. For the breast cancer data, the estimates are robust with respect to priors and penalties, and consistent with clinical knowledge. For the soft tissue sarcoma data, the estimates of the hazard function are sensitive to the prior for the smoothing parameter, whereas the estimates of the regression coefficients are robust. In conclusion, Gibbs sampling proves to be an efficient computational strategy. The sensitivity to the priors concerns only the estimates of the hazard function, and seems more likely to occur when small case series are investigated, calling for tailored solutions.
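As a minimal illustration of the PE approach described above (a sketch only, not the authors' WinBUGS implementation): the log-likelihood of a piecewise exponential model has a simple closed form, and a P-spline-type penalty corresponds to a second-order random-walk prior on the log-hazard levels. All function and variable names below are hypothetical.

```python
import math

def pe_log_likelihood(times, events, cuts, log_hazards):
    """Log-likelihood of a piecewise exponential (PE) model.

    times/events: observed time and event indicator (1=event, 0=censored) per subject.
    cuts: interval boundaries [0, t1, ..., tK]; log_hazards: one log-hazard per interval.
    Assumes each time falls within [cuts[0], cuts[-1]).
    """
    ll = 0.0
    for t, d in zip(times, events):
        cum = 0.0
        for k in range(len(log_hazards)):
            lo, hi = cuts[k], cuts[k + 1]
            exposure = max(0.0, min(t, hi) - lo)   # time at risk spent in interval k
            cum += exposure * math.exp(log_hazards[k])
            if d and lo <= t < hi:                  # event occurred in interval k
                ll += log_hazards[k]
        ll -= cum                                   # minus the cumulative hazard
    return ll

def rw2_penalty(log_hazards, tau):
    """Second-order random-walk penalty on adjacent log-hazards: the discrete
    analogue of a P-spline smoothness prior, with smoothing parameter tau."""
    diffs = [log_hazards[k] - 2 * log_hazards[k - 1] + log_hazards[k - 2]
             for k in range(2, len(log_hazards))]
    return -0.5 * tau * sum(d * d for d in diffs)
```

In a Gibbs sampler, the (unnormalized) log-posterior of the log-hazards is the sum of these two terms; the penalty vanishes for any linear trend, which is what makes the prior a smoother rather than a shrinkage-to-zero prior.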
Magnesium (Mg), being the lightest structural metal, holds immense potential for widespread applications in various fields. The development of high-performance and cost-effective Mg alloys is crucial to further advancing their commercial utilization. With the rapid advancement of machine learning (ML) technology in recent years, the "data-driven" approach for alloy design has provided new perspectives and opportunities for enhancing the performance of Mg alloys. This paper introduces a novel regression-based Bayesian optimization active learning model (RBOALM) for the development of high-performance Mg-Mn-based wrought alloys. RBOALM employs active learning to automatically explore optimal alloy compositions and process parameters within predefined ranges, facilitating the discovery of superior alloy combinations. The model further integrates pre-established regression models as surrogate functions in Bayesian optimization, significantly enhancing the precision of the design process. Leveraging RBOALM, several new high-performance alloys have been successfully designed and prepared. Notably, after mechanical property testing of the designed alloys, the Mg-2.1Zn-2.0Mn-0.5Sn-0.1Ca alloy demonstrates exceptional mechanical properties, including an ultimate tensile strength of 406 MPa, a yield strength of 287 MPa, and a 23% fracture elongation. Furthermore, the Mg-2.7Mn-0.5Al-0.1Ca alloy exhibits an ultimate tensile strength of 211 MPa, coupled with a remarkable 41% fracture elongation.
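A stripped-down sketch of the kind of active-learning loop RBOALM automates, under heavy assumptions: the objective below is an invented stand-in for a mechanical-property measurement (not the paper's data or model), and the surrogate is a nearest-neighbour lookup rather than the paper's pre-established regression models.

```python
import random

def objective(x):
    # Hypothetical stand-in for a measured property over two composition variables.
    zn, mn = x
    return 400 - 40 * (zn - 2.1) ** 2 - 30 * (mn - 2.0) ** 2

def surrogate_predict(history, x):
    # Toy surrogate: predicted value = value of the nearest evaluated composition.
    nearest = min(history,
                  key=lambda h: (h[0][0] - x[0]) ** 2 + (h[0][1] - x[1]) ** 2)
    return nearest[1]

def propose(history, rng, n_cand=50, explore=0.2):
    # Acquisition: mostly exploit the surrogate, occasionally explore at random.
    cands = [(rng.uniform(0, 4), rng.uniform(0, 4)) for _ in range(n_cand)]
    if rng.random() < explore:
        return cands[0]
    return max(cands, key=lambda c: surrogate_predict(history, c))

def optimize(rounds=40, seed=1):
    rng = random.Random(seed)
    # Initial design: a few random compositions within the predefined range.
    history = []
    for _ in range(3):
        x = (rng.uniform(0, 4), rng.uniform(0, 4))
        history.append((x, objective(x)))
    # Active-learning loop: propose, "measure", update the history.
    for _ in range(rounds):
        x = propose(history, rng)
        history.append((x, objective(x)))
    return max(history, key=lambda h: h[1])
```

In the real workflow, the expensive `objective` call corresponds to preparing and testing an alloy, which is exactly why a surrogate-guided proposal step pays off.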
Rock mass quality serves as a vital index for predicting the stability and safety status of rock tunnel faces. In tunneling practice, rock mass quality is often assessed via a combination of qualitative and quantitative parameters. However, due to harsh on-site construction conditions, it is rather difficult to obtain some of the evaluation parameters that are essential for rock mass quality prediction. In this study, a novel improved Swin Transformer is proposed to detect, segment, and quantify rock mass characteristic parameters such as water leakage, fractures, and weak interlayers. The site experiment results demonstrate that the improved Swin Transformer achieves optimal segmentation results, with accuracies of 92%, 81%, and 86% for water leakage, fractures, and weak interlayers, respectively. A multisource rock tunnel face characteristic (RTFC) dataset including 11 parameters for predicting rock mass quality is established. Considering that incomplete evaluation parameters in this dataset limit predictive performance, a novel tree-augmented naive Bayesian network (BN) is proposed to address the challenge of the incomplete dataset; it achieved a prediction accuracy of 88%. In comparison with other commonly used machine learning models, the proposed BN-based approach showed improved performance in predicting rock mass quality with the incomplete dataset. Using the established BN, a further sensitivity analysis is conducted to quantitatively evaluate the importance of the various parameters; the results indicate that the rock strength and fracture parameters exert the most significant influence on rock mass quality.
According to the most recent Pteridophyte Phylogeny Group (PPG) classification, eupolypods, or eupolypod ferns, are the most differentiated and diversified of all major lineages of ferns, accounting for more than half of extant fern diversity. However, the evolutionary history of eupolypods remains incompletely understood, and conflicting ideas and scenarios exist in the literature about many aspects of this history. Due to a scarce fossil record, the diversification time of eupolypods has mainly been inferred from molecular dating approaches. Currently, there are two competing molecular dating results: the diversification of eupolypods occurred either in the Late Cretaceous or as early as the Jurassic. This study uses the Bayesian tip-dating approach for the first time to infer the diversification time of eupolypods. Our analyses support a Jurassic diversification for eupolypods. The age estimates for the diversification of the whole clade and of one of its two subclades (eupolypods II) both fall in the Jurassic, which adds to the growing body of data supporting a much earlier diversification of Polypodiales in the Mesozoic than previously suspected.
Recently, the application of Bayesian updating to predict excavation-induced deformation has proven successful and improved prediction accuracy significantly. However, updating the ground settlement profile, which is crucial for determining potential damage to nearby infrastructure, has received limited attention. To address this, this paper proposes a physics-guided simplified model combined with a Bayesian updating framework to accurately predict the ground settlement profile. The advantage of this model is that it eliminates the need for complex finite element modeling and makes the updating framework user-friendly. Furthermore, the model is physically interpretable, which can provide valuable references for construction adjustments. The effectiveness of the proposed method is demonstrated through two field case studies, showing that it can yield satisfactory predictions for the settlement profile.
Statistical biases may be introduced by imprecisely quantifying background radiation reference levels. It is, therefore, imperative to devise a simple, adaptable approach for precisely describing the reference background levels of naturally occurring radionuclides (NOR) in mining sites. As an alternative statistical method, we propose using Bayesian modeling to examine the spatial distribution of NOR. For naturally occurring gamma-emitting radionuclides such as 232Th, 40K, and 238U, statistical parameters are inferred using the Markov Chain Monte Carlo (MCMC) method. After obtaining a representative subsample via bootstrapping, we exclude any possible outliers that fall outside the Highest Density Interval (HDI). We then use MCMC to build a Bayesian model with the resampled data and predict the posterior distribution of the gamma-emitting radionuclides. This method offers a robust and dependable way to describe NOR reference background values, which is important for managing and evaluating radiation risks in mining contexts.
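The screening step described above (bootstrap the data, then discard values outside the HDI) can be sketched as follows. The helper names are hypothetical, and the interval is computed empirically from samples rather than from a fitted posterior.

```python
import random
import statistics

def hdi(samples, mass=0.95):
    """Narrowest interval containing `mass` of the samples (empirical HDI)."""
    s = sorted(samples)
    n = len(s)
    k = max(1, int(mass * n))                      # points the interval must cover
    widths = [(s[i + k - 1] - s[i], i) for i in range(n - k + 1)]
    _, i = min(widths)                             # narrowest window wins
    return s[i], s[i + k - 1]

def bootstrap_means(data, n_boot=2000, seed=0):
    """Bootstrap distribution of the mean: resample with replacement n_boot times."""
    rng = random.Random(seed)
    n = len(data)
    return [statistics.fmean(rng.choices(data, k=n)) for _ in range(n_boot)]

def screen_outliers(data, mass=0.95):
    """Keep only observations inside the empirical HDI of the data."""
    lo, hi = hdi(data, mass)
    return [x for x in data if lo <= x <= hi]
```

A full analysis would then fit the Bayesian model by MCMC on the screened data; the sketch stops at the screening stage, which is the part with a simple closed form.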
We apply stochastic seismic inversion and Bayesian facies classification for porosity modeling and igneous rock identification in the presalt interval of the Santos Basin. This integration of seismic and well-derived information enhances reservoir characterization. Stochastic inversion and Bayesian classification are powerful tools because they permit addressing the uncertainties in the model. We used the ES-MDA algorithm to obtain realizations equivalent to the P10, P50, and P90 percentiles of acoustic impedance, a novel method for acoustic inversion in the presalt. The facies were divided into five classes: reservoir 1, reservoir 2, tight carbonates, clayey rocks, and igneous rocks. To deal with the overlaps in the acoustic impedance values of the facies, we included geological information via a priori probability, indicating that structural highs are reservoir-dominated. To illustrate our approach, we conducted porosity modeling using facies-related rock-physics models for rock-physics inversion in an area with a well drilled in a coquina bank, and we evaluated the thickness and extent of an igneous intrusion near the carbonate-salt interface. The modeled porosity and the classified seismic facies are in good agreement with those observed in the wells. Notably, the coquina bank shows increasing porosity towards the top. The a priori probability model was crucial for limiting the clayey rocks to the structural lows. In Well B, the hit rate for igneous rock in the three scenarios is higher than 60%, showing excellent thickness-prediction capability.
Indoor localization systems are crucial for addressing the limitations of the traditional global positioning system (GPS) in indoor environments due to signal attenuation. As complex indoor spaces become more sophisticated, indoor localization systems become essential for improving user experience, safety, and operational efficiency. Indoor localization methods based on Wi-Fi fingerprints require a high-density location fingerprint database, but this can increase the computational burden in the online phase. Bayesian networks, which integrate prior knowledge or domain expertise, are an effective solution for accurately determining indoor user locations. These networks use probabilistic reasoning to model relationships among various localization parameters for indoor environments that are challenging to navigate. This article proposes an adaptive Bayesian model for multi-floor environments based on fingerprinting techniques to minimize errors in estimating user location. The proposed system is an off-the-shelf solution that uses existing Wi-Fi infrastructure to estimate the user's location. It operates in both offline and online phases. In the offline phase, a mobile device with Wi-Fi capability collects radio signals, while in the online phase, samples are generated using Gibbs sampling, based on the proposed Bayesian model and the radio map, to predict the user's location. Experimental results show the superior performance of the proposed model compared with other existing models and methods: it achieved a lower average localization error than competing approaches. Notably, this was attained with minimal reliance on reference points, underscoring the efficiency and efficacy of the proposed model in accurately estimating user locations in indoor environments.
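The online-phase inference can be illustrated with a much simpler closed-form version: a Gaussian RSSI likelihood over a small radio map with a uniform prior over fingerprinted locations. The paper's model is richer and is sampled with Gibbs sampling; the map values and the noise level below are invented.

```python
import math

def location_posterior(radio_map, observed, sigma=4.0):
    """Posterior over fingerprinted locations given an observed RSSI vector.

    radio_map: {location: [mean RSSI per access point]} built in the offline phase.
    observed: RSSI vector measured in the online phase.
    Assumes independent Gaussian noise (std = sigma) and a uniform location prior.
    """
    logp = {}
    for loc, means in radio_map.items():
        logp[loc] = sum(-0.5 * ((o - m) / sigma) ** 2
                        for o, m in zip(observed, means))
    # Normalize in log space for numerical stability.
    mx = max(logp.values())
    unnorm = {loc: math.exp(v - mx) for loc, v in logp.items()}
    z = sum(unnorm.values())
    return {loc: p / z for loc, p in unnorm.items()}
```

The predicted location is then the posterior mode (or a posterior-weighted average of coordinates, for sub-fingerprint resolution).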
An accurate plasma current profile has irreplaceable value for the steady-state operation of the plasma. In this study, plasma current tomography based on Bayesian inference is applied to the HL-2A device and used to reconstruct the plasma current profile. Two different Bayesian probability priors are tried, namely the Conditional Auto-Regressive (CAR) prior and the Advanced Squared Exponential (ASE) kernel prior. Compared to the CAR prior, the ASE kernel prior adopts nonstationary hyperparameters and introduces the current profile of a reference discharge into the hyperparameters, which makes the shape of the current profile more flexible in space. The results indicate that the ASE prior couples more information, reduces the probability of unreasonable solutions, and achieves higher reconstruction accuracy.
The multi-source passive localization problem is of great interest in signal processing, with many applications. In this paper, a sparse representation model based on the covariance matrix is constructed for the long-range localization scenario, and a sparse Bayesian learning algorithm based on a Laplace prior on the signal covariance is developed for the basis mismatch problem caused by targets deviating from the initial grid points. An adaptive grid sparse Bayesian learning target localization (AGSBL) algorithm is proposed. The AGSBL algorithm implements covariance-based sparse signal reconstruction and grid-adaptive localization dictionary learning. Simulation results show that the AGSBL algorithm outperforms traditional compressed sensing-based localization algorithms for different signal-to-noise ratios and different numbers of targets in long-range scenes.
Driven piles are used in many geological environments as a practical and convenient structural component. Hence, determining the drivability of piles is of great importance in complex geotechnical applications. Conventional methods of predicting pile drivability often rely on simplified physical models or empirical formulas, which may lack accuracy or applicability in complex geological conditions. Therefore, this study presents a practical machine learning approach, namely a Random Forest (RF) optimized by Bayesian Optimization (BO) and Particle Swarm Optimization (PSO), which not only enhances prediction accuracy but also better adapts to varying geological environments, to predict the drivability parameters of piles (i.e., maximum compressive stress, maximum tensile stress, and blows per foot). In addition, support vector regression, extreme gradient boosting, k-nearest neighbors, and decision trees are applied for comparison purposes. To train and test these models, 3258 of the 4072 collected datasets (with 17 model inputs) were randomly selected for training, and the remaining 814 were used for testing. Lastly, the results of these models were compared and evaluated using two performance indices: the root mean square error (RMSE) and the coefficient of determination (R2). The results indicate that the optimized RF model achieved lower RMSE than the other prediction models in predicting the three parameters, specifically 0.044, 0.438, and 0.146, and higher R2 values, specifically 0.966, 0.884, and 0.977. In addition, the sensitivity and uncertainty of the optimized RF model were analyzed using Sobol sensitivity analysis and Monte Carlo (MC) simulation. It can be concluded that the optimized RF model could be used to predict the performance of the pile, and it may provide a useful reference for solving similar problems under comparable engineering conditions.
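The two performance indices used in the comparison above have simple definitions, sketched here for reference:

```python
import math

def rmse(y_true, y_pred):
    """Root mean square error between observed and predicted values."""
    n = len(y_true)
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n)

def r2(y_true, y_pred):
    """Coefficient of determination: 1 - residual SS / total SS."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1 - ss_res / ss_tot
```

R2 = 1 for a perfect fit and 0 for a model no better than predicting the mean, which is why the paper reports both indices: RMSE is in the units of the target, R2 is scale-free.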
Stable water isotopes are natural tracers for quantifying the contribution of moisture recycling to local precipitation, i.e., the moisture recycling ratio, but various isotope-based models usually lead to different results, which affects the accuracy of the estimated local moisture recycling. In this study, a total of 18 stations from four typical areas in China were selected to compare the performance of isotope-based linear and Bayesian mixing models and to determine the local moisture recycling ratio. Among the three vapor sources (advection, transpiration, and surface evaporation), advected vapor usually played the dominant role, and the contribution of surface evaporation was less than that of transpiration. When abnormal values were ignored, the arithmetic means of the differences between the isotope-based linear and Bayesian mixing models were 0.9% for transpiration, 0.2% for surface evaporation, and –1.1% for advection, and the medians were 0.5%, 0.2%, and –0.8%, respectively. The estimated importance of transpiration was slightly lower in most cases when the Bayesian mixing model was applied, and the contribution of advection was relatively larger. The Bayesian mixing model was found to perform better in determining an efficient solution, since the linear model sometimes resulted in negative contribution ratios. A sensitivity test with two isotope scenarios indicated that the Bayesian model had relatively low sensitivity to changes in the isotope input, and that it was important to accurately estimate the isotopes in precipitation vapor. In general, the Bayesian mixing model should be recommended instead of a linear model. These findings are useful for understanding the performance of isotope-based linear and Bayesian mixing models under various climate backgrounds.
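The advantage noted above (the Bayesian model cannot produce negative contributions) can be seen in a toy setting: a Bayesian mixing sketch samples source fractions on the simplex, so they are non-negative by construction, whereas a linear matrix solution has no such constraint. The source and mixture isotope values below are invented, and the sampler is a crude importance-weighted average, not a full MCMC mixing model.

```python
import math
import random

def sample_contributions(delta_mix, delta_sources, sd=0.5, n=20000, seed=0):
    """Toy Bayesian mixing model for three sources.

    Samples fractions (f1, f2, f3) uniformly on the 2-simplex, weights each draw by
    a Gaussian likelihood of the mixture isotope value, and returns the
    posterior-mean fractions. Non-negativity holds by construction.
    """
    rng = random.Random(seed)
    total_w = 0.0
    mean_f = [0.0, 0.0, 0.0]
    for _ in range(n):
        a, b = sorted((rng.random(), rng.random()))
        f = (a, b - a, 1 - b)                    # uniform draw on the simplex
        pred = sum(fi * di for fi, di in zip(f, delta_sources))
        w = math.exp(-0.5 * ((pred - delta_mix) / sd) ** 2)
        total_w += w
        mean_f = [m + w * fi for m, fi in zip(mean_f, f)]
    return [m / total_w for m in mean_f]
```

With a mixture value equal to one end-member, the posterior mean piles mass on that source while keeping all fractions in [0, 1]; a linear solve of the same system can overshoot to a negative fraction when the data are noisy.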
Bayesian networks are a powerful class of graphical decision models used to represent causal relationships among variables. However, the reliability and integrity of learned Bayesian network models are highly dependent on the quality of incoming data streams. One of the primary challenges with Bayesian networks is their vulnerability to adversarial data poisoning attacks, wherein malicious data is injected into the training dataset to negatively influence the Bayesian network models and impair their performance. In this paper, we propose an efficient framework for detecting data poisoning attacks against Bayesian network structure learning algorithms. Our framework uses latent variables to quantify the amount of belief between every two nodes in each causal model over time. Against four different forms of data poisoning attack, we aim to strengthen the security and dependability of Bayesian network structure learning techniques, such as the PC algorithm, and we offer workable methods for identifying and mitigating these subtle threats. Additionally, our research investigates one particular use case, the "Visit to Asia" network, to explore the practical consequences of using uncertainty as a way to spot cases of data poisoning. Our results demonstrate the promising efficacy of latent variables in detecting and mitigating the threat of data poisoning attacks, and the proposed latent-variable-based framework proves sensitive in detecting malicious data poisoning attacks in the context of streaming data.
How to effectively evaluate the firing precision of weapon equipment at low cost is one of the core problems in improving the test level of a weapon system. A new method to evaluate the firing precision of a multiple launch rocket system (MLRS), considering the credibility of the simulation system and based on Bayesian theory, is proposed in this paper. First, a comprehensive index system for the credibility of the simulation system of MLRS firing precision is constructed, combined with the group analytic hierarchy process. A modified method for determining the comprehensive weights of the indices is established to improve the rationality of the index weight coefficients. The Bayesian posterior estimation formula for firing precision considering prior information is derived in the form of a mixed prior distribution, and the rationality of the prior information used in the estimation model is discussed quantitatively. Through simulation tests, different evaluation methods are compared to validate the effectiveness of the proposed method. The experimental results show that the effectiveness of the firing precision estimation method is improved by more than 25%.
The topic of this article is one-sided hypothesis testing for disparity, i.e., that the mean of one group is larger than that of another, when there is uncertainty as to the group from which each datum is drawn. For each datum, the uncertainty is captured with a given discrete probability distribution over the groups. Such situations arise, for example, in the use of Bayesian imputation methods to assess race and ethnicity disparities with certain insurance, health, and financial data. A widely used method for this assessment is Bayesian Improved Surname Geocoding (BISG), which assigns a discrete probability over six race/ethnicity groups to an individual given the individual's surname and address location. Using a Bayesian framework and Markov Chain Monte Carlo sampling from the joint posterior distribution of the group means, the probability of a disparity hypothesis is estimated. Four methods are developed and compared on an illustrative data set. Three of these methods are implemented in R and one in WinBUGS. The methods are programmed for any number of groups between two and six inclusive. All the code is provided in the appendices.
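A two-group caricature of the assessment described above (not the authors' R or WinBUGS code): it integrates over the uncertain group assignments by simple Monte Carlo and compares sample means directly, rather than sampling the joint posterior of the group means.

```python
import random
import statistics

def prob_disparity(values, probs, n_draws=4000, seed=0):
    """Monte Carlo estimate of P(mean of group 1 > mean of group 2) when each
    datum belongs to group 1 with the given probability (a two-group analogue
    of BISG-style membership uncertainty).

    values: observed values; probs: per-datum probability of group-1 membership.
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_draws):
        g1, g2 = [], []
        for v, p in zip(values, probs):
            (g1 if rng.random() < p else g2).append(v)
        # Draws in which a group is empty carry no information and are skipped.
        if g1 and g2 and statistics.fmean(g1) > statistics.fmean(g2):
            hits += 1
    return hits / n_draws
```

When the data strongly separate the likely group-1 and group-2 members, the estimated probability approaches one; with identical values it is zero, since the means can never strictly differ.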
BACKGROUND: Portal hypertension (PHT), primarily induced by cirrhosis, manifests severe symptoms impacting patient survival. Although transjugular intrahepatic portosystemic shunt (TIPS) is a critical intervention for managing PHT, it carries risks like hepatic encephalopathy, thus affecting survival prognosis. To our knowledge, existing prognostic models for post-TIPS survival in patients with PHT fail to account for the interplay among, and the collective impact of, various prognostic factors on outcomes. Consequently, an innovative modeling approach is needed to address this limitation. AIM: To develop and validate a Bayesian network (BN)-based survival prediction model for patients with cirrhosis-induced PHT who have undergone TIPS. METHODS: The clinical data of 393 patients with cirrhosis-induced PHT who underwent TIPS surgery at the Second Affiliated Hospital of Chongqing Medical University between January 2015 and May 2022 were retrospectively analyzed. Variables were selected using Cox and least absolute shrinkage and selection operator (LASSO) regression methods, and a BN-based model was established and evaluated to predict survival in patients having undergone TIPS surgery for PHT. RESULTS: Variable selection revealed the following key factors impacting survival: age, ascites, hypertension, indications for TIPS, postoperative portal vein pressure (post-PVP), aspartate aminotransferase, alkaline phosphatase, total bilirubin, prealbumin, the Child-Pugh grade, and the model for end-stage liver disease (MELD) score. Based on these variables, a BN-based 2-year survival prognostic prediction model was constructed, which identified the following factors as directly linked to survival time: age, ascites, indications for TIPS, concurrent hypertension, post-PVP, the Child-Pugh grade, and the MELD score. The Bayesian information criterion was 3589.04, and 10-fold cross-validation indicated an average log-likelihood loss of 5.55 with a standard deviation of 0.16. The model's accuracy, precision, recall, and F1 score were 0.90, 0.92, 0.97, and 0.95, respectively, with an area under the receiver operating characteristic curve of 0.72. CONCLUSION: This study successfully developed a BN-based survival prediction model with good predictive capability. It offers valuable insights for treatment strategies and prognostic evaluations in patients having undergone TIPS surgery for PHT.
The state of in situ stress is a crucial parameter in subsurface engineering, especially for critical projects like nuclear waste repositories. As one of the two ISRM suggested methods, the overcoring (OC) method is widely used to estimate full stress tensors in rock by independent regression analysis of the data from each OC test. However, such customary independent analysis of individual OC tests, known as no pooling, is liable to yield unreliable test-specific stress estimates due to the various sources of uncertainty involved in the OC method. To address this problem, a practical and no-cost solution is to incorporate into the OC data analysis the additional information implied by adjacent OC tests, which are usually available in OC measurement campaigns. Hence, this paper presents a Bayesian partial pooling (hierarchical) model for the combined analysis of adjacent OC tests. We performed five case studies using OC test data from a nuclear waste repository research site in Sweden. The results demonstrate that partial pooling of adjacent OC tests indeed allows information to be borrowed across adjacent tests, and yields improved stress tensor estimates with reduced uncertainties, simultaneously for all individual tests, compared with analysing them independently (no pooling), particularly for the unreliable no-pooling stress estimates. A further model comparison shows that the partial pooling model also gives better predictive performance, confirming that the information borrowed across adjacent OC tests is relevant and effective.
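The borrowing-of-strength mechanism behind partial pooling can be sketched in closed form for a single scalar stress component, assuming known within-test variances and a known between-test variance tau2. The full model works on stress tensors and estimates these quantities jointly; the names below are hypothetical.

```python
def partial_pool(test_means, test_vars, tau2):
    """Precision-weighted shrinkage of per-test estimates toward the grand mean.

    test_means: no-pooling estimate from each OC test.
    test_vars:  within-test variance of each estimate (assumed known here).
    tau2:       between-test variance; large tau2 -> no pooling,
                tau2 -> 0 -> complete pooling.
    """
    grand = sum(test_means) / len(test_means)
    pooled = []
    for m, v in zip(test_means, test_vars):
        w = tau2 / (tau2 + v)              # weight on the test's own data
        pooled.append(w * m + (1 - w) * grand)
    return pooled
```

Noisy tests (large v) are shrunk hardest toward the across-test mean, which is exactly why the unreliable no-pooling estimates benefit most from the hierarchical model.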
Funding (Mg-Mn alloy study): supported by the National Natural Science Foundation of China (51971042, 51901028), the Chongqing Academician Special Fund (cstc2020yszxjcyjX0001), and the China Scholarship Council (CSC); the Norwegian University of Science and Technology (NTNU) is thanked for financial and technical support.
Funding: Supported by the National Natural Science Foundation of China (Nos. 52279107 and 52379106), the Qingdao Guoxin Jiaozhou Bay Second Submarine Tunnel Co., Ltd., the Academician and Expert Workstation of Yunnan Province (No. 202205AF150015), and the Science and Technology Innovation Project of YCIC Group Co., Ltd. (No. YCIC-YF-2022-15).
Abstract: Rock mass quality serves as a vital index for predicting the stability and safety of rock tunnel faces. In tunneling practice, rock mass quality is often assessed via a combination of qualitative and quantitative parameters. However, due to harsh on-site construction conditions, it is rather difficult to obtain some of the evaluation parameters that are essential for rock mass quality prediction. In this study, a novel improved Swin Transformer is proposed to detect, segment, and quantify rock mass characteristic parameters such as water leakage, fractures, and weak interlayers. Site experiment results demonstrate that the improved Swin Transformer achieves optimal segmentation results, with accuracies of 92%, 81%, and 86% for water leakage, fractures, and weak interlayers, respectively. A multisource rock tunnel face characteristic (RTFC) dataset including 11 parameters for predicting rock mass quality is established. Considering the limits that the incomplete evaluation parameters in this dataset place on predictive performance, a novel tree-augmented naive Bayesian network (BN) is proposed to address the challenge of the incomplete dataset; it achieved a prediction accuracy of 88%. In comparison with other commonly used machine learning models, the proposed BN-based approach showed improved performance in predicting rock mass quality with the incomplete dataset. Using the established BN, a further sensitivity analysis was conducted to quantitatively evaluate the importance of the various parameters; the results indicate that the rock strength and fracture parameters exert the most significant influence on rock mass quality.
Abstract: According to the most recent Pteridophyte Phylogeny Group (PPG), eupolypods, or eupolypod ferns, are the most differentiated and diversified of all major lineages of ferns, accounting for more than half of extant fern diversity. However, the evolutionary history of eupolypods remains incompletely understood, and conflicting ideas and scenarios exist in the literature about many aspects of this history. Owing to a scarce fossil record, the diversification time of eupolypods is mainly inferred from molecular dating approaches. Currently, there are two molecular dating results: the diversification of eupolypods occurred either in the Late Cretaceous or as early as the Jurassic. This study uses the Bayesian tip-dating approach for the first time to infer the diversification time of eupolypods. Our analyses support the Jurassic diversification of eupolypods. The age estimates for the diversification of the whole clade and of one of its two subclades (eupolypods II) both fall in the Jurassic, which adds to the growing body of data pointing to a much earlier diversification of Polypodiales in the Mesozoic than previously suspected.
Funding: The authors acknowledge financial support from the Guangdong Provincial Department of Science and Technology (Grant No. 2022A0505030019) and the Science and Technology Development Fund, Macao SAR, China (File Nos. 0056/2023/RIB2 and SKL-IOTSC-2021-2023).
Abstract: Recently, the application of Bayesian updating to predict excavation-induced deformation has proven successful and improved prediction accuracy significantly. However, updating the ground settlement profile, which is crucial for determining potential damage to nearby infrastructure, has received limited attention. To address this, this paper proposes a physics-guided simplified model combined with a Bayesian updating framework to accurately predict the ground settlement profile. The advantage of this model is that it eliminates the need for complex finite element modeling and makes the updating framework user-friendly. Furthermore, the model is physically interpretable, which can provide valuable references for construction adjustments. The effectiveness of the proposed method is demonstrated through two field case studies, showing that it can yield satisfactory predictions of the settlement profile.
Abstract: Statistical biases may be introduced by imprecisely quantified background radiation reference levels. It is, therefore, imperative to devise a simple, adaptable approach for precisely describing the reference background levels of naturally occurring radionuclides (NOR) at mining sites. As an alternative statistical method, we propose using Bayesian modeling to examine the spatial distribution of NOR. For naturally occurring gamma-emitting radionuclides such as 232Th, 40K, and 238U, statistical parameters are inferred using the Markov chain Monte Carlo (MCMC) method. After obtaining an accurate subsample via bootstrapping, we exclude possible outliers that fall outside the Highest Density Interval (HDI). We then build a Bayesian model with the resampled data and use MCMC to predict the posterior distribution of the gamma-emitting radionuclides. This method offers a robust and dependable way to describe NOR reference background values, which is important for managing and evaluating radiation risks in mining contexts.
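The two building blocks named here, an HDI for outlier screening and MCMC for the posterior, can be sketched in a few lines. This is not the authors' model; it is a generic NumPy illustration assuming a normal likelihood with known spread, with hypothetical activity values standing in for the radionuclide data.

```python
import numpy as np

rng = np.random.default_rng(0)

def hdi(samples, cred=0.95):
    """Shortest interval containing a fraction `cred` of the samples
    (the Highest Density Interval)."""
    s = np.sort(samples)
    n = int(np.ceil(cred * len(s)))
    widths = s[n - 1:] - s[:len(s) - n + 1]
    i = np.argmin(widths)
    return s[i], s[i + n - 1]

def metropolis_mean(data, sigma, n_iter=4000, step=2.0):
    """Random-walk Metropolis sampler for the mean of N(mu, sigma^2)
    under a flat prior on mu."""
    mu = data.mean()
    out = np.empty(n_iter)
    for k in range(n_iter):
        prop = mu + step * rng.standard_normal()
        # log acceptance ratio: likelihood at proposal vs. current state
        log_r = (np.sum((data - mu) ** 2) - np.sum((data - prop) ** 2)) / (2 * sigma ** 2)
        if np.log(rng.random()) < log_r:
            mu = prop
        out[k] = mu
    return out
```

A bootstrap step would simply resample `data` with `rng.choice(data, len(data))` before screening against the HDI of the chain.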
Funding: The authors thank Equinor for financing the R&D project, and the Institute of Science and Technology of Petroleum Geophysics of Brazil for supporting this research.
Abstract: We apply stochastic seismic inversion and Bayesian facies classification for porosity modeling and igneous rock identification in the presalt interval of the Santos Basin. This integration of seismic and well-derived information enhances reservoir characterization. Stochastic inversion and Bayesian classification are powerful tools because they permit addressing the uncertainties in the model. We used the ES-MDA algorithm to obtain realizations equivalent to the P10, P50, and P90 percentiles of acoustic impedance, a novel approach to acoustic inversion in the presalt. The facies were divided into five classes: reservoir 1, reservoir 2, tight carbonates, clayey rocks, and igneous rocks. To deal with the overlap in acoustic impedance values across facies, we included geological information through a priori probabilities, indicating that structural highs are reservoir-dominated. To illustrate our approach, we conducted porosity modeling using facies-related rock-physics models for rock-physics inversion in an area with a well drilled in a coquina bank, and we evaluated the thickness and extent of an igneous intrusion near the carbonate-salt interface. The modeled porosity and the classified seismic facies are in good agreement with those observed in the wells. Notably, the coquina bank shows an improvement in porosity towards the top. The a priori probability model was crucial for restricting the clayey rocks to the structural lows. In Well B, the hit rate for igneous rock in the three scenarios is higher than 60%, showing excellent thickness-prediction capability.
Funding: This work was supported and funded by the Deanship of Scientific Research at Imam Mohammad Ibn Saud Islamic University (IMSIU) (Grant Number IMSIU-RPP2023011).
Abstract: Indoor localization systems are crucial for addressing the limitations of the traditional Global Positioning System (GPS) in indoor environments, where signal attenuation is a problem. As complex indoor spaces become more sophisticated, indoor localization systems become essential for improving user experience, safety, and operational efficiency. Indoor localization methods based on Wi-Fi fingerprints require a high-density location fingerprint database, which can increase the computational burden in the online phase. Bayesian networks, which integrate prior knowledge or domain expertise, are an effective solution for accurately determining indoor user locations. These networks use probabilistic reasoning to model the relationships among the various localization parameters of indoor environments that are challenging to navigate. This article proposes an adaptive Bayesian model for multi-floor environments based on fingerprinting techniques to minimize errors in estimating user location. The proposed system is an off-the-shelf solution that uses existing Wi-Fi infrastructure to estimate the user's location, and it operates in both offline and online phases. In the offline phase, a mobile device with Wi-Fi capability collects radio signals; in the online phase, samples are generated by Gibbs sampling, based on the proposed Bayesian model and the radio map, to predict the user's location. Experimental results show the superior performance of the proposed model compared with existing models and methods: it achieved a lower average localization error, surpassing the accuracy of competing approaches. Notably, this was attained with minimal reliance on reference points, underscoring the efficiency and efficacy of the proposed model in accurately estimating user locations in indoor environments.
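The fingerprinting idea at the heart of this abstract can be shown in miniature. The sketch below is not the paper's multi-floor model (whose online phase uses Gibbs sampling over a richer Bayesian network); it is a minimal grid posterior assuming a Gaussian likelihood for the observed received-signal-strength (RSS) vector against a radio map, with a uniform prior over candidate locations. All names and numbers are hypothetical.

```python
import numpy as np

def locate(rss_obs, radio_map, sigma=4.0):
    """Posterior over candidate locations given an observed RSS vector.

    radio_map : L x A array of mean RSS per location (L) and access point (A),
                built in the offline phase
    rss_obs   : observed RSS vector of length A (online phase)
    sigma     : assumed common RSS noise standard deviation in dB
    Gaussian likelihood, uniform prior over the L candidate locations.
    """
    ll = -np.sum((radio_map - rss_obs) ** 2, axis=1) / (2 * sigma ** 2)
    post = np.exp(ll - ll.max())   # subtract max for numerical stability
    return post / post.sum()
```

The predicted location is then `np.argmax(post)`, or a posterior-weighted average of coordinates when a soft estimate is preferred.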
Funding: Supported by the National MCF Energy R&D Program of China (Nos. 2018YFE0301105, 2022YFE03010002 and 2018YFE0302100), the National Key R&D Program of China (Nos. 2022YFE03070004 and 2022YFE03070000), and the National Natural Science Foundation of China (Nos. 12205195, 12075155 and 11975277).
Abstract: An accurate plasma current profile has irreplaceable value for the steady-state operation of the plasma. In this study, plasma current tomography based on Bayesian inference is applied to the HL-2A device and used to reconstruct the plasma current profile. Two different Bayesian priors are tried, namely the Conditional Auto-Regressive (CAR) prior and the Advanced Squared Exponential (ASE) kernel prior. Compared to the CAR prior, the ASE kernel prior adopts nonstationary hyperparameters and introduces the current profile of a reference discharge into the hyperparameters, which makes the shape of the current profile more flexible in space. The results indicate that the ASE prior incorporates more information, reduces the probability of unreasonable solutions, and achieves higher reconstruction accuracy.
Abstract: The multi-source passive localization problem is of great interest in signal processing, with many applications. In this paper, a sparse representation model based on the covariance matrix is constructed for the long-range localization scenario, and a sparse Bayesian learning algorithm based on a Laplace prior on the signal covariance is developed for the basis mismatch problem caused by targets deviating from the initial grid points. An adaptive-grid sparse Bayesian learning target localization (AGSBL) algorithm is proposed. The AGSBL algorithm implements covariance-based sparse signal reconstruction and grid-adaptive localization dictionary learning. Simulation results show that the AGSBL algorithm outperforms traditional compressed-sensing localization algorithms for different signal-to-noise ratios and different numbers of targets in long-range scenes.
Funding: Supported by the National Science Foundation of China (42107183).
Abstract: Driven piles are used in many geological environments as a practical and convenient structural component. Hence, determining the drivability of piles is of great importance in complex geotechnical applications. Conventional methods of predicting pile drivability often rely on simplified physical models or empirical formulas, which may lack accuracy or applicability in complex geological conditions. Therefore, this study presents a practical machine learning approach, namely a Random Forest (RF) optimized by Bayesian Optimization (BO) and Particle Swarm Optimization (PSO), which not only enhances prediction accuracy but also better adapts to varying geological environments, to predict the drivability parameters of piles (i.e., maximum compressive stress, maximum tensile stress, and blows per foot). In addition, support vector regression, extreme gradient boosting, k-nearest neighbors, and decision trees are applied for comparison. To train and test these models, 3258 of the 4072 collected datasets (with 17 model inputs) were randomly selected for training, and the remaining 814 were used for testing. Lastly, the results of these models were compared and evaluated using two performance indices: the root mean square error (RMSE) and the coefficient of determination (R²). The results indicate that the optimized RF model achieved lower RMSE than the other prediction models for the three parameters (0.044, 0.438, and 0.146, respectively) and higher R² values (0.966, 0.884, and 0.977, respectively). In addition, the sensitivity and uncertainty of the optimized RF model were analyzed using Sobol sensitivity analysis and Monte Carlo (MC) simulation. It can be concluded that the optimized RF model can be used to predict pile performance and may provide a useful reference for similar engineering problems.
Funding: This study was supported by the National Natural Science Foundation of China (42261008, 41971034) and the Natural Science Foundation of Gansu Province, China (22JR5RA074).
Abstract: Stable water isotopes are natural tracers for quantifying the contribution of moisture recycling to local precipitation, i.e., the moisture recycling ratio, but different isotope-based models usually lead to different results, which affects the accuracy of the estimated local moisture recycling. In this study, a total of 18 stations from four typical areas in China were selected to compare the performance of isotope-based linear and Bayesian mixing models and to determine the local moisture recycling ratio. Among the three vapor sources (advection, transpiration, and surface evaporation), advected vapor usually played the dominant role, and the contribution of surface evaporation was less than that of transpiration. When abnormal values were ignored, the arithmetic means of the differences between the isotope-based linear and Bayesian mixing models were 0.9% for transpiration, 0.2% for surface evaporation, and -1.1% for advection, and the medians were 0.5%, 0.2%, and -0.8%, respectively. The importance of transpiration was slightly lower in most cases when the Bayesian mixing model was applied, and the contribution of advection was relatively larger. The Bayesian mixing model was found to perform better in determining an efficient solution, since the linear model sometimes resulted in negative contribution ratios. A sensitivity test with two isotope scenarios indicated that the Bayesian model has relatively low sensitivity to changes in the isotope input, and that it is important to accurately estimate the isotopes in precipitation vapor. Generally, the Bayesian mixing model should be recommended instead of the linear model. The findings are useful for understanding the performance of isotope-based linear and Bayesian mixing models under various climate backgrounds.
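The linear model mentioned here is a small linear system: two isotope balances (e.g., δ²H and δ¹⁸O) plus the constraint that the three source fractions sum to one. The sketch below, with hypothetical δ values, shows that solving this system directly places no bound on the fractions, which is exactly why it can return the negative contribution ratios the abstract reports, while a Bayesian mixing model constrains the fractions to the simplex.

```python
import numpy as np

def linear_mixing(delta_p, sources):
    """Solve a three-source, two-isotope linear mixing model.

    delta_p : (d2H, d18O) of the precipitation vapour
    sources : 3x2 array of (d2H, d18O) for advection, transpiration,
              and surface evaporation
    Returns the three contribution fractions. Nothing forces them into
    [0, 1] -- the weakness a Bayesian mixing model avoids.
    """
    A = np.vstack([sources.T, np.ones(3)])      # 2 isotope balances + mass balance
    b = np.array([delta_p[0], delta_p[1], 1.0])
    return np.linalg.solve(A, b)
```

If `delta_p` falls outside the triangle spanned by the three sources in (δ²H, δ¹⁸O) space, one of the returned fractions goes negative even though the system is solved exactly.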
Abstract: Bayesian networks are a powerful class of graphical decision models used to represent causal relationships among variables. However, the reliability and integrity of learned Bayesian network models are highly dependent on the quality of incoming data streams. One of the primary challenges with Bayesian networks is their vulnerability to adversarial data poisoning attacks, wherein malicious data is injected into the training dataset to negatively influence the Bayesian network models and impair their performance. In this paper, we propose an efficient framework for detecting data poisoning attacks against Bayesian network structure learning algorithms. Our framework utilizes latent variables to quantify the amount of belief between every two nodes in each causal model over time. Considering four different forms of data poisoning attack, we aim to strengthen the security and dependability of Bayesian network structure learning techniques such as the PC algorithm, and we offer workable methods for identifying and mitigating these covert threats. Additionally, our research investigates one particular use case, the "Visit to Asia" network, exploring the practical consequences of using uncertainty as a way to spot cases of data poisoning. Our results demonstrate the efficacy of latent variables in detecting and mitigating the threat of data poisoning attacks, and the proposed latent-based framework proves sensitive in detecting malicious data poisoning attacks on streaming data.
Funding: National Natural Science Foundation of China (Grant Nos. 11972193 and 92266201).
Abstract: How to effectively evaluate the firing precision of weapon equipment at low cost is one of the core problems in improving the testing of weapon systems. A new method to evaluate the firing precision of the MLRS, considering the credibility of the simulation system and based on Bayesian theory, is proposed in this paper. First, a comprehensive index system for the credibility of the simulation system for the firing precision of the MLRS is constructed, combined with the group analytic hierarchy process. A modified method for determining the comprehensive weights of the indices is established to improve the rationality of the index weight coefficients. The Bayesian posterior estimate of firing precision incorporating prior information is derived in the form of a mixed prior distribution, and the rationality of the prior information used in the estimation model is discussed quantitatively. Through simulation tests, different evaluation methods are compared to validate the effectiveness of the proposed method. Finally, the experimental results show that the effectiveness of the firing precision estimation method is improved by more than 25%.
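The mixed-prior derivation in the paper is more elaborate than can be reproduced from the abstract, but the conjugate mechanics it builds on are easy to sketch. Below is a standard normal-normal update combining prior information (e.g., distilled from simulation runs) with live test data, extended with a hypothetical credibility weight in [0, 1] that down-weights the simulation-derived prior, in the spirit of the credibility index the abstract describes. All names and numbers are illustrative, not the paper's formulas.

```python
def normal_posterior(prior_mean, prior_var, data_mean, data_var, n, credibility=1.0):
    """Normal-normal conjugate update with a credibility-weighted prior.

    prior_mean, prior_var : prior belief about the precision parameter
                            (e.g. from a simulation system)
    data_mean, data_var   : sample mean and per-shot variance of n live firings
    credibility           : 0 discards the simulation information, 1 trusts it fully
    Returns the posterior mean and posterior variance.
    """
    post_prec = credibility / prior_var + n / data_var
    post_mean = (credibility * prior_mean / prior_var
                 + n * data_mean / data_var) / post_prec
    return post_mean, 1.0 / post_prec
```

With `credibility=0` the posterior reduces to the data alone; raising it pulls the estimate toward the simulation-based prior and tightens the posterior variance, which is how prior information lets a small number of expensive live firings go further.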
Abstract: The topic of this article is one-sided hypothesis testing for disparity, i.e., whether the mean of one group is larger than that of another, when there is uncertainty as to which group each datum is drawn from. For each datum, the uncertainty is captured by a given discrete probability distribution over the groups. Such situations arise, for example, in the use of Bayesian imputation methods to assess race and ethnicity disparities in certain insurance, health, and financial data. A widely used method to implement this assessment is Bayesian Improved Surname Geocoding (BISG), which assigns a discrete probability over six race/ethnicity groups to an individual given the individual's surname and address location. Using a Bayesian framework and Markov chain Monte Carlo sampling from the joint posterior distribution of the group means, the probability of a disparity hypothesis is estimated. Four methods are developed and compared on an illustrative data set. Three of these methods are implemented in R code and one in WinBUGS; they are programmed for any number of groups from two to six inclusive. All the code is provided in the appendices.
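The logic of the estimate can be illustrated for two groups with a short Monte Carlo sketch. This is not any of the article's four R/WinBUGS methods; it is a simplified Python analogue assuming a flat-prior normal approximation to each group's posterior mean, with BISG-style membership probabilities and toy data that are entirely hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

def prob_disparity(y, p_group1, n_draws=4000):
    """Monte Carlo estimate of P(mean of group 1 > mean of group 2) when
    datum y[i] belongs to group 1 only with probability p_group1[i]
    (a BISG-style membership probability)."""
    y = np.asarray(y, float)
    hits = used = 0
    for _ in range(n_draws):
        g1 = rng.random(len(y)) < p_group1      # impute group memberships
        if g1.all() or not g1.any():
            continue                            # need both groups occupied
        y1, y2 = y[g1], y[~g1]
        # draw each group mean from its approximate posterior
        m1 = y1.mean() if len(y1) == 1 else rng.normal(y1.mean(), y1.std(ddof=1) / len(y1) ** 0.5)
        m2 = y2.mean() if len(y2) == 1 else rng.normal(y2.mean(), y2.std(ddof=1) / len(y2) ** 0.5)
        hits += m1 > m2
        used += 1
    return hits / used
```

Averaging over imputed memberships and posterior draws at the same time is what distinguishes this setup from a plain two-sample test: the membership uncertainty propagates directly into the disparity probability.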
Funding: Supported by the Chinese Nursing Association (No. ZHKY202111), the Scientific Research Program of the School of Nursing, Chongqing Medical University (No. 20230307), and the Chongqing Science and Health Joint Medical Research Program (No. 2024MSXM063).
Abstract: BACKGROUND: Portal hypertension (PHT), primarily induced by cirrhosis, manifests severe symptoms impacting patient survival. Although transjugular intrahepatic portosystemic shunt (TIPS) is a critical intervention for managing PHT, it carries risks such as hepatic encephalopathy, thus affecting survival prognosis. To our knowledge, existing prognostic models for post-TIPS survival in patients with PHT fail to account for the interplay among, and collective impact of, various prognostic factors on outcomes. Consequently, an innovative modeling approach is needed to address this limitation. AIM: To develop and validate a Bayesian network (BN)-based survival prediction model for patients with cirrhosis-induced PHT who have undergone TIPS. METHODS: The clinical data of 393 patients with cirrhosis-induced PHT who underwent TIPS surgery at the Second Affiliated Hospital of Chongqing Medical University between January 2015 and May 2022 were retrospectively analyzed. Variables were selected using Cox and least absolute shrinkage and selection operator regression methods, and a BN-based model was established and evaluated to predict survival after TIPS surgery for PHT. RESULTS: Variable selection revealed the following key factors impacting survival: age, ascites, hypertension, indications for TIPS, postoperative portal vein pressure (post-PVP), aspartate aminotransferase, alkaline phosphatase, total bilirubin, prealbumin, the Child-Pugh grade, and the model for end-stage liver disease (MELD) score. Based on these variables, a BN-based 2-year survival prognostic prediction model was constructed, which identified the following factors as directly linked to survival time: age, ascites, indications for TIPS, concurrent hypertension, post-PVP, the Child-Pugh grade, and the MELD score. The Bayesian information criterion was 3589.04, and 10-fold cross-validation indicated an average log-likelihood loss of 5.55 with a standard deviation of 0.16. The model's accuracy, precision, recall, and F1 score were 0.90, 0.92, 0.97, and 0.95, respectively, with an area under the receiver operating characteristic curve of 0.72. CONCLUSION: This study developed a BN-based survival prediction model with good predictive capability. It offers valuable insights for treatment strategies and prognostic evaluation in patients who have undergone TIPS surgery for PHT.
Funding: Supported by the Guangdong Basic and Applied Basic Research Foundation (2023A1515011244).
Abstract: The state of in situ stress is a crucial parameter in subsurface engineering, especially for critical projects such as nuclear waste repositories. As one of the two ISRM suggested methods, the overcoring (OC) method is widely used to estimate the full stress tensor in rock by independent regression analysis of the data from each OC test. However, such customary independent analysis of individual OC tests, known as no pooling, is liable to yield unreliable test-specific stress estimates, owing to the various sources of uncertainty involved in the OC method. To address this problem, a practical and no-cost solution is to incorporate into the OC data analysis the additional information implied by adjacent OC tests, which are usually available in OC measurement campaigns. Hence, this paper presents a Bayesian partial pooling (hierarchical) model for the combined analysis of adjacent OC tests. We performed five case studies using OC test data from a nuclear waste repository research site in Sweden. The results demonstrate that partial pooling of adjacent OC tests indeed allows borrowing of information across adjacent tests, and yields improved stress tensor estimates with reduced uncertainties, simultaneously for all individual tests, compared with analyzing them independently (no pooling), particularly for the no pooling stress estimates that are unreliable. A further model comparison shows that the partial pooling model also gives better predictive performance, confirming that the information borrowed across adjacent OC tests is relevant and effective.
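The shrinkage that partial pooling performs can be shown in one dimension. The sketch below is not the paper's full hierarchical stress-tensor model: it is a simplified illustration in which the between-test variance is taken as given rather than inferred, and each test contributes a single scalar estimate with a known sampling variance. Names and numbers are hypothetical.

```python
import numpy as np

def partial_pool(estimates, variances, tau2):
    """Partially pool test-specific estimates toward the across-test mean.

    estimates : no-pooling estimates from adjacent OC tests
    variances : their sampling variances
    tau2      : between-test variance (a hyperparameter; assumed known here,
                whereas a hierarchical model would infer it)
    Noisy tests are shrunk harder toward the pooled mean, which is how
    information is borrowed across adjacent tests.
    """
    estimates = np.asarray(estimates, float)
    variances = np.asarray(variances, float)
    w = (1.0 / variances) / np.sum(1.0 / variances)
    grand = np.sum(w * estimates)          # precision-weighted pooled mean
    shrink = tau2 / (tau2 + variances)     # per-test shrinkage factor in (0, 1)
    return shrink * estimates + (1 - shrink) * grand
```

The two limits recover the two extremes the paper contrasts: `tau2` large reproduces no pooling (each test stands alone), while `tau2 -> 0` reproduces complete pooling (one shared estimate).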