The rejection sampling method is one of the most popular techniques in Monte Carlo simulation. It turns out that the standard rejection method is closely related to the problem of quasi-Monte Carlo integration of characteristic functions, whose accuracy may be lost due to the discontinuity of the characteristic functions. We propose a B-splines smoothed rejection sampling method, which smooths the characteristic function with a B-splines smoothing technique without changing the value of the integral. Numerical experiments show that a convergence rate of nearly O(N^-1) is regained by using the B-splines smoothed rejection method in importance sampling.
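For reference, the baseline that the B-splines technique smooths can be written down directly: in standard rejection sampling the acceptance test is an indicator (characteristic) function, which is exactly the discontinuity that degrades quasi-Monte Carlo accuracy. A minimal sketch, with an assumed Beta(2, 2) target and uniform proposal (not the paper's smoothed method):

```python
import numpy as np

rng = np.random.default_rng(0)

def rejection_sample(target_pdf, proposal_sampler, proposal_pdf, c, n):
    """Standard rejection sampling: accept y ~ proposal when
    u < target_pdf(y) / (c * proposal_pdf(y)).  The acceptance test is the
    characteristic (indicator) function whose discontinuity hurts QMC accuracy."""
    samples = []
    while len(samples) < n:
        y = proposal_sampler()
        u = rng.uniform()
        if u < target_pdf(y) / (c * proposal_pdf(y)):  # indicator 1{u c g(y) < f(y)}
            samples.append(y)
    return np.array(samples)

# Target: Beta(2, 2) density 6x(1 - x) on [0, 1]; uniform proposal; bound c = 1.5.
f = lambda x: 6.0 * x * (1.0 - x)
g = lambda x: 1.0
xs = rejection_sample(f, lambda: rng.uniform(), g, 1.5, 10_000)
print(xs.mean())  # ~0.5
```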
Global variance reduction is a bottleneck in Monte Carlo shielding calculations. The global variance reduction problem requires that the statistical error be uniform over the entire space. This study proposes a grid-AIS method for the global variance reduction problem based on the AIS method, implemented in the Monte Carlo program MCShield. The proposed method was validated using the VENUS-III international benchmark problem and a self-shielding calculation example. The results from the VENUS-III benchmark problem show that the grid-AIS method achieved a significant reduction in the variance of the statistical errors of the MESH grids, decreasing from 1.08×10^(-2) to 3.84×10^(-3), a 64.00% reduction. This demonstrates that the grid-AIS method is effective for global variance reduction problems. The results of the self-shielding calculation demonstrate that the grid-AIS method produces accurate results, with a computational efficiency approximately one order of magnitude higher than that of the AIS method and approximately two orders of magnitude higher than that of the conventional Monte Carlo method.
Dispersion fuels, known for their excellent safety performance, are widely used in advanced reactors such as high-temperature gas-cooled reactors. Compared with deterministic methods, the Monte Carlo method has more advantages in the geometric modeling of stochastic media. Explicit modeling offers high computational accuracy at high computational cost. The chord length sampling (CLS) method can improve computational efficiency by sampling the chord length during neutron transport from the matrix chord length's probability density function. This study shows that the excluded-volume effect in realistic stochastic media can introduce certain deviations into CLS. A chord length correction approach is proposed that obtains the chord length correction factor by developing the Particle code based on equivalent transmission probability. Numerical analysis against reference solutions from explicit modeling in the RMC code demonstrates that CLS with the proposed correction provides good accuracy in addressing the excluded-volume effect in realistic infinite stochastic media.
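The CLS idea itself is easy to illustrate. The sketch below estimates transmission through a slab of absorbing spheres in a transparent matrix, sampling matrix flights from an exponential with the standard mean-chord parameter and sphere chords from the exact sphere chord-length density. All parameters are illustrative assumptions, and this implements plain CLS, not the paper's excluded-volume correction:

```python
import numpy as np

rng = np.random.default_rng(11)

# Assumed medium: absorbing spheres (radius r, packing fraction pf) in a
# transparent matrix; a particle is transmitted if it crosses the slab alive.
r, pf, sigma_t, slab = 0.1, 0.05, 10.0, 2.0
lam_matrix = (4.0 * r / 3.0) * (1.0 - pf) / pf   # mean matrix chord (CLS kernel)

n, transmitted = 100_000, 0
for _ in range(n):
    x, alive = 0.0, True
    while alive and x < slab:
        x += rng.exponential(lam_matrix)          # flight to the next sphere
        if x >= slab:
            break
        chord = 2.0 * r * np.sqrt(rng.uniform())  # exact sphere chord pdf l/(2 r^2)
        if rng.exponential(1.0 / sigma_t) < chord:
            alive = False                         # absorbed inside the sphere
        else:
            x += chord
    if alive and x >= slab:
        transmitted += 1
print("CLS transmission estimate:", transmitted / n)
```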
Available safety egress time under ship fire (SFAT) is critical to ship fire safety assessment, design, and emergency rescue. Although SFAT can be determined using fire models such as the two-zone fire model CFAST and the field model FDS, none of these models can address the uncertainties in the input parameters. To solve this problem, the present study presents a framework of uncertainty analysis for SFAT. First, a deterministic model estimating SFAT is built. The uncertain input parameters are treated as random variables with given probability distribution functions. The deterministic SFAT model is then coupled with Monte Carlo sampling to investigate the uncertainty of SFAT. Spearman's rank-order correlation coefficient (SRCC) is used to examine the sensitivity of SFAT to each uncertain input parameter. To illustrate the proposed approach in detail, a case study is performed. Based on the proposed approach, the probability density function and cumulative density function of SFAT are obtained, and a sensitivity analysis with regard to SFAT is conducted. The results show a strong negative correlation between SFAT and the fire growth coefficient, whereas the effects of the other parameters are weak enough to be neglected.
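The propagation-plus-SRCC workflow is generic and easy to reproduce. The sketch below substitutes a toy closed-form SFAT surrogate for the CFAST/FDS-based deterministic model; all distributions, parameter names, and the formula are illustrative assumptions:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n = 5_000

# Hypothetical input uncertainties (stand-ins for the fire-model parameters).
alpha = rng.lognormal(mean=np.log(0.047), sigma=0.3, size=n)  # fire growth coefficient
h_ceiling = rng.normal(2.4, 0.1, size=n)                      # compartment height, m
q_crit = rng.uniform(200.0, 400.0, size=n)                    # critical heat threshold

# Toy deterministic surrogate for SFAT: time for a t-squared fire to reach
# q_crit, weakly modulated by ceiling height (assumption, not the paper's model).
sfat = np.sqrt(q_crit / alpha) * (h_ceiling / 2.4) ** 0.5

# Output distribution and rank-based sensitivity (SRCC), as in the abstract.
print("mean SFAT:", sfat.mean(), " 5th/95th pct:", np.percentile(sfat, [5, 95]))
for name, xi in [("alpha", alpha), ("h_ceiling", h_ceiling), ("q_crit", q_crit)]:
    rho, _ = spearmanr(xi, sfat)
    print(f"SRCC({name}, SFAT) = {rho:+.3f}")
```

With this surrogate, the fire growth coefficient shows the strong negative rank correlation the abstract reports, while the other inputs' SRCC values are near zero.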
A stratified sampling Monte Carlo method to analyze the reliability of structural systems is presented. By introducing a small exploratory simulation, this method overcomes the difficulty of obtaining the systematic sampling probability of all the strata. Several useful and efficient stratification methods are given, and strategies for stratification and simulation are studied. A general conclusion is presented for actual engineering structures, together with a rigorous theoretical proof; the method is especially effective for probabilistic integration. The statistical error in evaluating the failure probability is clearly reduced, and in highly nonlinear and non-convex problems the method is more accurate than other methods. Compared with other variance reduction techniques, this method achieves a more pronounced variance reduction and an increased sampling efficiency. Moreover, it is convenient to use, without strict limiting conditions. The method is especially suitable for the reliability problem of structural systems with multiple failure modes and highly nonlinear safety margin equations.
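A minimal version of stratified-sampling failure-probability estimation, with an assumed limit state and equal-probability strata on one input (the paper's exploratory simulation and stratification strategies are not reproduced here):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

# Assumed nonlinear safety margin: g < 0 denotes failure.
g = lambda x1, x2: 3.0 - x1**2 - x2

# Equal-probability strata on the first standard-normal input.
k, n_per = 20, 500
p_edges = np.linspace(0.0, 1.0, k + 1)

pf = 0.0
for i in range(k):
    # Sample within the stratum in probability space, then map back through
    # the inverse normal CDF; each stratum carries probability 1/k.
    u = rng.uniform(p_edges[i], p_edges[i + 1], size=n_per)
    x1 = norm.ppf(np.clip(u, 1e-12, 1.0 - 1e-12))
    x2 = rng.normal(size=n_per)
    pf += (1.0 / k) * np.mean(g(x1, x2) < 0.0)

print("stratified P_f estimate:", pf)
```

Because each stratum contributes exactly its known probability 1/k, the between-strata component of the variance is eliminated relative to crude Monte Carlo with the same total sample size.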
This study investigates whether the implied crude oil volatility and the historical OPEC price volatility can impact the return to and volatility of the energy-sector equity indices in Iran. The analysis specifically considers the refining, drilling, and petrochemical equity sectors of the Tehran Stock Exchange. The parameter estimation uses the quasi-Monte Carlo and Bayesian optimization methods in the framework of a generalized autoregressive conditional heteroskedasticity model, and a complementary Bayesian network analysis is also conducted. The analysis takes into account geopolitical risk and economic policy uncertainty data as other proxies for uncertainty. This study also aims to detect different price regimes for each equity index in a novel way using homogeneous/non-homogeneous Markov switching autoregressive models. Although these methods provide improvements by restricting the analysis to a specific price-regime period, they produce conflicting results, rendering it impossible to draw general conclusions regarding the contagion effect on returns or the volatility transmission between markets. Nevertheless, the results indicate that the OPEC (historical) price volatility has a stronger effect on the energy sectors than the implied volatility has. These types of oil price shocks are found to have no effect on the drilling sector price pattern, whereas the refining and petrochemical equity sectors do seem to undergo changes in their price patterns nearly concurrently with future demand shocks and oil supply shocks, respectively, gaining dominance in the oil market.
In diffusion Monte Carlo methods, continuous diffusion can be simulated in many ways depending on the geometry, such as walk-on-spheres (WOS), walk-on-planes (WOP), and walk-on-rectangles (WOR). These diffusion schemes are conformally equivalent, each satisfying the Laplace equation with the given boundary geometry. In this paper, using WOP and a conformal map, we sample the WOS diffusion and show that this indirect sampling is more efficient than direct WOS sampling. This indicates that fast diffusion Monte Carlo sampling via conformal maps is possible.
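Direct WOS sampling is compact to state. The following sketch solves the Laplace equation on the unit disk with assumed boundary data, i.e., the direct method the paper accelerates via conformal maps:

```python
import numpy as np

rng = np.random.default_rng(3)

def wos_laplace_disk(x0, g, eps=1e-4, n_walks=20_000):
    """Walk-on-spheres estimate of the harmonic function with boundary data g
    on the unit disk, evaluated at x0: each walk jumps to a uniform point on
    the largest circle that fits inside the domain until it is within eps of
    the boundary, then scores g at the nearest boundary point."""
    total = 0.0
    for _ in range(n_walks):
        x = np.array(x0, dtype=float)
        while True:
            d = 1.0 - np.hypot(x[0], x[1])   # distance to the unit circle
            if d < eps:
                break
            theta = rng.uniform(0.0, 2.0 * np.pi)
            x += d * np.array([np.cos(theta), np.sin(theta)])
        total += g(x / np.hypot(x[0], x[1]))  # project onto the boundary
    return total / n_walks

# Boundary data g(x, y) = x is itself harmonic, so u(0.3, 0.2) should be ~0.3.
print(wos_laplace_disk((0.3, 0.2), lambda p: p[0]))
```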
We introduce the potential-decomposition strategy (PDS), which can be used in Markov chain Monte Carlo sampling algorithms. PDS can be designed to make particles move in a modified potential that favors diffusion in phase space; then, by rejecting some trial samples, the target distribution can be sampled in an unbiased manner. Furthermore, if the accepted trial samples are insufficient, they can be recycled as initial states to form more unbiased samples. This strategy can greatly improve efficiency when the original potential has multiple metastable states separated by large barriers. We apply PDS to the 2D Ising model and a double-well potential model with a large barrier, demonstrating in these two representative examples that convergence is accelerated by orders of magnitude.
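The trapping behavior that PDS addresses is visible in a plain Metropolis sampler. The sketch below is not PDS itself (the decomposition details are in the paper); it runs the baseline sampler on an assumed deep double-well potential to exhibit the slow mixing:

```python
import numpy as np

rng = np.random.default_rng(4)

beta = 1.0
V = lambda x: (x**2 - 1.0) ** 2 / 0.05   # deep double well, barrier at x = 0

def metropolis(n_steps, step=0.5, x0=-1.0):
    """Plain Metropolis sampler on exp(-beta V): with a large barrier it mixes
    very slowly between the wells -- the failure mode PDS is designed to fix."""
    x, xs = x0, np.empty(n_steps)
    for i in range(n_steps):
        y = x + rng.normal(scale=step)
        dV = V(y) - V(x)
        if dV <= 0.0 or rng.uniform() < np.exp(-beta * dV):
            x = y
        xs[i] = x
    return xs

xs = metropolis(100_000)
# The symmetric target puts mass 0.5 in each well; trapping keeps this near 0.
print("fraction of samples in the right-hand well:", np.mean(xs > 0.0))
```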
Considering the stochastic spatial variation of geotechnical parameters over the slope, a Stochastic Finite Element Method (SFEM) is established based on the combination of the Shear Strength Reduction (SSR) concept and quasi-Monte Carlo simulation. The shear strength reduction FEM is superior in many ways to the slice method based on limit equilibrium theory, so it is more powerful for assessing the reliability of global slope stability when combined with probability theory. To illustrate its performance, the proposed method is applied to an example of a simple slope. The simulation results show that the proposed method is effective for the reliability analysis of global slope stability without presupposing a potential slip surface.
In this project, we consider obtaining Fourier features via more efficient sampling schemes to approximate the kernel in latent force models (LFMs). An LFM is a Gaussian process whose covariance functions follow an exponentiated quadratic (EQ) form, and the solutions for the cross-covariance are expensive due to their computational complexity. To reduce the complexity of the mathematical expressions, random Fourier features (RFF) are applied to approximate the EQ kernel. Random Fourier features are usually implemented with Monte Carlo sampling, but this project proposes replacing the Monte Carlo method with the quasi-Monte Carlo (QMC) method. Experiments on first-order and second-order models demonstrate a decrease in NLPD and NMSE, revealing that models with the QMC approximation perform better.
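The MC-to-QMC swap amounts to replacing Gaussian frequency draws with low-discrepancy points pushed through the Gaussian inverse CDF. A sketch, assuming a plain EQ/RBF kernel with unit lengthscale and a scrambled Sobol sequence (the LFM cross-covariance machinery is omitted):

```python
import numpy as np
from scipy.stats import norm, qmc

rng = np.random.default_rng(5)
d, D, ell = 2, 256, 1.0   # input dim, number of features, EQ lengthscale

def eq_kernel(X, Y):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * ell**2))

def fourier_features(X, W, b):
    # phi(x) = sqrt(2/D) cos(W x + b), so E[phi(x) . phi(y)] = k(x, y).
    return np.sqrt(2.0 / D) * np.cos(X @ W.T + b)

# Monte Carlo frequencies: W ~ N(0, I / ell^2), the EQ spectral density.
W_mc = rng.normal(scale=1.0 / ell, size=(D, d))

# Quasi-Monte Carlo frequencies: scrambled Sobol points, inverse-CDF transformed.
sobol = qmc.Sobol(d, scramble=True, seed=5).random(D)   # D = 2^8 points
W_qmc = norm.ppf(sobol) / ell

b = rng.uniform(0.0, 2.0 * np.pi, size=D)
X = rng.normal(size=(50, d))
K = eq_kernel(X, X)
for name, W in [("MC", W_mc), ("QMC", W_qmc)]:
    Phi = fourier_features(X, W, b)
    print(name, "max |K - Phi Phi^T| =", np.abs(K - Phi @ Phi.T).max())
```

The QMC frequencies typically give a noticeably smaller worst-case kernel approximation error for the same number of features.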
It is a challenge in field sampling to reconcile statistical requirements with logistical constraints when explicitly estimating macrobenthos species richness in heterogeneous intertidal wetlands. To address this problem, this study designed an optimal, efficient, and practical sampling strategy by focusing on the three main parts of the entire process (optimizing the sampling method, determining the minimum sampling effort, and exploring the proper sampling interval) in a typical intertidal wetland of the Changjiang (Yangtze) Estuary, China. Transect sampling was selected and optimized by stratification based on pronounced habitat types (tidal flat, tidal creek, salt marsh vegetation); this design is also termed within-transect stratification sampling. The optimal sampling intervals and the minimum sampling effort were determined by two numerical methods: Monte Carlo simulations and species accumulation curves. The results show that within-transect stratification sampling with typical habitat types was effective, encompassing 81% of the species, suggesting that this design can greatly reduce sampling effort and labor. The optimal sampling intervals and minimum sampling efforts for the three habitats were: at least 1.8 m^2 at 10 m intervals in the salt marsh vegetation, 2 m^2 at 10 m intervals on the tidal flat, and 3 m^2 at 1 m intervals in the tidal creek habitat. These differences appear to be influenced by the mobility range of the dominant species and the habitats' physical differences (e.g., tidal water, substrate, vegetation cover). The optimized sampling strategy provides good precision in estimating macrobenthos richness while balancing the sampling effort, and the conclusions presented here offer recommendations to consider before macrobenthic surveys take place in estuarine wetlands. The strategy, focusing on the three key parts of the sampling design, performed well in operation and can serve as a guide for field sampling in habitat management or ecosystem assessment.
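The species-accumulation step can be sketched generically: average cumulative richness over random orderings of sampling units and read off the effort that reaches a coverage target. The presence matrix below is synthetic, not the Changjiang data:

```python
import numpy as np

rng = np.random.default_rng(9)

# Hypothetical presence/absence matrix: rows = sampling units, cols = species.
n_units, n_species = 60, 40
abundance = rng.poisson(lam=rng.gamma(0.5, 2.0, size=n_species),
                        size=(n_units, n_species))
presence = abundance > 0

def accumulation_curve(presence, n_perm=500):
    """Monte Carlo species-accumulation curve: average the cumulative species
    count over random orderings of the sampling units."""
    n = presence.shape[0]
    curves = np.empty((n_perm, n))
    for i in range(n_perm):
        order = rng.permutation(n)
        seen = np.cumsum(presence[order], axis=0) > 0   # species seen so far
        curves[i] = seen.sum(axis=1)
    return curves.mean(axis=0)

curve = accumulation_curve(presence)
target = 0.81 * presence.any(axis=0).sum()   # 81% of observed species
print("units needed for 81% of species:", int(np.argmax(curve >= target)) + 1)
```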
A composite random variable is a product (or sum of products) of statistically distributed quantities. Such a variable can represent the solution to a multi-factor quantitative problem submitted to a large, diverse, independent, anonymous group of non-expert respondents (the "crowd"). The objective of this research is to examine the statistical distribution of solutions from a large crowd to a quantitative problem involving image analysis and object counting. Theoretical analysis by the author, covering a range of conditions and types of factor variables, predicts that composite random variables are distributed log-normally to an excellent approximation; if the factors in a problem are themselves distributed log-normally, then their product is rigorously log-normal. A crowdsourcing experiment devised by the author and implemented with the assistance of a BBC (British Broadcasting Corporation) television show yielded a sample of approximately 2000 responses consistent with a log-normal distribution, with a sample mean within ~12% of the true count. However, a Monte Carlo simulation (MCS) of the experiment, employing either normal or log-normal random variables as factors to model the processes by which a crowd of 1 million might arrive at their estimates, resulted in a visually perfect log-normal distribution with a mean response within ~5% of the true count. The results of this research suggest that a well-modeled MCS, by simulating a sample of responses from a large, rational, and incentivized crowd, can provide a more accurate solution to a quantitative problem than might be attainable by direct sampling of a smaller crowd, or of an uninformed crowd that guesses randomly, irrespective of size.
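The central claim, that a product of log-normal factors is exactly log-normal, can be checked in a few lines of Monte Carlo with assumed factor parameters:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 1_000_000

# Assumed log-normal factors of a composite variable (e.g. density x area x
# visibility correction in an object-counting problem).
params = [(2.0, 0.4), (1.0, 0.3), (0.0, 0.2)]   # (log-mean, log-sigma) pairs
composite = np.prod([rng.lognormal(m, s, size=n) for m, s in params], axis=0)

# log(product) = sum of independent normals, hence exactly normal:
logs = np.log(composite)
print("log-mean:", logs.mean(), "(theory 3.0)")
print("log-std :", logs.std(), "(theory", np.sqrt(sum(s**2 for _, s in params)), ")")
```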
The reliability and sensitivity analyses of a stator blade regulator usually involve complex characteristics such as high nonlinearity, multiple failure regions, and small failure probability, which lead to unacceptable computational efficiency and accuracy for current analysis methods. In this case, by fitting the implicit limit state function (LSF) with an active Kriging (AK) model and reducing the candidate sample pool with adaptive importance sampling (AIS), a novel AK-AIS method is proposed. Herein, the AK model and Markov chain Monte Carlo (MCMC) are first established to identify the most probable failure region(s) (MPFRs), and the adaptive kernel density estimation (AKDE) importance sampling function is constructed to select the candidate samples. With the best samples sequentially attained from the reduced candidate pool and employed to update the Kriging-fitted LSF, the failure probability and sensitivity indices are acquired at a lower cost. The proposed method is verified by two multi-failure numerical examples and then applied to the reliability and sensitivity analyses of a typical stator blade regulator. Method comparisons prove that the proposed AK-AIS holds computational advantages in accuracy and efficiency for complex reliability and sensitivity analysis problems.
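The active-Kriging ingredient can be sketched with a standard AK-MCS-style loop (U learning function and a crude Monte Carlo candidate pool standing in for the paper's AIS pool). The limit state, pool size, and stopping threshold below are all assumptions for illustration:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(10)

# Assumed cheap stand-in for the implicit LSF: failure when g(x) < 0.
def g(X):
    return 5.0 - X[:, 0] ** 2 - 2.0 * np.sin(X[:, 1]) - X[:, 1]

pool = rng.normal(size=(5_000, 2))   # candidate pool (crude MC here; the paper
                                     # builds it with adaptive importance sampling)
train_idx = list(rng.choice(len(pool), 12, replace=False))
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)

for _ in range(40):
    X_tr = pool[train_idx]
    gp.fit(X_tr, g(X_tr))
    mu, sigma = gp.predict(pool, return_std=True)
    U = np.abs(mu) / np.maximum(sigma, 1e-12)   # U learning function
    if U.min() > 2.0:                           # stop: sign-error prob < Phi(-2)
        break
    train_idx.append(int(np.argmin(U)))         # add the most ambiguous candidate

print("AK estimate of P_f on the pool:", np.mean(mu < 0.0),
      "  direct evaluation:", np.mean(g(pool) < 0.0))
```

Only the small training set ever calls the (in practice expensive) limit state; the failure probability is read off the fitted surrogate over the whole pool.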
Most maintenance optimization models in condition-based maintenance (CBM) consider a cost-optimal criterion, but few papers have dealt with availability maximization for maintenance applications. A novel optimal Bayesian control approach is presented for maintenance decision making. The system deterioration evolves as a three-state continuous-time hidden semi-Markov process. Considering the optimal maintenance policy, a multivariate Bayesian control scheme based on the hidden semi-Markov model (HSMM) is developed; the objective is to maximize the long-run expected average availability per unit time. The proposed approach can optimize the sampling interval and control limit jointly. A case study using Markov chain Monte Carlo (MCMC) simulation is provided, and a comparison with the Bayesian control scheme based on the hidden Markov model (HMM), the age-based replacement policy, Hotelling's T², multivariate exponentially weighted moving average (MEWMA), and multivariate cumulative sum (MCUSUM) control charts illustrates the effectiveness of the proposed method.
This paper deals with Bayesian inference and prediction problems for the Burr type XII distribution based on progressive first-failure censored data. We consider Bayesian inference under a squared error loss function. We propose to apply a Gibbs sampling procedure to draw Markov chain Monte Carlo (MCMC) samples, which are in turn used to compute the Bayes estimates with the help of an importance sampling technique. We perform a simulation study to compare the proposed Bayes estimators with the maximum likelihood estimators. We further consider two-sample Bayes prediction of future order statistics and upper record values from the Burr type XII distribution based on progressive first-failure censored data. The predictive densities are obtained and used to determine prediction intervals for unobserved order statistics and upper record values. A real-life data set is used to illustrate the results derived.
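For a complete (uncensored) Burr XII sample, the Gibbs structure is simple: the shape parameter k has a conjugate gamma conditional, and c can take a Metropolis step. A sketch under those simplifying assumptions, with an assumed Gamma(1, 1) prior on k and a flat prior on c (the paper additionally handles progressive first-failure censoring and importance sampling):

```python
import numpy as np

rng = np.random.default_rng(12)

# Synthetic complete Burr XII sample via inverse CDF: F(x) = 1 - (1 + x^c)^(-k).
c_true, k_true, n = 2.0, 1.5, 100
x = (rng.uniform(size=n) ** (-1.0 / k_true) - 1.0) ** (1.0 / c_true)

def log_lik(c, k):
    return (n * np.log(c * k) + (c - 1.0) * np.log(x).sum()
            - (k + 1.0) * np.log1p(x**c).sum())

a, b = 1.0, 1.0            # Gamma(a, b) prior on k (an assumption)
c, k, draws = 1.0, 1.0, []
for it in range(6_000):
    # Conjugate conditional: k | c, data ~ Gamma(a + n, rate b + sum log(1 + x^c)).
    k = rng.gamma(a + n, 1.0 / (b + np.log1p(x**c).sum()))
    # Random-walk Metropolis step on c (reflected at zero, symmetric proposal).
    c_prop = abs(c + rng.normal(scale=0.1))
    dl = log_lik(c_prop, k) - log_lik(c, k)
    if dl >= 0.0 or rng.uniform() < np.exp(dl):
        c = c_prop
    if it >= 1_000:        # discard burn-in
        draws.append((c, k))
print("posterior-mean (squared-error) Bayes estimates:", np.mean(draws, axis=0))
```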
Based on the observation of importance sampling and second-order information about the failure surface of a structure, an importance sampling region is defined in V-space, which is obtained by rotating U-space at the point of maximum likelihood. The sampling region is a hyper-ellipsoid consisting of the sampling ellipse on each plane of principal curvature in V-space. Thus, the sampling probability density function can be constructed from the sampling region's center and the ellipsoid's axes. Several examples show the efficiency and generality of this method.
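The standard design-point importance sampling that this hyper-ellipsoidal construction refines can be sketched with a linear limit state, where the exact answer is known:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)
n, beta = 10_000, 4.0
shift = np.array([beta, 0.0])            # design point of the linear limit state

# Failure when u1 > beta in standard-normal space, so P_f = Phi(-beta) exactly.
u = rng.normal(size=(n, 2)) + shift      # sample from N(design point, I)
fails = u[:, 0] > beta
# Likelihood ratio of nominal to importance density (normalizing constants cancel).
w = np.exp(-0.5 * (u**2).sum(1) + 0.5 * ((u - shift) ** 2).sum(1))
print("IS estimate:", np.mean(fails * w), "  exact:", norm.cdf(-beta))
```

Crude Monte Carlo with the same 10,000 samples would almost surely observe zero failures at this probability level (~3×10⁻⁵); centering the sampling density at the design point makes roughly half the samples informative.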
In this paper, an importance sampling maximum likelihood (ISML) estimator for the direction-of-arrival (DOA) of incoherently distributed (ID) sources is proposed. Starting from the maximum likelihood estimation description of the uniform linear array (ULA), a decoupled concentrated likelihood function (CLF) is presented. A new objective function based on the CLF, whose global maximum admits a closed-form solution, is constructed according to Pincus's theorem. To obtain the optimal value of the objective function, which is a complex high-dimensional integral, we propose an importance sampling approach based on Monte Carlo random calculation. Next, an importance function is derived, which simplifies the problem of generating a random vector from a high-dimensional probability density function (PDF) to generating a random variable from a one-dimensional PDF. Compared with existing maximum likelihood (ML) algorithms for DOA estimation of ID sources, the proposed algorithm does not require initial estimates, and its performance is closer to the Cramér-Rao lower bound (CRLB). The proposed algorithm outperforms existing methods when the interval between the sources to be estimated is small and in low signal-to-noise ratio (SNR) scenarios.
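Pincus's identity evaluated by Monte Carlo importance sampling is the core trick: the global maximizer of f is the limit, as rho grows, of the e^(rho f)-weighted mean of x. A one-dimensional sketch with an assumed two-bump objective and a uniform proposal:

```python
import numpy as np

rng = np.random.default_rng(8)

# Assumed objective: global maximum at x = 2, a lower local maximum at x = -2.
f = lambda x: np.exp(-0.5 * (x - 2.0) ** 2) + 0.8 * np.exp(-0.5 * (x + 2.0) ** 2)

# Pincus: argmax f = lim_{rho -> inf} E[x e^{rho f(x)}] / E[e^{rho f(x)}],
# here evaluated by importance sampling from a broad uniform proposal.
rho, n = 200.0, 200_000
x = rng.uniform(-6.0, 6.0, size=n)          # proposal q = U(-6, 6)
w = np.exp(rho * (f(x) - f(x).max()))       # stabilized weights (q is constant)
print("Pincus estimate of argmax:", np.sum(x * w) / np.sum(w))  # ~2.0
```

Because the weights concentrate exponentially on the global maximum, the lower bump at x = -2 contributes a factor of order e^(-40) and is effectively ignored, with no initial estimate required.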
Although the exact sampling algorithm is known to be easy to construct and less sensitive to noise, its sample distribution deviates from the target state distribution due to the locally dependent coupling problem. A new algorithm, exact sampling with directional threshold (ES-DT), is introduced. Its main advantage over the traditional exact sampling algorithm is that it can control the sampling with a rejection strategy in the Markov chain during path growth, closely approaching the ideal distribution while maintaining the target density. Simulation experiments show the effectiveness of the proposed algorithm.
Spatial heterogeneity is an inherent characteristic of natural forest landscapes; therefore, estimating structural variability, including the collection and analysis of field measurements, is a growing challenge for monitoring wildlife habitat diversity and ecosystem sustainability. In this study, we investigated the combined influence of plot shape and size on the accuracy of assessment of conventional and rare structural features in two young-growth spruce-dominated forests in northwestern China, using a series of inventory schemes and analytical approaches. Our data showed that choices in sampling protocols, especially the selection of plot size for measuring structural attributes, dramatically affect the minimum number of plots required to meet a given accuracy criterion. The degree of influence of plot shape is related to the survey objectives; thus, the effects of plot shape differ between evaluations of the "mean" or "representative" stand structural conditions and evaluations of the range of habitat (in extreme values). Results of Monte Carlo simulations suggested that plot sizes <0.1 ha could be the most efficient for sampling conventional characteristics (features with relative constancy within a site, such as stem density), whereas plots of 0.25 ha or larger may have a greater likelihood of capturing rare structural attributes (features with high randomness and spatial heterogeneity, such as the volume of coarse woody debris) in our forest type. These findings have important implications for choosing a sampling protocol (plot size and shape) that adequately captures information on forest habitat structure and diversity; such efforts must be based on a clear definition of which structural attributes are to be measured.