Funding: Supported by the National High Technology Research and Development Programme of China (No. 2009AA04Z406) and the National Natural Science Foundation of China (No. 61172083).
Abstract: Since missiles are the main threat to aircraft in air warfare, a model is proposed for calculating aircraft survivability against a missile. The characteristics of a missile hit on an aircraft are analyzed, and the Monte Carlo method is then applied to generate missile detonation locations according to their distribution. In addition, based on an analysis of fragment trajectories and critical components, the intersection points of the two are determined. The kill probability of each critical component due to a fragment can then be calculated, and the aircraft survivability against the missile is obtained accordingly. Finally, the feasibility of the proposed method is demonstrated. Simulation results show that the method captures the basic effects of missile detonation location on aircraft survivability, which may provide an effective reference for aircraft survivability research.
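A minimal sketch of the sampling loop this abstract describes, not the authors' model: detonation points are drawn from an assumed isotropic Gaussian, the components, their radii, and the per-hit kill probabilities are made-up placeholders, and a component kill is assumed to down the aircraft independently of the others.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical critical components: centre (m), radius (m), and kill probability
# of the component given one fragment hit -- all assumed values for illustration.
COMPONENTS = [
    {"name": "pilot",     "centre": np.array([ 2.0, 0.0, 0.5]), "radius": 0.5, "pk_hit": 0.9},
    {"name": "fuel_tank", "centre": np.array([-1.0, 0.0, 0.0]), "radius": 1.0, "pk_hit": 0.6},
]

def fragment_hits(detonation, component, n_fragments=200):
    """Count fragments whose straight-line trajectories intersect a spherical component."""
    directions = rng.normal(size=(n_fragments, 3))
    directions /= np.linalg.norm(directions, axis=1, keepdims=True)
    rel = component["centre"] - detonation                       # vector from burst point to component centre
    t = directions @ rel                                         # distance along each ray to the closest point
    closest = np.linalg.norm(rel - t[:, None] * directions, axis=1)
    return np.sum((t > 0) & (closest < component["radius"]))

def survival_probability(n_trials=5000):
    total = 0.0
    for _ in range(n_trials):
        # Assumed detonation-point distribution: isotropic Gaussian, 5 m standard deviation.
        detonation = rng.normal(scale=5.0, size=3)
        p_survive = 1.0
        for comp in COMPONENTS:
            hits = fragment_hits(detonation, comp)
            p_comp_killed = 1.0 - (1.0 - comp["pk_hit"]) ** hits
            p_survive *= 1.0 - p_comp_killed                     # assume component kills are independent
        total += p_survive
    return total / n_trials

print(f"estimated survivability: {survival_probability():.3f}")
```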
Abstract: Mathematical modeling followed by computer simulation is an effective approach to the theoretical study of emulsion polymerization and the scale-up of its reactors. In this work, the Monte Carlo method has been used to simulate the nucleation stage of emulsion polymerization. The effects of emulsifier concentration [S] and initiator concentration [I] on parameters such as the number of particles (N_p), the average diameter of the latex particles (D_p), monomer conversion (x), and the average radical number per particle (n) have been studied. The quantitative relations obtained between [S], [I], and N_p agree fully with the classical Smith-Ewart theory.
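A toy Monte Carlo sketch of micellar nucleation, not the authors' model: radicals are generated at a rate proportional to [I] and are absorbed either by micelles (creating a new particle) or by existing particles, with weights standing in for relative capture areas. The rate constant, the micelle-per-[S] factor, and the particle capture weight are all assumed placeholders.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_nucleation(S, I, n_steps=20000, k_gen=0.05, micelles_per_S=1e4):
    """Toy competitive-absorption Monte Carlo: each generated radical either
    enters a micelle (nucleating a new latex particle) or an existing particle."""
    micelles = micelles_per_S * S          # assumed: micelle number proportional to [S]
    particles = 0
    for _ in range(n_steps):
        if rng.random() < k_gen * I:       # a radical is generated this step (rate ~ [I])
            w_micelle = micelles           # absorption weights ~ (assumed) capture areas
            w_particle = 5.0 * particles   # grown particles capture radicals more efficiently
            if rng.random() < w_micelle / (w_micelle + w_particle + 1e-12):
                particles += 1             # micellar nucleation: new latex particle
                micelles -= 1              # emulsifier consumed to stabilise it
    return particles

for S in (0.5, 1.0, 2.0):
    for I in (0.5, 1.0):
        print(f"[S]={S}, [I]={I} -> N_p ~ {simulate_nucleation(S, I)}")
```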
Abstract: This paper presents a simulation study of an evolutionary algorithm, Particle Swarm Optimization (PSO), used to optimize the likelihood function of an ARMA(1,1) model; since maximizing the likelihood is equivalent to maximizing its logarithm, the objective function is the log-likelihood. The Monte Carlo method was adopted to design and implement the simulation experiments. The study compares three versions of the PSO algorithm: constriction-coefficient (CCPSO), inertia-weight (IWPSO), and fully informed (FIPSO). The experiments were designed by setting different values of the model parameters a1 and b1, the sample size n, and the parameters of the PSO algorithms. The MSE was used as the test statistic to measure the efficiency of PSO in estimating the model. The results show that PSO is able to estimate the ARMA parameters, with the minimum MSE values obtained by CCPSO.
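A minimal sketch of the inertia-weight variant (IWPSO) maximizing a conditional Gaussian log-likelihood of an ARMA(1,1) series; the swarm size, inertia, acceleration constants, and true parameter values below are illustrative defaults, not the settings used in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

def arma11_loglik(params, x):
    """Conditional Gaussian log-likelihood of x_t = a1*x_{t-1} + e_t + b1*e_{t-1}."""
    a1, b1 = params
    e = np.zeros_like(x)
    for t in range(1, len(x)):
        e[t] = x[t] - a1 * x[t - 1] - b1 * e[t - 1]
    n = len(x) - 1
    sigma2 = np.mean(e[1:] ** 2)
    return -0.5 * n * (np.log(2 * np.pi * sigma2) + 1.0)

def iwpso(obj, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5):
    """Inertia-weight PSO maximizing obj over (a1, b1) in (-0.99, 0.99)^2."""
    pos = rng.uniform(-0.9, 0.9, size=(n_particles, 2))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.array([obj(p) for p in pos])
    gbest = pbest[np.argmax(pbest_val)].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, -0.99, 0.99)          # keep within stationarity/invertibility box
        vals = np.array([obj(p) for p in pos])
        improved = vals > pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmax(pbest_val)].copy()
    return gbest, pbest_val.max()

# Monte Carlo style check: simulate one ARMA(1,1) series and recover its parameters.
a1_true, b1_true, n = 0.6, 0.3, 500
e = rng.normal(size=n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = a1_true * x[t - 1] + e[t] + b1_true * e[t - 1]

est, ll = iwpso(lambda p: arma11_loglik(p, x))
print(f"estimated a1={est[0]:.3f}, b1={est[1]:.3f}, log-likelihood={ll:.1f}")
```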
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 11101276 and 91130012) and by the Alexander von Humboldt Foundation through a research stay at the Institute of Computational Physics, University of Stuttgart.
Abstract: Monte Carlo computer simulation is an important tool for investigating the function and equilibrium properties of many biological and soft-matter materials dissolved in solvents. Appropriate treatment of the long-range electrostatic interaction is essential for these charged systems, but it remains a challenging problem for large-scale simulations. We develop an efficient Barnes-Hut treecode algorithm for electrostatic evaluation in Monte Carlo simulations of Coulomb many-body systems. The algorithm is based on a divide-and-conquer strategy and a fast update of the octree data structure in each trial move through a local adjustment procedure. We test the accuracy of the tree algorithm and use it to perform computer simulations of the electric double layer near a spherical interface. It is shown that the computational cost of the Monte Carlo method with treecode acceleration scales as log N per move. For a typical system with ten thousand particles, the new algorithm improves the speed by two orders of magnitude over direct summation.
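A stripped-down, monopole-only Barnes-Hut sketch of the far-field idea behind the treecode: well-separated octree cells are replaced by their total charge at the charge-weighted centre. The paper's algorithm additionally maintains and locally updates the octree across trial moves, which is omitted here; the opening angle, unit charges, and box size are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

class Node:
    """Octree node carrying total charge and charge-weighted centre (monopole only)."""
    def __init__(self, centre, half):
        self.centre, self.half = centre, half
        self.charge, self.cofc, self.children = 0.0, np.zeros(3), None

def build(indices, pos, q, centre, half):
    node = Node(centre, half)
    node.charge = q[indices].sum()
    w = np.abs(q[indices])
    node.cofc = (pos[indices] * w[:, None]).sum(0) / max(w.sum(), 1e-300)
    if len(indices) <= 1:                              # leaf: a single particle
        return node
    node.children = []
    for ox in (-1, 1):
        for oy in (-1, 1):
            for oz in (-1, 1):
                child_centre = centre + np.array([ox, oy, oz]) * half / 2
                mask = np.all((pos[indices] >= child_centre - half / 2) &
                              (pos[indices] <  child_centre + half / 2), axis=1)
                if mask.any():
                    node.children.append(build(indices[mask], pos, q, child_centre, half / 2))
    return node

def potential_at(node, r, theta=0.5):
    """Coulomb potential at probe point r; open a cell only if it is too close."""
    d = np.linalg.norm(r - node.cofc)
    if node.children is None or (2 * node.half) / d < theta:
        return node.charge / d                         # well separated: monopole approximation
    return sum(potential_at(c, r, theta) for c in node.children)

# Accuracy check against direct summation for random unit charges in a unit box.
N = 2000
pos = rng.random((N, 3))
q = rng.choice([-1.0, 1.0], size=N)
root = build(np.arange(N), pos, q, centre=np.full(3, 0.5), half=0.5)
probe = np.array([0.51, 0.49, 0.52])
direct = sum(q[i] / np.linalg.norm(probe - pos[i]) for i in range(N))
print(f"direct={direct:.4f}  treecode={potential_at(root, probe):.4f}")
```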
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 51234007 and 51404291), the Program for Changjiang Scholars and Innovative Research Team in University (Grant No. IRT1294), and the Introducing Talents of Discipline to Universities program (Grant No. B08028).
Abstract: Two-phase flow in two digital cores is simulated by the color-gradient lattice Boltzmann method. The model can be applied to two-phase flow with a high density ratio (on the order of 1000). The first digital core is an artificial sandstone core, whose three-dimensional gray-scale model is obtained by micro-CT scanning; the gray-scale images are segmented into discrete phases (solid particles and pore space) by the Otsu algorithm. The second is a digital core of shale, reconstructed using a Markov Chain Monte Carlo method with a segmented SEM image as input. The wettability of the solid wall and the relative permeability of a cylindrical tube are simulated to verify the model. In the simulations of liquid-gas two-phase flow in the digital cores, density ratios of 100, 200, 500, and 1000 between liquid and gas are chosen. Based on the gas distribution in the digital cores at different times, the fingering phenomenon is found to be more pronounced at high density ratios, and the displacement efficiency decreases as the density ratio increases. Moreover, owing to the numerous small pores in the shale, its displacement efficiency is more than 20% lower than that of the artificial sandstone, and the difference reaches about 30% when the density ratio exceeds 500. As the density ratio increases, the gas saturation decreases in large pores and even drops to zero in some small pores or in large pores with small throats. Residual liquid is distributed mainly in the small pores and at the edges of large pores because of the wettability of the liquid. Liquid recovery can be enhanced effectively by decreasing its viscosity.
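The shale core in this paper is rebuilt in three dimensions from a segmented SEM image by a Markov Chain Monte Carlo method. A much simplified two-dimensional Markov-mesh sketch of that reconstruction idea is shown below; the "training image" here is a random placeholder standing in for a real segmented SEM slice, and the neighbourhood (left and upper pixels only) is an assumption.

```python
import numpy as np

rng = np.random.default_rng(4)

def reconstruct_markov(training, shape):
    """Toy 2-D Markov-mesh reconstruction: estimate P(pore | left, upper neighbours)
    from a segmented training image, then sample a new binary image pixel by pixel."""
    counts = np.zeros((2, 2, 2))
    for i in range(1, training.shape[0]):
        for j in range(1, training.shape[1]):
            counts[training[i, j - 1], training[i - 1, j], training[i, j]] += 1
    p_pore = counts[..., 1] / np.maximum(counts.sum(-1), 1)
    porosity = training.mean()
    out = np.zeros(shape, dtype=int)
    out[0, :] = rng.random(shape[1]) < porosity        # seed first row/column from bulk porosity
    out[:, 0] = rng.random(shape[0]) < porosity
    for i in range(1, shape[0]):
        for j in range(1, shape[1]):
            out[i, j] = rng.random() < p_pore[out[i, j - 1], out[i - 1, j]]
    return out

# Hypothetical segmented training image (1 = pore, 0 = solid), standing in for an SEM slice.
training = (rng.random((128, 128)) < 0.2).astype(int)
core = reconstruct_markov(training, (256, 256))
print(f"training porosity = {training.mean():.3f}, reconstructed porosity = {core.mean():.3f}")
```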
Funding: Supported by the Environmental Protection Public Welfare Project of China (Grant No. 201309008).
Abstract: In recent years, environmental pressures along coasts have increased substantially. Classifying estuaries according to their susceptibility to eutrophication under nutrient load is a useful way to set management priorities and enforce control measures. Using historical monitoring data from 2007 to 2012 covering 65 estuaries, including 101 estuarine monitoring sections and 260 coastal monitoring stations, a nutrient-driven phytoplankton dynamic model was developed based on the relationships among phytoplankton biomass, Total Nitrogen (TN) load, and the physical features of estuaries. The ecological filter effect of estuaries was quantified by introducing conversion-efficiency parameters into the model. A Markov Chain Monte Carlo algorithm for Bayesian inference was then employed to estimate the model parameters. The developed model fitted the observed chlorophyll, primary production, grazing, and sinking rates well. The analysis suggests that an estuary with Q/V (the ratio of river flow to estuarine volume) greater than 2.0 per year and e (conversion efficiency ratio) less than 1.0 g C/g N can be classified as less susceptible to TN load; one with Q/V between 0.7 and 2.0 per year and e between 1.0 and 3.0 g C/g N as moderately susceptible; and one with e greater than 3.0 g C/g N as very susceptible. Estuaries with Q/V less than 0.7 per year vary greatly in their susceptibility. Estuaries with high or moderate susceptibility accounted for 67% of all analyzed estuaries; they carry relatively high eutrophication risks and should be the focus of environmental supervision and pollution prevention.
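The thresholds quoted in the abstract can be written as a simple classification rule; combinations the abstract does not cover are flagged rather than guessed.

```python
def classify_susceptibility(q_over_v: float, e: float) -> str:
    """Classify an estuary's susceptibility to TN load using the thresholds
    quoted in the abstract (Q/V in 1/year, e in g C/g N)."""
    if q_over_v < 0.7:
        return "variable (Q/V < 0.7 per year: susceptibility differs case by case)"
    if e > 3.0:
        return "very susceptible"
    if q_over_v > 2.0 and e < 1.0:
        return "less susceptible"
    if 0.7 <= q_over_v <= 2.0 and 1.0 <= e <= 3.0:
        return "moderately susceptible"
    return "not covered by the quoted thresholds"

print(classify_susceptibility(2.5, 0.8))   # -> less susceptible
print(classify_susceptibility(1.2, 2.0))   # -> moderately susceptible
```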
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 41675005, 91537290, and 41275008) and the Basic Research Fund of the Chinese Academy of Meteorological Sciences (Grant Nos. 2016Z002 and 2015Z006).
Abstract: In recent years, locating total lightning in the VLF/LF band has become one of the most important directions in lightning detection. The Low-frequency E-field Detection Array (LFEDA), consisting of nine fast antennas, was developed by the Chinese Academy of Meteorological Sciences in Guangzhou between 2014 and 2015. This paper documents the composition of the LFEDA and a lightning-locating algorithm that applies to the low-frequency electric field radiated by lightning pulse discharge events (LPDEs). Theoretical simulation and objective assessment of the accuracy and detection efficiency of the LFEDA were carried out using Monte Carlo simulation and artificially triggered lightning experiments, respectively. The former shows that having a station with a comparatively long baseline in the network improves both the horizontal location accuracy perpendicular to the baseline and the vertical location accuracy along it. The latter shows detection efficiencies of 100% for triggered lightning flashes and 95% for return strokes, with an average planar location error of 102 m for the return strokes of triggered flashes. By locating LPDEs in thunderstorms, we find that they coincide with convective regions indicated by strong reflectivity columns and show a reasonable distribution in the vertical direction. In addition, the LFEDA can image lightning development by mapping lightning channels. Based on the three-dimensional locations, the vertical propagation speed of the preliminary breakdown and the changing trend of the leader speed in an intra-cloud flash and a cloud-to-ground flash are investigated. The results show that the LFEDA is capable of three-dimensional lightning location, which provides a new technique for studying lightning development characteristics and thunderstorm electricity.
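A minimal sketch of the kind of Monte Carlo accuracy assessment mentioned above, not the paper's algorithm or network: a hypothetical planar station layout, Gaussian timing noise of an assumed size, and a plain least-squares time-of-arrival fit are used to estimate the mean planar location error for one source position.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(5)
C = 299792458.0  # speed of light, m/s

# Hypothetical nine-station planar layout in metres; the real LFEDA geometry is not given here.
stations = np.array([[0, 0], [10000, 0], [0, 10000], [10000, 10000], [5000, 5000],
                     [5000, 15000], [-6000, 5000], [15000, 5000], [5000, -6000]], float)

def locate(arrival_times):
    """Least-squares planar fit for source position (x, y) and emission time t0."""
    def residuals(p):
        x, y, t0 = p
        d = np.hypot(stations[:, 0] - x, stations[:, 1] - y)
        return arrival_times - (t0 + d / C)
    p0 = np.array([stations[:, 0].mean(), stations[:, 1].mean(), arrival_times.min()])
    return least_squares(residuals, p0, x_scale=[1e3, 1e3, 1e-6]).x[:2]

def mean_location_error(source, timing_sigma=1e-7, n_trials=500):
    """Monte Carlo estimate of mean planar location error under Gaussian timing noise."""
    true_times = np.hypot(stations[:, 0] - source[0], stations[:, 1] - source[1]) / C
    errors = []
    for _ in range(n_trials):
        noisy = true_times + rng.normal(scale=timing_sigma, size=len(stations))
        errors.append(np.hypot(*(locate(noisy) - source)))
    return np.mean(errors)

print(f"mean planar error: {mean_location_error(np.array([3000.0, 4000.0])):.1f} m")
```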