Funding: Projects (42177164, 52474121) supported by the National Natural Science Foundation of China; Project (PBSKL2023A12) supported by the State Key Laboratory of Precision Blasting and Hubei Key Laboratory of Blasting Engineering, China.
Abstract: In the mining industry, precise forecasting of rock fragmentation is critical for optimizing blasting processes. In this study, we address the challenge of enhancing rock fragmentation assessment by developing a novel hybrid predictive model named GWO-RF. This model combines the grey wolf optimization (GWO) algorithm with the random forest (RF) technique to predict the D_(80) value, a critical parameter in evaluating rock fragmentation quality. The study is conducted using a dataset from Sarcheshmeh Copper Mine, employing six different swarm sizes for the GWO-RF hybrid model construction. The GWO-RF model's hyperparameters are systematically optimized within established bounds, and its performance is rigorously evaluated using multiple evaluation metrics. The results show that the GWO-RF hybrid model achieves higher predictive accuracy, exceeding traditional models. Furthermore, the interpretability of the GWO-RF model is enhanced through the use of SHapley Additive exPlanations (SHAP) values. The insights gained from this research contribute to optimizing blasting operations and rock fragmentation outcomes in the mining industry.
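The GWO search loop named in this abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the quadratic stand-in objective, bounds, and swarm settings are assumptions; in the paper the objective would be the RF cross-validation error over its hyperparameter bounds.

```python
import numpy as np

def gwo_minimize(objective, bounds, n_wolves=20, n_iter=100, seed=0):
    """Grey wolf optimizer: wolves encircle the three best solutions
    (alpha, beta, delta) and converge as coefficient `a` decays 2 -> 0."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    dim = len(lo)
    X = rng.uniform(lo, hi, size=(n_wolves, dim))
    for t in range(n_iter):
        fitness = np.array([objective(x) for x in X])
        leaders = X[np.argsort(fitness)[:3]]          # alpha, beta, delta
        a = 2 - 2 * t / n_iter                        # linearly decaying coefficient
        for i in range(n_wolves):
            moves = []
            for leader in leaders:
                r1, r2 = rng.random(dim), rng.random(dim)
                A, C = 2 * a * r1 - a, 2 * r2
                D = np.abs(C * leader - X[i])          # distance to the leader
                moves.append(leader - A * D)           # step toward/around it
            X[i] = np.clip(np.mean(moves, axis=0), lo, hi)
    fitness = np.array([objective(x) for x in X])
    return X[np.argmin(fitness)], fitness.min()

# Stand-in objective (hypothetical): a quadratic with minimum at (3, 3),
# playing the role of the RF cross-validation error in the paper.
best, val = gwo_minimize(lambda x: np.sum((x - 3.0) ** 2),
                         np.array([[0.0, 10.0], [0.0, 10.0]]))
```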
Abstract: To avoid the aerodynamic performance loss of an airfoil at non-design states, which often appears in single-point design optimization, and to improve adaptability to the uncertain factors of the actual flight environment, a two-dimensional stochastic airfoil optimization design method based on neural networks is presented. To provide highly efficient and credible analysis, four BP neural networks are built as surrogate models to predict the airfoil aerodynamic coefficients and geometry parameters. These networks are combined with a probability density function obeying a normal distribution and a genetic algorithm, thus forming an optimization design method. Using this method, a stochastic optimization of the GA(W)-2 airfoil is implemented over a two-dimensional flight area spanning Mach number and angle of attack. Compared with the original airfoil and the single-point optimization design airfoil, the results show that the two-dimensional stochastic method improves performance over a specific flight area and increases the airfoil's adaptability to stochastic changes in multiple flight parameters.
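The stochastic objective described above (performance weighted by a normal probability density over Mach number and angle of attack) can be sketched with Monte Carlo sampling, which is an equivalent way to evaluate that expectation. The toy surrogate, the distribution parameters, and the design encoding are all illustrative assumptions standing in for the trained BP networks.

```python
import numpy as np

def stochastic_objective(surrogate, design, mach_mu=0.3, mach_sigma=0.02,
                         aoa_mu=4.0, aoa_sigma=0.5, n_samples=2000, seed=0):
    """Expected aerodynamic performance over a 2-D flight area, with Mach
    number and angle of attack drawn from normal distributions."""
    rng = np.random.default_rng(seed)
    mach = rng.normal(mach_mu, mach_sigma, n_samples)
    aoa = rng.normal(aoa_mu, aoa_sigma, n_samples)
    return np.mean([surrogate(design, m, a) for m, a in zip(mach, aoa)])

# Hypothetical surrogate standing in for the trained BP network: penalizes
# deviation of the operating point from the design's sweet spot.
toy = lambda d, m, a: -(m - d[0]) ** 2 - 0.01 * (a - d[1]) ** 2
robust = stochastic_objective(toy, design=(0.3, 4.0))
```

A genetic algorithm would then maximize `robust` over the design variables, rather than performance at a single design point.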
Funding: Supported by the National Major Science and Technology Project of China on Development of Big Oil-Gas Fields and Coalbed Methane (No. 2008ZX05010-002).
Abstract: Stochastic seismic inversion is the combination of geostatistics and seismic inversion technology, integrating information from seismic records, well logs, and geostatistics into a posterior probability density function (PDF) of subsurface models. The Markov chain Monte Carlo (MCMC) method is used to sample the posterior PDF, and the subsurface model characteristics can be inferred by analyzing a set of posterior PDF samples. In this paper, we first introduce the stochastic seismic inversion theory, then discuss and analyze four key parameters: the seismic data signal-to-noise ratio (S/N), the variogram, the number of posterior PDF samples, and the well density, and propose optimum selections of these parameters. The analysis shows that the seismic data S/N adjusts the compromise between the influence of the seismic data and of the geostatistics on the inversion results, the variogram controls the smoothness of the inversion results, the number of posterior PDF samples determines the reliability of the statistical characteristics derived from the samples, and the well density influences the inversion uncertainty. Finally, a comparison between stochastic seismic inversion and deterministic model-based seismic inversion indicates that stochastic inversion can provide more reliable information about the subsurface character.
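MCMC sampling of a posterior PDF, as used here, can be sketched with a basic Metropolis sampler. The one-dimensional toy posterior below (a Gaussian "seismic misfit" likelihood times a Gaussian "geostatistical" prior on an impedance-like parameter) is an assumption for illustration; a real inversion samples high-dimensional subsurface models.

```python
import numpy as np

def metropolis(log_post, x0, n_samples=5000, step=0.5, seed=0):
    """Metropolis sampler: draws from a posterior PDF known only up to a
    normalizing constant, by accept/reject on a random-walk proposal."""
    rng = np.random.default_rng(seed)
    x, lp = x0, log_post(x0)
    out = []
    for _ in range(n_samples):
        xp = x + rng.normal(0, step)                 # propose a move
        lpp = log_post(xp)
        if np.log(rng.random()) < lpp - lp:          # Metropolis acceptance
            x, lp = xp, lpp
        out.append(x)
    return np.array(out)

# Toy posterior: likelihood centered at the "true" parameter 2.0 (sd 0.3)
# times a broad prior centered at 0 (sd 5.0).
log_post = lambda m: -0.5 * ((m - 2.0) / 0.3) ** 2 - 0.5 * (m / 5.0) ** 2
samples = metropolis(log_post, x0=0.0)[1000:]        # discard burn-in
```

Statistics of `samples` (mean, spread, quantiles) then play the role the abstract assigns to the posterior PDF sample set.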
Abstract: Using a modified Cobb-Douglas (C-D) function and a stochastic frontier model, the paper analyzed China's cotton yield capacity and found that the yield and technical efficiency of China's cotton planting system can be increased by the use of genetically modified (GM) varieties.
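For reference, the Cobb-Douglas specification underlying a stochastic frontier can be written in log form, with technical efficiency derived from the one-sided inefficiency term. The coefficients below are purely illustrative assumptions, not estimates from the paper.

```python
import numpy as np

def cobb_douglas_log(inputs, beta0, betas):
    """Log-form Cobb-Douglas production function, the deterministic part of
    a stochastic frontier: ln y = b0 + sum_i b_i ln x_i + v - u, where v is
    symmetric noise and u >= 0 is technical inefficiency."""
    return beta0 + np.sum(betas * np.log(inputs))

# Technical efficiency implied by an (illustrative) inefficiency term u:
u = 0.25
te = np.exp(-u)     # output achieved relative to the frontier, in (0, 1]

frontier_ln_y = cobb_douglas_log(np.array([1.0, 1.0]), 0.5, np.array([0.3, 0.4]))
```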
Abstract: In this paper, we give an approach for detecting one or more outliers in a randomized linear model. The likelihood ratio test statistic and its distributions under the null hypothesis and the alternative hypothesis are given. Furthermore, the robustness of the test statistic in a certain sense is proved. Finally, the optimality properties of the test are derived.
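A standard construction in this family is the mean-shift outlier test for an ordinary linear model, where the likelihood ratio for "observation i is an outlier" is a monotone function of its externally studentized residual. The sketch below uses that classical version; the paper's statistic for the randomized model is analogous but not identical, and the planted-outlier data are hypothetical.

```python
import numpy as np

def outlier_lrt(y, X):
    """Single-outlier screen for a linear model: returns the index of the
    largest externally studentized residual and the full residual vector."""
    n, p = X.shape
    H = X @ np.linalg.solve(X.T @ X, X.T)         # hat matrix
    e = y - H @ y                                  # ordinary residuals
    s2 = e @ e / (n - p)
    r = e / np.sqrt(s2 * (1 - np.diag(H)))         # internally studentized
    # externally studentized (leave-one-out) residuals
    t = r * np.sqrt((n - p - 1) / (n - p - r ** 2))
    return int(np.argmax(np.abs(t))), t

rng = np.random.default_rng(1)
X = np.column_stack([np.ones(30), rng.normal(size=30)])
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.5, size=30)
y[7] += 5.0                                        # planted outlier
idx, t = outlier_lrt(y, X)
```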
Funding: Projects (51378119, 51578150) supported by the National Natural Science Foundation of China.
Abstract: Advanced traveler information systems (ATIS) can not only improve drivers' access to more accurate route travel time information, but also improve drivers' adaptability to stochastic network capacity degradations. In this paper, a mixed stochastic user equilibrium model was proposed to describe the interactive route choice behaviors of ATIS-equipped and unequipped drivers on a degradable transport network. In the proposed model, the information accessibility of equipped drivers was reflected by a lower degree of uncertainty in their stochastic equilibrium flow distributions, and their behavioral adaptability was captured by multiple equilibrium behaviors over the stochastic network state set. The mixed equilibrium model was formulated as a fixed-point problem defined on the mixed route flows, and its solution was obtained by an iterative algorithm. Numerical experiments were provided to verify the properties of the mixed network equilibrium model and the efficiency of the iterative algorithm.
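The fixed-point formulation and iterative solution can be sketched on a minimal case: a logit stochastic user equilibrium on two parallel routes, solved by the method of successive averages (MSA). The demand, cost functions, and dispersion parameter are illustrative assumptions; the paper's mixed model stacks one such map per driver class and network state.

```python
import numpy as np

def logit_sue_two_routes(demand=10.0, theta=0.5, n_iter=200):
    """MSA fixed-point iteration for a logit stochastic user equilibrium:
    route flows are a fixed point of f = demand * logit_probabilities(cost(f))."""
    t = lambda f: np.array([10 + f[0], 12 + 0.8 * f[1]])   # route cost functions (illustrative)
    f = np.array([demand / 2, demand / 2])                 # start from an even split
    for k in range(1, n_iter + 1):
        c = t(f)
        p = np.exp(-theta * c) / np.exp(-theta * c).sum()  # logit route-choice probabilities
        f = f + (demand * p - f) / k                       # MSA step toward the fixed point
    return f, t(f)

flows, costs = logit_sue_two_routes()
```

At the fixed point, no reallocation of flow changes the logit choice probabilities, which is the equilibrium condition the iterative algorithm in the paper drives toward.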
Funding: Supported by the National Natural Science Foundation of China (61174114), the Research Fund for the Doctoral Program of Higher Education in China (20120101130016), the Natural Science Foundation of Zhejiang Province (LQ15F030006), and the Educational Commission Research Program of Zhejiang Province (Y201431412).
Abstract: For complex chemical processes, process optimization is usually performed on causal models derived from first-principles models. When mechanism models cannot be obtained easily, a restricted model built from process data is used for dynamic process optimization. A new strategy is proposed for complex process optimization, in which latent variables are used as decision variables and statistics are used to describe constraints. As the constraint conditions become more complex when the original variables are projected into the latent space, the Hotelling T^2 statistic is introduced for constraint formulation in the latent space. In this way, the constraint is simplified when the optimization is solved in the low-dimensional space of the latent variables. The validity of the methodology is illustrated on a pH-level optimal control process and a practical polypropylene grade transition process.
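The Hotelling T^2 constraint in latent space can be sketched with PCA scores: T^2 is the sum of squared scores scaled by the latent variances, and the optimizer keeps the decision point's T^2 below a limit so it stays inside the region covered by historical data. The random "historical process data" below are an assumption for illustration.

```python
import numpy as np

def t2_constraint(scores, latent_var, limit):
    """Hotelling T^2 constraint on a latent-space point: feasible when the
    variance-scaled squared scores stay within the chosen limit."""
    return np.sum(scores ** 2 / latent_var) <= limit

# PCA on (hypothetical) centered historical process data
rng = np.random.default_rng(0)
data = rng.normal(size=(500, 4)) @ rng.normal(size=(4, 4))
data -= data.mean(axis=0)
U, S, Vt = np.linalg.svd(data, full_matrices=False)
k = 2                                            # retained latent variables
latent_var = (S[:k] ** 2) / (len(data) - 1)      # variance of each latent variable
scores = data @ Vt[:k].T                         # projections onto the latent space
t2 = np.sum(scores ** 2 / latent_var, axis=1)    # T^2 of every historical point
```

In the optimization itself, `scores` would be the decision variables and `t2_constraint` a nonlinear inequality constraint.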
Funding: Supported by the National Natural Science Foundation of China under Grant Nos. 10362001 and 10532060, and the Natural Science Foundation of Guangxi Zhuang Autonomous Region under Grant Nos. 0342012 and 0640003.
Abstract: We study the phase-transition characteristics of the NaSch model when randomization takes top priority in the update rules (i.e., the noise-first model) by computing the relaxation time and the order parameter. The scaling exponents of the relaxation time and the scaling relation of the order parameter are obtained.
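One update of a noise-first NaSch variant can be sketched as below: randomization is applied before acceleration, then vehicles brake to the gap ahead and move. This is a sketch under the assumption that "noise-first" means exactly this rule reordering; lattice size, density, and parameters are illustrative.

```python
import numpy as np

def nasch_noise_first_step(pos, vel, L, vmax=5, p=0.3, rng=None):
    """One parallel update of a NaSch ring with randomization applied first
    (assumed 'noise-first' order): randomize, accelerate, brake to gap, move.
    Cars are stored in ring order, so roll(-1) gives each car's leader."""
    rng = rng or np.random.default_rng()
    vel = np.where(rng.random(len(vel)) < p,
                   np.maximum(vel - 1, 0), vel)           # 1. random slowdown first
    vel = np.minimum(vel + 1, vmax)                       # 2. accelerate
    gaps = (np.roll(pos, -1) - pos - 1) % L               # empty cells to car ahead
    vel = np.minimum(vel, gaps)                           # 3. brake to the gap
    pos = (pos + vel) % L                                 # 4. move
    return pos, vel

# 10 cars on a ring of 100 cells, evolved for 200 steps
rng = np.random.default_rng(0)
pos = np.arange(0, 100, 10)
vel = np.zeros(10, dtype=int)
for _ in range(200):
    pos, vel = nasch_noise_first_step(pos, vel, 100, rng=rng)
order_param = np.mean(vel == 0)    # fraction of stopped cars (an order parameter)
```

Sweeping the density and measuring how `order_param` and the relaxation time scale near the transition is the kind of computation the abstract reports.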
Abstract: The primary aim of clinical trials is to investigate whether a treatment is effective for a particular disease or condition. Randomized controlled clinical trials are considered the gold standard for evaluating the effect of an intervention. However, even after randomization, there are situations where patients differ substantially with respect to the baseline value of the outcome variable, and the response to intervention often depends on that baseline value. When there are baseline-dependent treatment effects, differences among treatments vary as a function of baseline level. Although variation in outcome associated with the baseline value is accounted for in ANCOVA, analysis of individual differences in treatment effect is precluded by the homogeneity-of-regression assumption, which requires that expected differences in outcome among treatments be constant across all baseline levels. To overcome this difficulty, Weigel and Narvaez [7] proposed a regression model for two treatment groups to analyze individual response to treatments in randomized controlled clinical trials. The authors reviewed the model suggested by Weigel and Narvaez and extended it to three or more treatment groups. The utility of the model is demonstrated with real-life data from a randomized controlled clinical trial of bronchial asthma.
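Relaxing the homogeneity-of-regression assumption amounts to fitting a separate intercept and baseline slope per treatment group, so treatment differences may vary with baseline level. The sketch below is in the spirit of the Weigel-Narvaez model extended to three groups; the simulated data and coefficients are assumptions, not the paper's asthma data.

```python
import numpy as np

def fit_baseline_interaction(baseline, group, outcome, n_groups=3):
    """OLS with group-specific intercepts and baseline slopes, so the
    treatment effect is allowed to depend on the baseline value."""
    n = len(outcome)
    X = np.zeros((n, 2 * n_groups))
    for g in range(n_groups):
        m = group == g
        X[m, g] = 1.0                        # group-specific intercept
        X[m, n_groups + g] = baseline[m]     # group-specific baseline slope
    beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
    return beta                              # [intercepts..., slopes...]

rng = np.random.default_rng(0)
n = 300
group = rng.integers(0, 3, n)
base = rng.normal(50, 10, n)
slopes = np.array([0.2, 0.5, 0.8])           # treatment effect varies with baseline
y = 5 + slopes[group] * base + rng.normal(0, 1, n)
beta = fit_baseline_interaction(base, group, y)
```

Unequal fitted slopes across groups are exactly the baseline-dependent treatment effects that standard ANCOVA assumes away.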
Funding: Supported by the National Natural Science Foundation of China under Grant Nos. 10775104 and 10305009.
Abstract: We propose a monomer birth-death model with random removals, in which an aggregate of size k can produce a new monomer at a time-dependent rate I(t)k or lose one monomer at a rate J(t)k, and with a probability P(t) an aggregate of any size is randomly removed. We then analytically investigate the kinetic evolution of the model by means of the rate equation. The results show that the scaling behavior of the aggregate size distribution depends crucially on the net birth rate I(t) - J(t) as well as the birth rate I(t). The aggregate size distribution can approach a standard or modified scaling form in some cases, but it may take a scale-free form in other cases. Moreover, the species can survive finally only if either I(t) - J(t) ≥ P(t) or [J(t) + P(t) - I(t)]t ≈ 0 for t ≫ 1; otherwise, it will become extinct.
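At mean-field level, the total mass implied by these rates changes at rate (I - J)M from births and deaths and drains at rate P*M from removals, so M(t) grows or decays exponentially according to the sign of I - J - P, which mirrors the survival criterion quoted above. The sketch below assumes constant rates and a simple removal-drain term; it is an illustration of the criterion, not the paper's full rate-equation analysis.

```python
import numpy as np

def total_mass(I, J, P, M0=1.0, t_max=10.0, dt=1e-3):
    """Euler integration of dM/dt = (I - J) * M - P * M, the mean total-mass
    balance sketched from the model's birth, death, and removal rates."""
    M = M0
    for _ in range(int(t_max / dt)):
        M += dt * ((I - J) * M - P * M)
    return M

surviving = total_mass(I=1.0, J=0.5, P=0.3)   # I - J > P: mass grows
extinct = total_mass(I=1.0, J=0.5, P=0.8)     # I - J < P: mass decays
```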
Abstract: New bugs and vulnerabilities are discovered and reported from time to time even after software products are released. One of the common ways to handle these bugs is to patch the software. In this paper, the authors propose a stochastic model for optimizing the patching time for software bugs and vulnerabilities. The optimal patching time is computed from the patching script development cost and the operational cost of the fix. The authors present two case studies, using the Nimda worm vulnerability in the Microsoft Internet Information Services web server and the bug reports of the Debian project. These studies indicate that patches are applied later than their optimal fix times.
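The cost trade-off behind an optimal patching time can be sketched deterministically: a development cost that falls as more time is spent preparing the patch, plus an operational exposure cost that accumulates while the bug is live. The cost shapes below are illustrative assumptions; the paper's model is stochastic.

```python
import numpy as np

def optimal_patch_time(dev_cost, risk_rate, t_grid):
    """Pick the patching time minimizing total cost: preparation cost
    dev_cost(t) plus linear operational exposure risk_rate * t."""
    total = dev_cost(t_grid) + risk_rate * t_grid
    i = np.argmin(total)
    return t_grid[i], total[i]

# Illustrative shapes: dev cost 100/t falls with preparation time,
# exposure accrues at 2 cost units per time unit.
t = np.linspace(0.1, 30, 3000)
t_star, c_star = optimal_patch_time(lambda t: 100 / t, 2.0, t)
```

For these shapes the analytic optimum is t* = sqrt(100 / 2) ≈ 7.07; a later-than-optimal patch, as the case studies observe in practice, lands on the rising branch of the total-cost curve.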
Abstract: Banking institutions all over the world face a significant challenge from cumulative losses due to defaults by borrowers on different types of loans. The cumulative default loss built up over a period of time could wipe out the capital cushion of a bank. The aim of this paper is to help banks forecast the cumulative loss and its volatility. Default amounts are random, and defaults occur at random instants of time. A non-Markovian, time-dependent random point process is used to model the cumulative loss. The expected loss and volatility are evaluated analytically; they are functions of the probability of default, the probability distribution of the loss amount, the recovery rate, and time. The probability of default, being the most important contributor, is evaluated using hidden Markov modeling. Numerical results obtained validate the model.
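A cumulative loss driven by randomly timed defaults with random amounts can be sketched as a marked point process and summarized by its mean and volatility. The sketch below substitutes a homogeneous Poisson process and exponential loss amounts for the paper's non-Markovian, time-dependent process; all parameter values are illustrative assumptions.

```python
import numpy as np

def simulate_cumulative_loss(rate, pd, loss_mean, recovery, t,
                             n_paths=20000, seed=0):
    """Monte Carlo for cumulative default loss: defaults arrive as a Poisson
    process thinned by the probability of default, and each default loses an
    exponential amount net of the recovery rate."""
    rng = np.random.default_rng(seed)
    n_defaults = rng.poisson(rate * pd * t, n_paths)          # defaults per path
    losses = np.array([rng.exponential(loss_mean, n).sum() * (1 - recovery)
                       for n in n_defaults])                  # net loss per path
    return losses.mean(), losses.std()

# 50 loans/year at risk, 10% default probability, mean exposure 2.0,
# 40% recovery, one-year horizon (all hypothetical values).
mean_loss, vol = simulate_cumulative_loss(rate=50, pd=0.1, loss_mean=2.0,
                                          recovery=0.4, t=1.0)
```

For this compound-Poisson stand-in the analytic mean is rate * pd * t * loss_mean * (1 - recovery) = 6.0, matching the abstract's point that expected loss is a function of default probability, loss distribution, recovery rate, and time.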