Polynomial-time randomized algorithms were constructed to approximately solve optimal robust performance controller design problems in a probabilistic sense, and a rigorous mathematical justification of the approach was given. The randomized algorithms are based on a property from statistical learning theory known as (uniform) convergence of empirical means (UCEM). It is argued that, in order to assess the performance of a controller as the plant varies over a pre-specified family, it is better to use the average performance of the controller as the objective function to be optimized, rather than its worst-case performance. The efficiency of the approach is illustrated through an example.
Real-time intelligent lithology identification while drilling is vital to realizing downhole closed-loop drilling, and the complex and changeable geological environment encountered while drilling makes the identification face many challenges. This paper studies the problems of difficult feature-information extraction, low precision of thin-layer identification, and limited model applicability in intelligent lithology identification, and improves the comprehensive performance of the lithology identification model from three aspects: data feature extraction, class balance, and model design. A new real-time intelligent lithology identification model, the dynamic felling strategy weighted random forest algorithm (DFW-RF), is proposed. According to the feature selection results, gamma ray and 2 MHz phase resistivity are the logging-while-drilling (LWD) parameters that most significantly influence lithology identification. The comprehensive performance of the DFW-RF lithology identification model has been verified in applications to three wells in different areas. Compared with the predictions of five typical lithology identification algorithms, the DFW-RF model achieves a higher lithology identification accuracy rate and F1 score. The model improves the identification accuracy for thin-layer lithology and is effective and feasible in different geological environments. The DFW-RF model plays a truly efficient role in the real-time intelligent identification of lithologic information in closed-loop drilling and has greater applicability, making it worthy of wide use in logging interpretation.
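The class-balancing idea behind a weighted random forest can be sketched in miniature. The toy below is not the DFW-RF model itself; the decision-stump base learner and inverse-frequency vote weights are assumptions for illustration. It trains bootstrapped stumps and scales each tree's vote by the rarity of the predicted class, so minority (e.g., thin-layer) classes are not drowned out:

```python
import random
from collections import Counter

def stump_fit(X, y):
    # Exhaustive search for the best single-feature threshold split.
    best = None
    for f in range(len(X[0])):
        for t in sorted({row[f] for row in X}):
            left = [yi for row, yi in zip(X, y) if row[f] <= t]
            right = [yi for row, yi in zip(X, y) if row[f] > t]
            if not left or not right:
                continue
            l_lab = Counter(left).most_common(1)[0][0]
            r_lab = Counter(right).most_common(1)[0][0]
            score = left.count(l_lab) + right.count(r_lab)
            if best is None or score > best[0]:
                best = (score, f, t, l_lab, r_lab)
    _, f, t, l_lab, r_lab = best
    return lambda row: l_lab if row[f] <= t else r_lab

def weighted_forest_fit(X, y, n_trees=25, seed=0):
    rng = random.Random(seed)
    n = len(X)
    # Inverse-frequency class weights counteract class imbalance.
    weight = {c: n / cnt for c, cnt in Counter(y).items()}
    trees = []
    for _ in range(n_trees):
        idx = [rng.randrange(n) for _ in range(n)]  # bootstrap sample
        trees.append(stump_fit([X[i] for i in idx], [y[i] for i in idx]))
    def predict(row):
        votes = Counter()
        for tree in trees:
            label = tree(row)
            votes[label] += weight[label]  # vote scaled by class weight
        return votes.most_common(1)[0][0]
    return predict
```

On an imbalanced two-class toy set the minority class still wins its region of feature space because its votes carry more weight.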
Precise and timely prediction of crop yields is crucial for food security and the development of agricultural policies. However, crop yield is influenced by multiple factors within complex growth environments, and previous research has paid relatively little attention to the interference of environmental factors and drought with the growth of winter wheat. There is therefore an urgent need for more effective methods to explore the inherent relationship between these factors and crop yield. This study used four types of indicators, namely meteorological, crop growth status, environmental, and drought indices, from October 2003 to June 2019 in Henan Province as the basic data for predicting winter wheat yield. Using the sparrow search algorithm combined with random forest (SSA-RF) under different input indicators, the accuracy of winter wheat yield estimation was calculated. The estimation accuracy of SSA-RF was compared with partial least squares regression (PLSR), extreme gradient boosting (XGBoost), and random forest (RF) models. Finally, the optimal yield estimation method was used to predict winter wheat yield in three typical years. The findings are as follows: 1) SSA-RF demonstrates superior performance in estimating winter wheat yield compared to the other algorithms; the best estimation is achieved by combining all four types of indicators with SSA-RF (R^(2)=0.805, RRMSE=9.9%). 2) Crop growth status and environmental indicators play significant roles in wheat yield estimation, accounting for 46% and 22% of the yield importance among all indicators, respectively. 3) Selecting indicators from October to April of the following year yielded the highest accuracy in winter wheat yield estimation, with an R^(2) of 0.826 and an RRMSE of 9.0%; yield estimates can thus be completed two months before the winter wheat harvest in June. 4) Predictive performance is slightly affected by severe drought. Compared with a severe drought year (2011, R^(2)=0.680) and a normal year (2017, R^(2)=0.790), the SSA-RF model has higher prediction accuracy for a wet year (2018, R^(2)=0.820). This study provides an innovative approach for remote sensing estimation of winter wheat yield.
Feature selection is a crucial problem in efficient machine learning, and it also greatly contributes to the explainability of machine-driven decisions. Methods like decision trees and the Least Absolute Shrinkage and Selection Operator (LASSO) can select features during training. However, these embedded approaches can only be applied to a small subset of machine learning models. Wrapper-based methods can select features independently of the machine learning model, but they often suffer from a high computational cost. To enhance their efficiency, many randomized algorithms have been designed. In this paper, we propose automatic breadth searching and attention searching adjustment approaches to further speed up randomized wrapper-based feature selection. We conduct a theoretical computational complexity analysis and further explain our algorithms' generic parallelizability. We conduct experiments on both synthetic and real datasets with different machine learning base models. Results show that, compared with existing approaches, our proposed techniques can locate a more meaningful set of features with high efficiency.
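A randomized wrapper search of the kind described above can be sketched as a hill climb with random restarts over feature subsets. The objective below is a hypothetical stand-in for a wrapped model's validation score (any black-box scorer could be plugged in); the restart schedule is an assumption, not the paper's breadth-searching scheme:

```python
import random

def randomized_wrapper_select(n_features, score, n_iters=300, seed=42):
    """Randomized hill-climbing wrapper: toggle one feature per step and
    keep the move if the model-agnostic score improves; periodic random
    restarts explore new regions of the subset lattice."""
    rng = random.Random(seed)
    best_subset, best_score = frozenset(), score(frozenset())
    current, cur_score = best_subset, best_score
    for it in range(n_iters):
        if it % 50 == 0:  # periodic random restart
            current = frozenset(f for f in range(n_features) if rng.random() < 0.5)
            cur_score = score(current)
        cand = current ^ {rng.randrange(n_features)}  # toggle one feature
        s = score(cand)
        if s > cur_score:
            current, cur_score = cand, s
        if cur_score > best_score:
            best_subset, best_score = current, cur_score
    return best_subset, best_score
```

With a toy objective that rewards features 0 and 2 and penalizes subset size, the search recovers exactly that pair.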
This paper presents an improved Randomized Circle Detection (RCD) algorithm that uses a circularity characteristic to detect circles in images with complex backgrounds and is not based on the Hough Transform. The experimental results show that the algorithm can locate the circular marks on a Printed Circuit Board (PCB).
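The core of a randomized circle detector (independent of this paper's circularity refinement) is: sample three points, fit their circumcircle, and keep the circle supported by the most inliers. A minimal sketch under those assumptions:

```python
import math
import random

def circle_from_3(p1, p2, p3):
    """Circumcircle ((cx, cy), r) of three non-collinear points, else None."""
    ax, ay = p1; bx, by = p2; cx, cy = p3
    d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if abs(d) < 1e-12:
        return None
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return (ux, uy), math.hypot(ax - ux, ay - uy)

def rcd(points, n_trials=500, tol=0.05, seed=1):
    """Randomized circle detection: repeatedly sample 3 points, fit their
    circumcircle, and keep the circle supported by the most inliers."""
    rng = random.Random(seed)
    best, best_inliers = None, 0
    for _ in range(n_trials):
        fit = circle_from_3(*rng.sample(points, 3))
        if fit is None:
            continue
        (cx, cy), r = fit
        inliers = sum(1 for (x, y) in points
                      if abs(math.hypot(x - cx, y - cy) - r) < tol)
        if inliers > best_inliers:
            best, best_inliers = fit, inliers
    return best
```

On 60 points lying on one circle plus 40 background points, the sampled circle is recovered almost surely within 500 trials.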
The random forest algorithm was applied to study the nuclear binding energy and charge radius. A regularized root-mean-square error (RMSE) was proposed to avoid overfitting during the training of the random forest. The RMSE for nuclides with Z, N > 7 is reduced to 0.816 MeV and 0.0200 fm compared with the six-term liquid drop model and a three-term nuclear charge radius formula, respectively. Of specific interest are possible (sub)shells in the superheavy region, which are important for the search for new elements and the island of stability. The significance of shell features estimated by the Shapley additive explanation (SHAP) method suggests (Z, N) = (92, 142) and (98, 156) as possible subshells indicated by the binding energy. Because the presently observed data are far from the N = 184 shell suggested by mean-field investigations, its shell effect is not predicted by the present training. The significance analysis of the nuclear charge radius suggests Z = 92 and N = 136 as possible subshells. This effect is verified by the shell-corrected nuclear charge radius model.
Cloud computing involves remote server deployments with public network infrastructures that allow clients to access computational resources. Virtual Machines (VMs) are supplied on request and launched without interactions from service providers. Intruders can target these servers and establish malicious connections on VMs to carry out attacks on other clustered VMs. The existing system has issues with execution time and false-positive rates, so overall system performance is degraded considerably. The proposed approach is designed to eliminate cross-VM side attacks and VM escape and to hide the server's position so that an opponent cannot track the target server beyond a certain point. Every request is passed from source to destination via one broadcast domain to confuse the opponent and prevent them from tracking the server's position. The security resource allocation step accepts a security game in a simple format as input and finds the best coverage vector against the opponent using a Strong Stackelberg Equilibrium (SSE) technique; a Mixed Integer Linear Programming (MILP) framework is used in the algorithm. The VM challenge is reduced by a firewall-based controlling mechanism combining behavior-based detection and signature-based virus detection. The proposed method focuses on detecting malware attacks effectively and providing better security for the VMs. Finally, the experimental results indicate that the proposed security method is efficient: it achieves a lower execution time and a better false-positive rate, accuracy, and memory usage than the conventional approach.
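The Stackelberg security-game step can be illustrated on a toy two-target game with one defender resource, solved by exhaustive grid search instead of the MILP used in the paper; the payoff values in the test are invented for the example:

```python
def sse_coverage(targets, grid=1000):
    """Grid-search a toy 2-target security game with one defender resource.
    Each target is (def_covered, def_uncovered, att_covered, att_uncovered).
    The attacker best-responds; ties break in the defender's favor (SSE)."""
    best = None
    for k in range(grid + 1):
        c = [k / grid, 1 - k / grid]  # coverage probabilities, sum to 1
        # Attacker's expected utility for attacking each target.
        att = [c[i] * targets[i][2] + (1 - c[i]) * targets[i][3] for i in range(2)]
        m = max(att)
        tied = [i for i in range(2) if abs(att[i] - m) < 1e-9]
        # Strong-Stackelberg tie-breaking: among the attacker's best
        # responses, assume the one best for the defender.
        i = max(tied, key=lambda j: c[j] * targets[j][0] + (1 - c[j]) * targets[j][1])
        d_util = c[i] * targets[i][0] + (1 - c[i]) * targets[i][1]
        if best is None or d_util > best[0]:
            best = (d_util, c)
    return best
```

For the classic case, the optimal coverage equalizes the attacker's utilities across targets, which the grid search recovers.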
Given the challenge of estimating or calculating quantities of waste electrical and electronic equipment (WEEE) in developing countries, this article focuses on predicting the WEEE generated by Cameroonian small and medium enterprises (SMEs) that are engaged in ISO 14001:2015 initiatives and consume electrical and electronic equipment (EEE) to enhance their performance and profitability. The methodology employed an exploratory approach involving the application of general equilibrium theory (GET) to contextualize the study and generate relevant parameters for deploying the random forest regression learning algorithm for predictions. Machine learning was applied to 80% of the samples for training, while simulation was conducted on the remaining 20% of samples based on quantities of EEE utilized over a specific period, utilization rates, repair rates, and average lifespans. The results demonstrate that the model's predicted values are significantly close to the actual quantities of generated WEEE; the model's performance was evaluated using the mean squared error (MSE) and yielded satisfactory results. Based on this model, both companies and stakeholders can set realistic objectives for managing companies' WEEE, fostering sustainable socio-environmental practices.
In the Internet, a group of replicated servers is commonly used to improve the scalability of a network service. Anycast is a new network service that can improve network load distribution and simplify certain applications. In this paper, the authors describe a simple anycast service model for the Internet that does not significantly affect the routing and protocol processing infrastructure already in place, and propose an anycast QoS routing algorithm for this model. The algorithm uses a randomized method to balance network load and improve performance. Several new techniques are proposed in the algorithm. First, the minimum hop count of each node is used as the metric for computing the probability of each possible outgoing link. The metric is precomputed for each node in the network, which simplifies the network complexity and provides the routing process with useful information. Second, randomness is applied at the link level and depends dynamically on the routing configuration. This provides great flexibility for the routing process, prevents it from overusing certain fixed routing paths, and adequately balances the delay of the routing path. The authors assess the quality of the QoS algorithm in terms of the acceptance ratio of anycast QoS requests, and simulation results on a variety of network topologies and parameters show that the algorithm performs well and can balance network load effectively.
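The link-level randomness described above amounts to a weighted random choice among outgoing links, biased toward neighbors with a smaller precomputed minimum-hop metric. A minimal sketch (the 1/(h+1) weighting is an assumption for illustration, not the paper's exact formula):

```python
import random
from collections import Counter

def pick_next_hop(neighbors, min_hops, rng):
    """Choose an outgoing link at random, favoring neighbors whose
    precomputed minimum hop count to the (anycast) destination is small.
    Randomness keeps traffic from always taking one fixed path."""
    weights = [1.0 / (min_hops[n] + 1) for n in neighbors]
    return rng.choices(neighbors, weights=weights, k=1)[0]
```

Repeated draws show the closer neighbor is chosen more often while the longer path still carries some of the load.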
The generalized singular value decomposition (GSVD) of two matrices with the same number of columns is a very useful tool in many practical applications. However, the GSVD may suffer from heavy computational time and memory requirements when the scale of the matrices is quite large. In this paper, we use random projections to capture most of the action of the matrices and propose randomized algorithms for computing a low-rank approximation of the GSVD. Several error bounds for the approximation are also presented for the proposed randomized algorithms. Finally, experimental results show that the proposed randomized algorithms can achieve good accuracy with less computational cost and a lower storage requirement.
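The random-projection idea can be illustrated with a standard randomized range finder: sketch the matrix with a Gaussian test matrix and orthonormalize, giving a basis Q with A ≈ Q Q^T A. This is the generic building block of such methods, not the paper's GSVD-specific algorithm:

```python
import numpy as np

def randomized_range_finder(A, rank, oversample=5, seed=0):
    """Randomized range finder: sample the range of A with a Gaussian
    test matrix, then orthonormalize the sketch with a QR factorization."""
    rng = np.random.default_rng(seed)
    omega = rng.standard_normal((A.shape[1], rank + oversample))
    Y = A @ omega           # random linear combinations of A's columns
    Q, _ = np.linalg.qr(Y)  # orthonormal basis for the sampled range
    return Q
```

For a matrix of exact rank 4, the projection Q Q^T A reproduces A to machine precision, since the Gaussian sketch spans the range almost surely.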
Spontaneous combustion of coal increases the temperature in the overburden strata adjoining coal seams and poses a challenge when loading blastholes. This condition, known as hot-hole blasting, is dangerous due to the increased possibility of premature explosions in loaded blastholes. Thus, it is crucial to load the blastholes with an appropriate amount of explosive within a short period to avoid premature detonation caused by high blasthole temperatures; doing so also helps achieve the desired fragment size. This study sought to ascertain the variables that most influence mean fragment size and their optimum values for blasting in a fiery seam. Data on blast design, rock mass, and fragmentation from 100 blasts in fiery seams of a coal mine were collected and used to develop mean fragmentation prediction models using soft computational techniques. The coefficient of determination (R^(2)), root mean square error (RMSE), mean absolute error (MAE), mean square error (MSE), variance accounted for (VAF), and coefficient of efficiency in percentage (CE) were calculated to validate the results. They indicate that the random forest algorithm (RFA) outperforms the artificial neural network (ANN), response surface method (RSM), and decision tree (DT). The values of R^(2), RMSE, MAE, MSE, VAF, and CE for RFA are 0.94, 0.034, 0.027, 0.001, 93.58, and 93.01, respectively. Multiple parametric sensitivity analyses (MPSAs) of the input variables showed that the Schmidt hammer rebound number and spacing-to-burden ratio are the most influential variables for blast fragment size. The analysis was finally used to define the best blast design variables to achieve optimum fragment size. The optimum values for RFA of S/B, ld/B, and ls/ld are 1.03, 1.85, and 0.7, respectively.
Safety patrol inspection in chemical industrial parks is a complex multi-objective task with multiple degrees of freedom. Traditional pointer instruments, with advantages such as high reliability and strong adaptability to harsh environments, are widely applied in such parks. However, they rely on manual readings, which suffer from heavy patrol workloads, high labor costs, high false positive/negative rates, and poor timeliness. To address these problems, this study proposes a path planning method for robot patrol in chemical industrial parks, in which a path optimization model based on an improved iterated local search and random variable neighborhood descent (ILS-RVND) algorithm is established by integrating the actual requirements of patrol tasks in such parks. The effectiveness of the model and algorithm is verified using real park data as an example. The results show that, compared with GA and ILS-RVND, the improved algorithm reduces quantification cost by about 24% and saves patrol time by about 36%. Apart from shortening robot patrol time, optimizing patrol paths, and reducing maintenance loss, the proposed algorithm also avoids untimely patrols and enhances the safety factor of the equipment.
The Gobi spans a large area of China, surpassing the combined expanse of mobile dunes and semi-fixed dunes, and its presence significantly influences the movement of sand and dust. However, the complex origins and diverse materials constituting the Gobi result in notable differences in saltation processes across various Gobi surfaces, and it is challenging to describe these processes with a uniform morphology. It therefore becomes imperative to articulate surface characteristics through parameters such as the three-dimensional (3D) size and shape of gravel. Collecting morphology information for Gobi gravels is essential for studying their genesis and sand saltation. To enhance the efficiency and information yield of gravel parameter measurements, this study conducted field experiments in the Gobi region across Dunhuang City, Guazhou County, and Yumen City (administered by Jiuquan City), Gansu Province, China in March 2023. A research framework and methodology for measuring the 3D parameters of gravel using point clouds were developed, alongside improved calculation formulas for 3D parameters including gravel grain size, volume, flatness, roundness, sphericity, and equivalent grain size. Leveraging multi-view geometry for 3D reconstruction allowed an optimal data acquisition scheme, characterized by high point cloud reconstruction efficiency and clear quality, to be established. Additionally, the proposed methodology incorporated point cloud clustering, segmentation, and filtering techniques to isolate individual gravel point clouds. Advanced point cloud algorithms, including the Oriented Bounding Box (OBB), the point cloud slicing method, and point cloud triangulation, were then deployed to calculate the 3D parameters of individual gravels. These systematic processes allow precise and detailed characterization of individual gravels. For gravel grain size and volume, the correlation coefficients between point cloud and manual measurements all exceeded 0.9000, confirming the feasibility of the proposed methodology for measuring the 3D parameters of individual gravels. The proposed workflow yields accurate calculations of the relevant parameters for Gobi gravels, providing essential data support for subsequent studies of Gobi environments.
Estimating the volume growth of forest ecosystems accurately is important for understanding carbon sequestration and achieving carbon neutrality goals. However, the key environmental factors affecting volume growth differ across scales and plant functional types. This study was therefore conducted to estimate the volume growth of Larix and Quercus forests based on national-scale forestry inventory data in China and to identify its influencing factors using random forest algorithms. The results showed that model performance for volume growth in natural forests (R^(2)=0.65 for Larix and 0.66 for Quercus) was better than in planted forests (R^(2)=0.44 for Larix and 0.40 for Quercus). In both natural and planted forests, stand age showed strong relative importance for volume growth (8.6%–66.2%), while the edaphic and climatic variables had limited relative importance (<6.0%). The relationship between stand age and volume growth was unimodal in natural forests and linearly increasing in planted Quercus forests. The specific locations (i.e., altitude and aspect) of sampling plots exhibited high relative importance for volume growth in planted forests (4.1%–18.2%). Altitude affected volume growth positively in planted Larix forests but negatively in planted Quercus forests. Similarly, the effects of other environmental factors on volume growth also differed by stand origin (planted versus natural) and plant functional type (Larix versus Quercus). These results highlight that stand age was the most important predictor of volume growth and that environmental factors had diverse effects across stand origins and plant functional types. Our findings provide a good framework for site-specific recommendations regarding the management practices necessary to maintain volume growth in China's forest ecosystems.
An efficient importance sampling algorithm is presented to analyze the reliability of a complex structural system with multiple failure modes and fuzzy-random uncertainties in the basic variables and failure modes. To improve the sampling efficiency, the simulated annealing algorithm is adopted to optimize the density center of the importance sampling distribution for each failure mode, so that points contributing more significantly to the fuzzy failure probability are sampled with higher probability. For a system with multiple fuzzy failure modes, a weighted and mixed importance sampling function is constructed. The contribution of each fuzzy failure mode to the system failure probability is represented by appropriate factors, further improving the sampling efficiency. The variances and coefficients of variation are derived for the failure probability estimates. Two examples are introduced to illustrate the soundness of the present method. Compared with the direct Monte Carlo method, the improved efficiency and precision of the method are verified by the examples.
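Plain importance sampling for a small failure probability (without the fuzzy weighting or the simulated-annealing center search of the paper) can be sketched as follows: shift the sampling density toward the failure region and reweight each failing sample by the likelihood ratio. The choice of a unit-variance normal proposal is an assumption for the sketch:

```python
import math
import random

def failure_prob_is(limit_state, mu_shift, n=20000, seed=3):
    """Importance sampling estimate of P[g(X) < 0] for X ~ N(0, 1):
    sample from N(mu_shift, 1) centered near the failure region and
    reweight by the likelihood ratio phi(x) / phi(x - mu_shift)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(mu_shift, 1.0)
        if limit_state(x) < 0:
            # likelihood ratio of N(0,1) to N(mu_shift,1)
            total += math.exp(-0.5 * x * x + 0.5 * (x - mu_shift) ** 2)
    return total / n
```

For g(x) = 3 - x the true failure probability is 1 - Φ(3) ≈ 1.35e-3; with the proposal centered at the design point x = 3, far fewer samples are needed than with direct Monte Carlo.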
The performance of central processing units (CPUs) can be enhanced by integrating multiple cores into a single chip, and CPU performance can be further improved by allocating tasks with an intelligent strategy. If small tasks wait or execute for a long time, the CPU consumes more power; thus, the amount of power consumed by CPUs can be reduced without increasing the frequency. Lines are used to connect cores, which are organized together to form a network called a network-on-chip (NoC). NoCs are mainly used in the design of processors, but their performance can still be enhanced by reducing power consumption. The main problem lies with task scheduling that fully utilizes the network. Here, we propose a novel random-fit algorithm for NoCs based on power-aware optimization. In this algorithm, tasks under the same application are mapped to neighborhoods of that application, whereas tasks belonging to different applications are mapped to processor cores on the basis of a series of steps. This scheduling process is performed at run time. Experimental results show that the proposed random-fit algorithm reduces the amount of power consumed and increases system performance through effective scheduling.
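A toy version of random-fit mapping on a core grid might look like the following; the neighborhood-preference rule is a simplification of the paper's multi-step procedure, and the grid topology and data shapes are assumptions for illustration:

```python
import random

def random_fit_map(tasks, n_rows, n_cols, seed=0):
    """Random-fit mapping sketch: each task is (task_id, app_id). The first
    task of an application takes a random free core; later tasks of the
    same application prefer a free neighbor of that application's cores
    (reducing communication hops), falling back to any free core."""
    rng = random.Random(seed)
    free = {(r, c) for r in range(n_rows) for c in range(n_cols)}
    placed = {}      # core -> task_id
    app_cores = {}   # app_id -> cores already used by that application
    for task_id, app in tasks:
        spot = None
        for (r, c) in app_cores.get(app, []):
            cand = [n for n in [(r+1, c), (r-1, c), (r, c+1), (r, c-1)]
                    if n in free]
            if cand:
                spot = rng.choice(cand)  # random fit among free neighbors
                break
        if spot is None:
            spot = rng.choice(sorted(free))  # random fit anywhere
        free.remove(spot)
        placed[spot] = task_id
        app_cores.setdefault(app, []).append(spot)
    return placed
```

Every task lands on a distinct core, and tasks of one application tend to cluster.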
The quality of hot-rolled steel strip is directly affected by the strip crown. Traditional machine learning models have shown limitations in accurately predicting the strip crown, particularly when dealing with imbalanced data. This limitation results in poor production quality and efficiency, leading to increased production costs. Thus, a novel strip crown prediction model that uses the Boruta and extremely randomized trees (Boruta-ERT) algorithms was proposed to address this issue. To improve the accuracy of the model, the synthetic minority over-sampling technique (SMOTE) was utilized to balance the imbalanced data sets. The Boruta-ERT prediction model was then used to select features and predict the strip crown. With the 2160 mm hot rolling production lines of a steel plant serving as the research object, the experimental results showed that 97.01% of the predictions have an absolute error of less than 8 μm. This level of accuracy meets the control requirements for strip crown and demonstrates significant benefits for improving the production quality of steel strip.
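A SMOTE-style oversampler can be sketched in a few lines: each synthetic minority sample interpolates between a minority point and one of its k nearest minority neighbors. This is a simplified stand-in for the technique paired with Boruta-ERT in the text, not the exact variant used there:

```python
import random

def smote_like_oversample(minority, n_new, k=3, seed=0):
    """Generate n_new synthetic samples: pick a minority point, pick one
    of its k nearest minority neighbors, and interpolate a random
    fraction of the way between them."""
    rng = random.Random(seed)

    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    synthetic = []
    for _ in range(n_new):
        base = rng.choice(minority)
        nbrs = sorted((p for p in minority if p is not base),
                      key=lambda p: dist2(base, p))[:k]
        nbr = rng.choice(nbrs)
        gap = rng.random()  # interpolation fraction in [0, 1)
        synthetic.append(tuple(b + gap * (v - b) for b, v in zip(base, nbr)))
    return synthetic
```

Because each synthetic point is a convex combination of two minority points, it stays inside the minority class's convex hull.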
In dealing with abrasive waterjet machining (AWJM) simulation, most of the literature applies the finite element method (FEM) to build pure waterjet models or single-abrasive-particle erosion models. To overcome the mesh distortion caused by large deformation in FEM and to consider the effects of both water and abrasive, a smoothed particle hydrodynamics (SPH) coupled FEM model for AWJM simulation is presented, in which the abrasive waterjet is modeled by SPH particles and the target material is modeled by FEM; the two parts interact through a contact algorithm. Using this model, an abrasive waterjet penetrating the target material at high velocity is simulated and the erosion mechanism is depicted. The relationships between the depth of penetration and jet parameters, including water pressure and traverse speed, are analyzed based on the simulation, and the simulation results agree well with existing experimental data. Mixed multi-material SPH particles, which contain abrasive and water, are generated by means of a randomized algorithm, and a material model for the abrasive is presented. This study not only provides a new powerful tool for the simulation of abrasive waterjet machining but is also beneficial for understanding its cutting mechanism and optimizing the operating parameters.
The generation of good pseudo-random numbers is the basis of many important fields in scientific computing, such as randomized algorithms and the numerical solution of stochastic differential equations. In this paper, a class of random number generators (RNGs) based on the Weyl sequence is proposed. The uniformity of these RNGs is proved theoretically, and statistical and numerical computations show the efficiency of the methods.
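A Weyl sequence is one line of arithmetic: x_k = frac(k·α) for an irrational α, which Weyl's equidistribution theorem guarantees is uniform on [0, 1) in the limit. A sketch of the basic generator (practical RNG constructions built on it add further mixing):

```python
import math

def weyl_sequence(alpha, n):
    """First n terms of the Weyl sequence x_k = frac(k * alpha); the
    sequence is equidistributed on [0, 1) when alpha is irrational."""
    return [(k * alpha) % 1.0 for k in range(1, n + 1)]
```

With α = √2, the first 10,000 terms already fill ten equal bins almost evenly.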
Purpose: This paper proposes an expert assignment method for scientific project review that considers both accuracy and impartiality. As impartial and accurate peer review is extremely important to ensure the quality and feasibility of scientific projects, enhanced methods for managing the process are needed. Design/methodology/approach: To ensure both accuracy and impartiality, we design four criteria (the reviewers' fitness degree, research intensity, academic association, and potential conflict of interest) to express the characteristics of an appropriate peer review expert. We first formalize the expert assignment problem as an optimization problem based on the designed criteria, and then propose a randomized algorithm to solve the expert assignment problem of identifying reviewer adequacy. Findings: Simulation results show that the proposed method is quite accurate and impartial during expert assignment. Research limitations: Although the criteria used in this paper properly capture the characteristics of an appropriate peer review expert, more criteria/conditions could be included in the proposed scheme to further enhance the accuracy and impartiality of the expert assignment. Practical implications: The proposed method can help project funding agencies (e.g. the National Natural Science Foundation of China) find better experts for project peer review. Originality/value: To the authors' knowledge, this is the first publication that proposes an algorithm applying an impartial approach to the project review expert assignment process. The simulation results show the effectiveness of the proposed method.
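A randomized assignment step of the kind proposed can be sketched as fitness-weighted sampling that skips conflicted experts. The four criteria are collapsed into a single fitness score here purely for illustration; the names and data shapes are assumptions, not the paper's formulation:

```python
import random

def assign_reviewers(proposals, experts, fitness, coi, per_proposal=2, seed=0):
    """For each proposal, sample distinct experts with probability
    proportional to fitness[(expert, proposal)], excluding any pair
    listed in the conflict-of-interest set coi."""
    rng = random.Random(seed)
    assignment = {}
    for p in proposals:
        eligible = [e for e in experts if (e, p) not in coi]
        chosen = []
        while len(chosen) < per_proposal and eligible:
            weights = [fitness[(e, p)] for e in eligible]
            e = rng.choices(eligible, weights=weights, k=1)[0]
            chosen.append(e)
            eligible.remove(e)  # sample without replacement
        assignment[p] = chosen
    return assignment
```

Conflicted experts can never be drawn, while randomization spreads the workload across equally fit reviewers.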
Funding: supported by the National Natural Science Foundation of China (Nos. 52174001 and 52004064), the Hainan Province Science and Technology Special Fund "Research on Real-time Intelligent Sensing Technology for Closed-loop Drilling of Oil and Gas Reservoirs in Deepwater Drilling" (ZDYF2023GXJS012), and the first batch of Heilongjiang Provincial Government and Daqing Oilfield scientific and technological key projects, "Research on the Construction Technology of Gulong Shale Oil Big Data Analysis System" (DQYT-2022-JS-750).
Funding: Under the auspices of the National Natural Science Foundation of China (No. 52079103).
Abstract: Precise and timely prediction of crop yields is crucial for food security and the development of agricultural policies. However, crop yield is influenced by multiple factors within complex growth environments. Previous research has paid relatively little attention to the interference of environmental factors and drought on the growth of winter wheat. More effective methods are therefore needed to explore the inherent relationships between these factors and crop yield, making precise yield prediction increasingly important. This study used four types of indicators, including meteorological, crop growth status, environmental, and drought indices, from October 2003 to June 2019 in Henan Province as the basic data for predicting winter wheat yield. Using the sparrow search algorithm combined with random forest (SSA-RF) under different input indicators, the accuracy of winter wheat yield estimation was calculated. The estimation accuracy of SSA-RF was compared with partial least squares regression (PLSR), extreme gradient boosting (XGBoost), and random forest (RF) models. Finally, the optimal yield estimation method was used to predict winter wheat yield in three typical years. The findings are as follows: 1) SSA-RF demonstrates superior performance in estimating winter wheat yield compared to the other algorithms; the best estimation is achieved by combining all four indicator types with SSA-RF (R^(2)=0.805, RRMSE=9.9%). 2) Crop growth status and environmental indicators play significant roles in wheat yield estimation, accounting for 46% and 22% of the yield importance among all indicators, respectively. 3) Selecting indicators from October to April of the following year yielded the highest accuracy in winter wheat yield estimation, with an R^(2) of 0.826 and an RRMSE of 9.0%; yield estimates can thus be completed two months before the winter wheat harvest in June. 4) Predictive performance is slightly affected by severe drought. Compared with a severe drought year (2011) (R^(2)=0.680) and a normal year (2017) (R^(2)=0.790), the SSA-RF model has higher prediction accuracy for a wet year (2018) (R^(2)=0.820). This study provides an innovative approach for remote sensing estimation of winter wheat yield.
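The R^(2) and RRMSE figures above are standard goodness-of-fit measures. For reference, a minimal computation of both follows; RRMSE is taken here as RMSE divided by the mean of the observations, a common definition that may differ in detail from the paper's:

```python
def r2_rrmse(y_true, y_pred):
    """Coefficient of determination and relative RMSE (RMSE / mean of observations)."""
    n = len(y_true)
    mean_obs = sum(y_true) / n
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_obs) ** 2 for t in y_true)
    r2 = 1.0 - ss_res / ss_tot
    rrmse = (ss_res / n) ** 0.5 / mean_obs
    return r2, rrmse

# Toy yields (t/ha): close predictions give a high R^2 and a small RRMSE.
r2, rrmse = r2_rrmse([5.0, 6.0, 7.0, 8.0], [5.2, 5.9, 7.1, 7.8])
```

An R^(2) of 1 with an RRMSE of 0 corresponds to a perfect fit; values like those reported (R^(2)≈0.8, RRMSE≈0.09) indicate a strong but imperfect model.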
Funding: supported in part by the National Science Foundation (NSF) (Nos. 1447711, 1743418, and 1843025).
Abstract: Feature selection is a crucial problem in efficient machine learning, and it also greatly contributes to the explainability of machine-driven decisions. Methods like decision trees and the Least Absolute Shrinkage and Selection Operator (LASSO) can select features during training. However, these embedded approaches can only be applied to a small subset of machine learning models. Wrapper-based methods can select features independently of machine learning models, but they often suffer from a high computational cost. To enhance their efficiency, many randomized algorithms have been designed. In this paper, we propose automatic breadth searching and attention searching adjustment approaches to further speed up randomized wrapper-based feature selection. We conduct theoretical computational complexity analysis and further explain our algorithms' generic parallelizability. We conduct experiments on both synthetic and real datasets with different machine learning base models. Results show that, compared with existing approaches, our proposed techniques can locate a more meaningful set of features with high efficiency.
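The paper's breadth and attention searching heuristics are not reproduced here, but the baseline they accelerate, randomized wrapper selection that repeatedly scores random feature subsets against an evaluation function, can be sketched as follows (the toy scoring function is an assumption for illustration; in practice it would be a cross-validated model score):

```python
import random

def randomized_wrapper_select(features, score_fn, n_iter=500, seed=0):
    """Randomized wrapper feature selection: sample random subsets,
    evaluate each with score_fn, and keep the best-scoring subset."""
    rng = random.Random(seed)
    best_subset, best_score = None, float("-inf")
    for _ in range(n_iter):
        k = rng.randint(1, len(features))
        subset = frozenset(rng.sample(features, k))
        s = score_fn(subset)
        if s > best_score:
            best_subset, best_score = subset, s
    return best_subset, best_score

# Toy scorer: features "a" and "b" are informative; every extra feature
# incurs a small cost, so the optimum is exactly {"a", "b"}.
score = lambda s: len(s & {"a", "b"}) - 0.1 * len(s)
best, best_val = randomized_wrapper_select(["a", "b", "c", "d"], score)
```

Each iteration needs one model evaluation, which is why reducing the number of evaluated subsets (as the paper's techniques do) directly reduces wall-clock cost.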
Funding: supported by the Science and Technology Project of Fujian Provincial Department of Education under contract JAT170917 and the Youth Science and Research Foundation of Chengyi College, Jimei University under contract C16005.
Abstract: This paper presents an improved Randomized Circle Detection (RCD) algorithm that uses a circularity characteristic to detect circles in images with complex backgrounds, without relying on the Hough Transform. Experimental results show that the algorithm can locate the circular marks on printed circuit boards (PCBs).
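The core of RCD-style detection, sampling three points, fitting the unique circle through them, and scoring the candidate by how many points lie on it, can be sketched as follows; the tolerance, iteration count, and simple inlier count stand in for the paper's circularity characteristic and are illustrative assumptions:

```python
import math
import random

def circle_from_3pts(p1, p2, p3):
    """Circle through three non-collinear points (circumcenter formula)."""
    (ax, ay), (bx, by), (cx, cy) = p1, p2, p3
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if abs(d) < 1e-12:
        return None  # collinear sample: no unique circle
    a2, b2, c2 = ax * ax + ay * ay, bx * bx + by * by, cx * cx + cy * cy
    ux = (a2 * (by - cy) + b2 * (cy - ay) + c2 * (ay - by)) / d
    uy = (a2 * (cx - bx) + b2 * (ax - cx) + c2 * (bx - ax)) / d
    return ux, uy, math.hypot(ax - ux, ay - uy)

def detect_circle(points, n_iter=300, tol=0.5, seed=1):
    """Randomized circle detection: repeatedly sample 3 points,
    fit a circle, and keep the candidate with the most inliers."""
    rng = random.Random(seed)
    best, best_count = None, 0
    for _ in range(n_iter):
        candidate = circle_from_3pts(*rng.sample(points, 3))
        if candidate is None:
            continue
        ux, uy, r = candidate
        count = sum(1 for (x, y) in points
                    if abs(math.hypot(x - ux, y - uy) - r) < tol)
        if count > best_count:
            best, best_count = candidate, count
    return best, best_count

# Demo: 36 points on a circle of radius 3 centred at (5, 5), plus clutter.
ring = [(5 + 3 * math.cos(a), 5 + 3 * math.sin(a))
        for a in (i * math.pi / 18 for i in range(36))]
(cx0, cy0, r0), inliers = detect_circle(ring + [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)])
```

Because a circle is determined by only three points, each trial is cheap, which is what makes the randomized approach competitive with the Hough Transform's accumulator over the full parameter space.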
Funding: Supported by the Basic and Applied Basic Research Project of Guangdong Province (2021B0301030006).
Abstract: The random forest algorithm was applied to study the nuclear binding energy and charge radius. A regularized root-mean-square error (RMSE) was proposed to avoid overfitting during the training of the random forest. The RMSE for nuclides with Z, N > 7 is reduced to 0.816 MeV and 0.0200 fm compared with the six-term liquid drop model and a three-term nuclear charge radius formula, respectively. Of specific interest are the possible (sub)shells in the superheavy region, which are important for searching for new elements and the island of stability. The significance of shell features estimated by the Shapley additive explanation method suggests (Z, N) = (92, 142) and (98, 156) as possible subshells indicated by the binding energy. Because the presently observed data lie far from the N = 184 shell suggested by mean-field investigations, its shell effect is not predicted by the present training. The significance analysis of the nuclear charge radius suggests Z = 92 and N = 136 as possible subshells. The effect is verified by the shell-corrected nuclear charge radius model.
Abstract: Cloud computing involves remote server deployments with public network infrastructures that allow clients to access computational resources. Virtual Machines (VMs) are supplied on request and launched without interaction from service providers. Intruders can target these servers and establish malicious connections on VMs to carry out attacks on other clustered VMs. Existing systems have issues with execution time and false-positive rates, so overall system performance is degraded considerably. The proposed approach is designed to eliminate cross-VM side-channel attacks and VM escape, and to hide the server's position so that an opponent cannot track the target server beyond a certain point. Every request is passed from source to destination via one broadcast domain to confuse the opponent and prevent them from tracking the server's position. The security resource allocation accepts a safety game in a simple format as input and finds the best coverage vector against the opponent using a Stackelberg Equilibrium (SSE) technique; a Mixed Integer Linear Programming (MILP) framework is used in the algorithm. The VM challenge is reduced by a firewall-based controlling mechanism combining behavior-based detection and signature-based virus detection. The proposed method focuses on detecting malware attacks effectively and providing better security for the VMs. Finally, the experimental results indicate that the proposed security method is efficient: it consumes less execution time and memory and achieves a better false-positive rate and accuracy than the conventional approach.
Abstract: Given the challenge of estimating or calculating quantities of waste electrical and electronic equipment (WEEE) in developing countries, this article focuses on predicting the WEEE generated by Cameroonian small and medium enterprises (SMEs) that are engaged in ISO 14001:2015 initiatives and consume electrical and electronic equipment (EEE) to enhance their performance and profitability. The methodology employed an exploratory approach involving the application of general equilibrium theory (GET) to contextualize the study and generate relevant parameters for deploying the random forest regression learning algorithm for predictions. Machine learning was applied to 80% of the samples for training, while simulation was conducted on the remaining 20% of samples based on quantities of EEE utilized over a specific period, utilization rates, repair rates, and average lifespans. The results demonstrate that the model's predicted values are significantly close to the actual quantities of generated WEEE, and the model's performance, evaluated using the mean squared error (MSE), yielded satisfactory results. Based on this model, both companies and stakeholders can set realistic objectives for managing companies' WEEE, fostering sustainable socio-environmental practices.
Funding: The National Science Fund for Overseas Distinguished Young Scholars (No. 69928201) and the Foundation for University Key Teachers by the Ministry of Education.
Abstract: In the Internet, a group of replicated servers is commonly used to improve the scalability of a network service. Anycast is a new network service that can improve network load distribution and simplify certain applications. In this paper, the authors describe a simple anycast service model for the Internet that does not significantly affect the routing and protocol processing infrastructure already in place, and propose an anycast QoS routing algorithm for this model. The algorithm uses a randomized method to balance network load and improve performance. Several new techniques are introduced. First, the minimum hop count for each node is used as the metric for computing the probability of each possible outgoing link; this metric is precomputed for each node in the network, which reduces complexity and provides the routing process with useful information. Second, randomness is applied at the link level and depends dynamically on the routing configuration. This gives the routing process great flexibility, prevents it from overusing certain fixed routing paths, and adequately balances the delay of the routing paths. The authors assess the quality of the QoS algorithm in terms of the acceptance ratio of anycast QoS requests, and simulation results on a variety of network topologies and parameters show that the algorithm performs well and balances network load effectively.
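The link-level randomness described above can be sketched as a weighted random choice over outgoing links, where each neighbor's precomputed minimum hop count to the nearest anycast server sets its selection probability. The 1/(1+hops) weighting below is an illustrative assumption, not the paper's exact formula:

```python
import random

def pick_next_hop(neighbors, min_hops, rng):
    """Randomly pick an outgoing link, favouring neighbours whose precomputed
    minimum hop count to the nearest anycast server is small."""
    weights = [1.0 / (1 + min_hops[v]) for v in neighbors]
    r = rng.random() * sum(weights)
    acc = 0.0
    for v, w in zip(neighbors, weights):
        acc += w
        if r <= acc:
            return v
    return neighbors[-1]  # numerical guard

# Neighbour "A" is 1 hop from a server, "B" is 9 hops away: "A" should be
# chosen most of the time, but "B" still gets some traffic (load balancing).
rng = random.Random(42)
picks = [pick_next_hop(["A", "B"], {"A": 1, "B": 9}, rng) for _ in range(5000)]
share_a = picks.count("A") / 5000  # expected about (1/2) / (1/2 + 1/10)
```

This is exactly the trade-off the abstract describes: shorter paths are preferred, yet no single fixed path is overused.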
Funding: The research is supported by the National Natural Science Foundation of China under Grant Nos. 11701409 and 11571171, the Natural Science Foundation of Jiangsu Province of China under Grant BK20170591, and the Natural Science Foundation of Jiangsu Higher Education Institutions of China under Grant 17KJB110018.
Abstract: The generalized singular value decomposition (GSVD) of two matrices with the same number of columns is a very useful tool in many practical applications. However, the GSVD may suffer from heavy computational time and memory requirements when the matrices are very large. In this paper, we use random projections to capture most of the action of the matrices and propose randomized algorithms for computing a low-rank approximation of the GSVD. Several error bounds for the approximation are also presented for the proposed randomized algorithms. Finally, experimental results show that the proposed randomized algorithms achieve good accuracy with lower computational cost and storage requirements.
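The random projection step, the common core of such randomized decompositions, can be sketched with NumPy. This follows the well-known randomized range finder and is not the paper's full GSVD algorithm:

```python
import numpy as np

def randomized_range_approx(A, sketch_size, seed=0):
    """Low-rank approximation A ~= Q (Q^T A) via a Gaussian random projection.

    Y = A @ Omega samples the range of A; an orthonormal basis Q of Y then
    captures most of A's action when sketch_size exceeds the numerical rank.
    """
    rng = np.random.default_rng(seed)
    omega = rng.standard_normal((A.shape[1], sketch_size))
    Y = A @ omega
    Q, _ = np.linalg.qr(Y)     # orthonormal basis for the sampled range
    B = Q.T @ A                # small (sketch_size x n) matrix
    return Q, B                # A ~= Q @ B

# For an exactly rank-2 matrix, a sketch of size >= 2 recovers A (up to rounding).
u1, v1 = np.arange(6.0), np.ones(4)
u2, v2 = np.ones(6), np.arange(4.0)
A = np.outer(u1, v1) + np.outer(u2, v2)
Q, B = randomized_range_approx(A, sketch_size=4)
err = np.linalg.norm(A - Q @ B)
```

Downstream factorizations (SVD, QR, or here GSVD) are then computed on the small matrix B instead of A, which is where the savings in time and storage come from.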
Abstract: Spontaneous combustion of coal increases the temperature in the overburden strata adjoining coal seams and poses a challenge when loading blastholes. This condition, known as hot-hole blasting, is dangerous due to the increased possibility of premature explosions in loaded blastholes. Thus, it is crucial to load the blastholes with an appropriate amount of explosives within a short period to avoid premature detonation caused by high blasthole temperatures; doing so also helps achieve the desired fragment size. This study sought to ascertain the variables that most influence mean fragment size and their optimum values for blasting in a fiery seam. Data on blast design, rock mass, and fragmentation from 100 blasts in fiery seams of a coal mine were collected and used to develop mean fragmentation prediction models using soft computational techniques. The coefficient of determination (R^(2)), root mean square error (RMSE), mean absolute error (MAE), mean square error (MSE), variance accounted for (VAF), and coefficient of efficiency in percentage (CE) were calculated to validate the results. They indicate that the random forest algorithm (RFA) outperforms the artificial neural network (ANN), response surface method (RSM), and decision tree (DT). The values of R^(2), RMSE, MAE, MSE, VAF, and CE for RFA are 0.94, 0.034, 0.027, 0.001, 93.58, and 93.01, respectively. Multiple parametric sensitivity analyses (MPSAs) of the input variables showed that the Schmidt hammer rebound number and the spacing-to-burden ratio are the most influential variables for blast fragment size. The analysis was finally used to define the blast design variables that achieve optimum fragment size. The optimum values for RFA of S/B, ld/B and ls/ld are 1.03, 1.85 and 0.7, respectively.
Funding: the National Key R&D Plan of China (No. 2021YFE0105000), the National Natural Science Foundation of China (No. 52074213), the Shaanxi Key R&D Plan Project (No. 2021SF-472), and the Yulin Science and Technology Plan Project (No. CXY-2020-036).
Abstract: Safety patrol inspection in chemical industrial parks is a complex multi-objective task with multiple degrees of freedom. Traditional pointer instruments, with advantages such as high reliability and strong adaptability to harsh environments, are widely applied in such parks. However, they rely on manual readings, which entail a heavy patrol workload, high labor cost, high false positive/negative rates, and poor timeliness. To address these problems, this study proposes a path planning method for robot patrols in chemical industrial parks, in which a path optimization model based on an improved iterated local search and random variable neighborhood descent (ILS-RVND) algorithm is established by integrating the actual requirements of patrol tasks in chemical industrial parks. The effectiveness of the model and algorithm is verified using real park data as an example. The results show that, compared with GA and ILS-RVND, the improved algorithm reduces quantification cost by about 24% and saves patrol time by about 36%. Apart from shortening robot patrol time, optimizing patrol paths, and reducing maintenance loss, the proposed algorithm also avoids untimely patrols and enhances the safety factor of the equipment.
Funding: funded by the National Natural Science Foundation of China (42071014).
Abstract: The Gobi spans a large area of China, surpassing the combined expanse of mobile dunes and semi-fixed dunes. Its presence significantly influences the movement of sand and dust. However, the complex origins and diverse materials constituting the Gobi result in notable differences in saltation processes across various Gobi surfaces, and it is challenging to describe these processes with a uniform morphology. It therefore becomes imperative to characterize the surface through parameters such as the three-dimensional (3D) size and shape of gravel. Collecting morphology information for Gobi gravels is essential for studying its genesis and sand saltation. To enhance the efficiency and information yield of gravel parameter measurements, this study conducted field experiments in the Gobi region across Dunhuang City, Guazhou County, and Yumen City (administered by Jiuquan City), Gansu Province, China in March 2023. A research framework and methodology for measuring 3D parameters of gravel using point clouds were developed, alongside improved calculation formulas for 3D parameters including gravel grain size, volume, flatness, roundness, sphericity, and equivalent grain size. Leveraging multi-view geometry for 3D reconstruction allowed an optimal data acquisition scheme to be established, characterized by high point cloud reconstruction efficiency and clear quality. Additionally, the proposed methodology incorporated point cloud clustering, segmentation, and filtering techniques to isolate individual gravel point clouds. Advanced point cloud algorithms, including the Oriented Bounding Box (OBB), point cloud slicing, and point cloud triangulation, were then deployed to calculate the 3D parameters of individual gravels. These systematic processes allow precise and detailed characterization of individual gravels. For gravel grain size and volume, the correlation coefficients between point cloud and manual measurements all exceeded 0.9000, confirming the feasibility of the proposed methodology for measuring 3D parameters of individual gravels. The proposed workflow yields accurate calculations of relevant parameters for Gobi gravels, providing essential data support for subsequent studies of Gobi environments.
Funding: supported by the Major Program of the National Natural Science Foundation of China (No. 32192434), the Fundamental Research Funds of the Chinese Academy of Forestry (No. CAFYBB2019ZD001), and the National Key Research and Development Program of China (2016YFD060020602).
Abstract: Accurately estimating the volume growth of forest ecosystems is important for understanding carbon sequestration and achieving carbon neutrality goals. However, the key environmental factors affecting volume growth differ across scales and plant functional types. This study was therefore conducted to estimate the volume growth of Larix and Quercus forests, and its influencing factors, based on national-scale forestry inventory data in China, using random forest algorithms. The results showed that model performance for volume growth in natural forests (R^(2)=0.65 for Larix and 0.66 for Quercus) was better than in planted forests (R^(2)=0.44 for Larix and 0.40 for Quercus). In both natural and planted forests, stand age showed a strong relative importance for volume growth (8.6%–66.2%), while edaphic and climatic variables had limited relative importance (<6.0%). The relationship between stand age and volume growth was unimodal in natural forests and linearly increasing in planted Quercus forests. The specific locations (i.e., altitude and aspect) of sampling plots exhibited high relative importance for volume growth in planted forests (4.1%–18.2%). Altitude positively affected volume growth in planted Larix forests but negatively controlled volume growth in planted Quercus forests. Similarly, the effects of other environmental factors on volume growth also differed across stand origins (planted versus natural) and plant functional types (Larix versus Quercus). These results highlight that stand age was the most important predictor of volume growth and that the effects of environmental factors vary among stand origins and plant functional types. Our findings provide a good framework for site-specific recommendations regarding the management practices necessary to maintain volume growth in China's forest ecosystems.
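The relative importance figures quoted above come from the random forest's variable importance. A minimal stand-in is permutation importance for an arbitrary fitted model: shuffle one feature column and measure how much the error grows. The toy model and data below are assumptions for illustration:

```python
import random

def permutation_importance(predict, X, y, col, n_repeats=20, seed=0):
    """Mean increase in MSE after shuffling column `col` of X.

    predict: callable mapping one feature row to a prediction.
    A large increase means the model relies heavily on that feature.
    """
    rng = random.Random(seed)

    def mse(rows):
        return sum((predict(r) - t) ** 2 for r, t in zip(rows, y)) / len(y)

    base = mse(X)
    increases = []
    for _ in range(n_repeats):
        rows = [list(r) for r in X]
        col_vals = [r[col] for r in rows]
        rng.shuffle(col_vals)          # break the feature-target link
        for r, v in zip(rows, col_vals):
            r[col] = v
        increases.append(mse(rows) - base)
    return sum(increases) / n_repeats

# Toy "model" that uses only feature 0 (e.g. stand age): feature 0 should
# look important, feature 1 (an irrelevant covariate) should not.
X = [[1.0, 7.0], [2.0, 3.0], [3.0, 9.0], [4.0, 1.0], [5.0, 5.0]]
y = [1.0, 2.0, 3.0, 4.0, 5.0]
model = lambda row: row[0]
imp0 = permutation_importance(model, X, y, col=0)
imp1 = permutation_importance(model, X, y, col=1)
```

A dominant feature such as stand age in the study above would show a large error increase when permuted, while edaphic and climatic variables with <6% relative importance would show almost none.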
Funding: This project is supported by the National Natural Science Foundation of China (No. 10572117), the Aerospace Science Foundation of China (No. N3CH0502, No. N5CH0001), and the Provincial Natural Science Foundation of Shanxi, China (No. N3CS0501).
Abstract: An efficient importance sampling algorithm is presented to analyze the reliability of complex structural systems with multiple failure modes and fuzzy-random uncertainties in basic variables and failure modes. To improve sampling efficiency, the simulated annealing algorithm is adopted to optimize the density center of the importance sampling distribution for each failure mode, so that points contributing more significantly to the fuzzy failure probability are sampled with higher probability. For a system with multiple fuzzy failure modes, a weighted, mixed importance sampling function is constructed. The contribution of each fuzzy failure mode to the system failure probability is represented by appropriate factors, further improving sampling efficiency. The variances and coefficients of variation are derived for the failure probability estimates. Two examples illustrate the rationality of the present method. Comparison with the direct Monte Carlo method verifies the improved efficiency and precision of the method.
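The variance reduction at the heart of importance sampling can be illustrated on a scalar toy problem: estimating a small normal tail probability by sampling from a density shifted toward the failure region and reweighting by the likelihood ratio. The shift value and the toy limit state are assumptions for illustration; the paper instead optimizes the density center with simulated annealing:

```python
import math
import random

def importance_sampling_tail(threshold, shift, n_samples=100000, seed=0):
    """Estimate P(X > threshold) for X ~ N(0,1) by sampling X ~ N(shift,1)
    and reweighting with the likelihood ratio phi(x) / phi(x - shift)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        x = rng.gauss(shift, 1.0)
        if x > threshold:
            # phi(x)/phi(x-s) simplifies to exp(-s*x + s^2/2)
            total += math.exp(-shift * x + 0.5 * shift * shift)
    return total / n_samples

est = importance_sampling_tail(threshold=3.0, shift=3.0)
true = 0.5 * math.erfc(3.0 / math.sqrt(2.0))  # exact P(X > 3), about 1.35e-3
```

Direct Monte Carlo would see only about 135 failures per 100,000 samples here; shifting the sampling density to the failure region makes roughly half the samples informative, which is the same rationale as centering the importance density at each failure mode in the paper.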
Abstract: The performance of central processing units (CPUs) can be enhanced by integrating multiple cores into a single chip, and CPU performance can be improved by allocating tasks with an intelligent strategy. If small tasks wait or execute for a long time, the CPU consumes more power; thus, the amount of power consumed by CPUs can be reduced without increasing the frequency. Lines connect the cores, which are organized together to form a network called a network on chip (NoC). NoCs are mainly used in the design of processors, but their performance can still be enhanced by reducing power consumption. The main problem lies in task scheduling that fully utilizes the network. Here, we propose a novel random-fit algorithm for NoCs based on power-aware optimization. In this algorithm, tasks belonging to the same application are mapped to neighboring cores, whereas tasks belonging to different applications are mapped to processor cores through a series of steps. This scheduling is performed at run time. Experimental results show that the proposed random-fit algorithm reduces power consumption and increases system performance through effective scheduling.
Funding: supported by the National Natural Science Foundation of China (Grant Nos. 52074085, U21A20117 and U21A20475), the Fundamental Research Funds for the Central Universities (Grant No. N2004010), and the Liaoning Revitalization Talents Program (XLYC1907065).
Abstract: The quality of hot-rolled steel strip is directly affected by the strip crown. Traditional machine learning models have shown limitations in accurately predicting the strip crown, particularly when dealing with imbalanced data. This limitation results in poor production quality and efficiency, leading to increased production costs. Thus, a novel strip crown prediction model that uses the Boruta and extremely randomized trees (Boruta-ERT) algorithms was proposed to address this issue. To improve the accuracy of the model, the synthetic minority over-sampling technique (SMOTE) was utilized to balance the imbalanced data sets. The Boruta-ERT prediction model was then used to select features and predict the strip crown. With the 2160 mm hot rolling production lines of a steel plant serving as the research object, the experimental results showed that 97.01% of the predictions have an absolute error of less than 8 μm. This level of accuracy meets the control requirements for strip crown and demonstrates significant benefits for improving the production quality of steel strip.
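The class-balancing step can be sketched as SMOTE-style interpolation: each synthetic minority sample lies on the segment between a real minority sample and one of its nearest minority neighbours. The 2-D points and the value of k below are illustrative assumptions:

```python
import random

def smote_like_oversample(minority, n_new, k=3, seed=0):
    """Generate n_new synthetic minority samples by linear interpolation
    between a random minority sample and one of its k nearest neighbours."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        x = rng.choice(minority)
        # k nearest neighbours of x (excluding x itself) by squared distance
        neighbours = sorted(
            (p for p in minority if p is not x),
            key=lambda p: sum((a - b) ** 2 for a, b in zip(x, p)),
        )[:k]
        z = rng.choice(neighbours)
        gap = rng.random()  # interpolation factor in [0, 1)
        synthetic.append(tuple(a + gap * (b - a) for a, b in zip(x, z)))
    return synthetic

minority = [(0.0, 0.0), (1.0, 0.2), (0.4, 1.0), (0.8, 0.8), (0.2, 0.6)]
new_samples = smote_like_oversample(minority, n_new=20)
```

Because each synthetic point is a convex combination of two real minority samples, the new samples stay inside the minority class's region of feature space rather than being naive duplicates.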
Funding: supported by the Shandong Provincial Natural Science Foundation of China (Grant No. Y2007A07).
Abstract: In abrasive waterjet machining (AWJM) simulation, most of the literature applies the finite element method (FEM) to build pure waterjet models or single-abrasive-particle erosion models. To overcome the mesh distortion caused by large deformation in FEM and to consider the effects of both water and abrasive, a smoothed particle hydrodynamics (SPH) coupled FEM model for AWJM simulation is presented, in which the abrasive waterjet is modeled by SPH particles and the target material by FEM; the two parts interact through a contact algorithm. Using this model, a high-velocity abrasive waterjet penetrating the target material is simulated and the erosion mechanism is depicted. The relationships between the depth of penetration and jet parameters, including water pressure and traverse speed, are analyzed based on the simulation, and the simulation results agree well with existing experimental data. Mixed multi-material SPH particles, containing both abrasive and water, are generated by means of a randomized algorithm, and a material model for the abrasive is presented. The study not only provides a new and powerful tool for the simulation of abrasive waterjet machining, but also helps in understanding its cutting mechanism and optimizing the operating parameters.
Funding: Supported by the National Natural Science Foundation of China (19871047) and the National Key Basic Research Special Fund (1998020306).
Abstract: The generation of good pseudo-random numbers is the basis of many important fields in scientific computing, such as randomized algorithms and the numerical solution of stochastic differential equations. In this paper, a class of random number generators (RNGs) based on the Weyl sequence is proposed. The uniformity of these RNGs is proved theoretically, and statistical and numerical computations show the efficiency of the methods.
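A Weyl sequence takes the fractional parts of integer multiples of an irrational number, x_k = {k·α}, which are equidistributed on [0, 1). A minimal generator is sketched below; the paper's actual constructions are more elaborate:

```python
import math

def weyl_sequence(alpha, n):
    """First n terms of the Weyl sequence x_k = frac(k * alpha), k = 1..n.

    For irrational alpha the sequence is equidistributed on [0, 1),
    which is the uniformity property such RNGs build on.
    """
    return [math.fmod(k * alpha, 1.0) for k in range(1, n + 1)]

xs = weyl_sequence(math.sqrt(2.0), 10000)
```

For α = √2 the sample mean converges quickly to 1/2, the mean of the uniform distribution on [0, 1), reflecting the low discrepancy of the sequence.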
Funding: supported by the National Natural Science Foundation of China under grant No. 7160325 and the Young Talent-Field Frontier Project of the Wuhan Documentation and Information Center, Chinese Academy of Sciences.
Abstract: Purpose: This paper proposes an expert assignment method for scientific project review that considers both accuracy and impartiality. As impartial and accurate peer review is extremely important to ensure the quality and feasibility of scientific projects, enhanced methods for managing the process are needed. Design/methodology/approach: To ensure both accuracy and impartiality, we design four criteria, the reviewers' fitness degree, research intensity, academic association, and potential conflict of interest, to express the characteristics of an appropriate peer review expert. We first formalize the expert assignment problem as an optimization problem based on the designed criteria, and then propose a randomized algorithm to solve the expert assignment problem of identifying reviewer adequacy. Findings: Simulation results show that the proposed method is quite accurate and impartial in expert assignment. Research limitations: Although the criteria used in this paper properly capture the characteristics of an appropriate peer review expert, more criteria/conditions could be included to further enhance the accuracy and impartiality of the assignment. Practical implications: The proposed method can help project funding agencies (e.g., the National Natural Science Foundation of China) find better experts for project peer review. Originality/value: To the authors' knowledge, this is the first publication proposing an algorithm that applies an impartial approach to the project review expert assignment process. The simulation results show the effectiveness of the proposed method.