Real-time intelligent lithology identification while drilling is vital to realizing downhole closed-loop drilling. The complex and changeable geological environment encountered during drilling poses many challenges for lithology identification. This paper addresses the difficulties of feature extraction, the low precision of thin-layer identification, and the limited applicability of models in intelligent lithology identification. The authors improve the comprehensive performance of the lithology identification model from three aspects: data feature extraction, class balance, and model design. A new real-time intelligent lithology identification model based on a dynamic felling strategy weighted random forest algorithm (DFW-RF) is proposed. According to the feature selection results, gamma ray and 2 MHz phase resistivity are the logging-while-drilling (LWD) parameters that most significantly influence lithology identification. The comprehensive performance of the DFW-RF lithology identification model has been verified through application to 3 wells in different areas. Compared with the predictions of five typical lithology identification algorithms, the DFW-RF model achieves a higher identification accuracy and F1 score. The model improves the identification accuracy of thin-layer lithology and is effective and feasible in different geological environments. The DFW-RF model plays a truly efficient role in the real-time intelligent identification of lithologic information in closed-loop drilling, has greater applicability, and is worthy of wide use in logging interpretation.
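As a rough illustration of the class-balance idea mentioned above: the paper's dynamic felling and weighting strategy is not detailed here, so the sketch below only shows scikit-learn's built-in per-class reweighting inside a random forest on a synthetic, imbalanced "thin layer" dataset; every name and parameter is an illustrative assumption.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Synthetic LWD-like features with a rare "thin layer" class (illustrative only).
X, y = make_classification(n_samples=3000, n_features=8, n_informative=5,
                           n_classes=3, weights=[0.75, 0.2, 0.05],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# "balanced_subsample" reweights classes inside every bootstrap sample,
# one simple way to counter class imbalance in a random forest.
rf = RandomForestClassifier(n_estimators=300,
                            class_weight="balanced_subsample",
                            random_state=0)
rf.fit(X_tr, y_tr)
print(classification_report(y_te, rf.predict(X_te), digits=3))
```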
Precise and timely prediction of crop yields is crucial for food security and the development of agricultural policies. However, crop yield is influenced by multiple factors within complex growth environments. Previous research has paid relatively little attention to the interference of environmental factors and drought on the growth of winter wheat. Therefore, more effective methods are urgently needed to explore the inherent relationship between these factors and crop yield, making precise yield prediction increasingly important. This study used four types of indicators (meteorological, crop growth status, environmental, and drought index) from October 2003 to June 2019 in Henan Province as the basic data for predicting winter wheat yield. Using the sparrow search algorithm combined with random forest (SSA-RF) under different input indicators, the accuracy of winter wheat yield estimation was calculated. The estimation accuracy of SSA-RF was compared with partial least squares regression (PLSR), extreme gradient boosting (XGBoost), and random forest (RF) models. Finally, the optimal yield estimation method was used to predict winter wheat yield in three typical years. The findings are as follows: 1) SSA-RF demonstrates superior performance in estimating winter wheat yield compared to the other algorithms; the best estimation is achieved when all four types of indicators are combined with SSA-RF (R²=0.805, RRMSE=9.9%). 2) Crop growth status and environmental indicators play significant roles in wheat yield estimation, accounting for 46% and 22% of the yield importance among all indicators, respectively. 3) Selecting indicators from October to April of the following year yielded the highest accuracy in winter wheat yield estimation, with an R² of 0.826 and an RRMSE of 9.0%; yield estimates can therefore be completed two months before the winter wheat harvest in June. 4) Prediction performance is slightly affected by severe drought: compared with a severe drought year (2011, R²=0.680) and a normal year (2017, R²=0.790), the SSA-RF model has higher prediction accuracy for a wet year (2018, R²=0.820). This study provides an innovative approach for remote sensing estimation of winter wheat yield.
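The sparrow search algorithm is not available in common Python libraries, so the sketch below substitutes a plain random search for the metaheuristic while keeping the overall SSA-RF workflow described above: propose RF hyperparameters, score them, keep the best, and report R² and RRMSE. The data and parameter ranges are assumptions for illustration only.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X, y = make_regression(n_samples=600, n_features=10, noise=15.0, random_state=0)
y = y + 6000  # shift to a positive "yield-like" scale so RRMSE is meaningful
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

def rrmse(y_true, y_pred):
    return np.sqrt(np.mean((y_true - y_pred) ** 2)) / np.mean(y_true)

best = None
for _ in range(20):  # candidate hyperparameter sets proposed by the "searcher"
    params = {"n_estimators": int(rng.integers(50, 400)),
              "max_depth": int(rng.integers(3, 20)),
              "min_samples_leaf": int(rng.integers(1, 8))}
    model = RandomForestRegressor(random_state=0, **params).fit(X_tr, y_tr)
    score = r2_score(y_te, model.predict(X_te))
    if best is None or score > best[0]:
        best = (score, params, model)

score, params, model = best
print("best params:", params)
print("R2 = %.3f, RRMSE = %.1f%%" % (score, 100 * rrmse(y_te, model.predict(X_te))))
```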
The sampling problem for input-queued (IQ) randomized scheduling algorithms is analyzed. We observe that if the current scheduling decision is a maximum weighted matching (MWM), the MWM for the next slot mostly falls among those matchings whose weight is close to the current MWM. Using this heuristic, a novel randomized algorithm for IQ scheduling, named the genetic algorithm-like scheduling algorithm (GALSA), is proposed. An evolutionary strategy is used for choosing sampling points in GALSA. GALSA works with only O(N) samples, which means that GALSA has lower complexity than the well-known randomized scheduling algorithm APSARA. Simulation results show that the delay performance of GALSA is quite competitive with that of APSARA.
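The heuristic described above (the next slot's heavy matching is usually near the current one) can be sketched by representing matchings as permutations of output ports, keeping the previous matching, and comparing it against a handful of random perturbations each slot. This is a simplified stand-in for GALSA's evolutionary sampling, not the paper's exact operators; the switch size and queue values are made up.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 8                                   # N x N input-queued switch
Q = rng.integers(0, 50, size=(N, N))    # virtual output queue lengths

def weight(match):
    # match[i] = output port served by input i; weight = total queue length served
    return Q[np.arange(N), match].sum()

def mutate(match):
    # swap the outputs of two inputs -> another valid matching (permutation)
    m = match.copy()
    i, j = rng.choice(N, size=2, replace=False)
    m[i], m[j] = m[j], m[i]
    return m

current = rng.permutation(N)            # matching used in the previous slot
# O(N) samples per slot: the old matching, a few mutations of it, one fresh draw
candidates = [current] + [mutate(current) for _ in range(N - 2)] + [rng.permutation(N)]
current = max(candidates, key=weight)
print("chosen matching:", current, "weight:", weight(current))
```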
Random vibration control is aimed at reproducing the power spectral density (PSD) at specified control points. The classical frequency-spectrum equalization algorithm needs to compute the average of multiple frequency response functions (FRFs), which lengthens the control loop time in the equalization process. Likewise, the feedback control algorithm has a very slow convergence rate because the feedback gain must be kept small to ensure stability of the system. To overcome these limitations, an adaptive inverse control of random vibrations based on the filtered-X least mean-square (LMS) algorithm is proposed. Furthermore, according to the description and iteration characteristics of random vibration tests in the frequency domain, the frequency-domain LMS algorithm is adopted to refine the inverse characteristic of the FRF instead of the traditional time-domain LMS algorithm. This inverse characteristic, called the impedance function of the system under control, is used to update the drive PSD directly. The test results indicate that, in addition to avoiding the instability problem that can occur during iteration, the adaptive control strategy shortens the control loop and reduces the time needed to achieve equalization.
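A minimal sketch of the frequency-domain idea: a bin-wise normalized complex LMS update that learns the inverse of a frequency response function, i.e. the impedance used to update the drive PSD. The plant FRF, step size, and block structure below are illustrative assumptions, and the full filtered-X control loop of the paper is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)
n_bins = 64
f = np.arange(1, n_bins + 1)
H = 1.0 / (1.0 + 0.05j * f)          # illustrative complex FRF of the shaker/fixture
W = np.zeros(n_bins, dtype=complex)  # adaptive estimate of the inverse FRF (impedance)
mu, eps = 0.5, 1e-8

for _ in range(200):                 # one random-excitation block per iteration
    X = rng.standard_normal(n_bins) + 1j * rng.standard_normal(n_bins)
    Y = H * X                        # measured response spectrum
    E = X - W * Y                    # inverse-modelling error, bin by bin
    W += mu * E * np.conj(Y) / (np.abs(Y) ** 2 + eps)   # normalized complex LMS update

print("max |W - 1/H| =", np.max(np.abs(W - 1.0 / H)))    # ~0 after convergence
```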
This paper proposes an adaptive chaos quantum honey bee algorithm (CQHBA) for solving chance-constrained programming in a random fuzzy environment based on random fuzzy simulations. Random fuzzy simulation is designed to estimate the chance of a random fuzzy event and the optimistic value of a random fuzzy variable. In CQHBA, each bee carries a group of quantum bits representing a solution. Chaos optimization searches the space around the selected best-so-far food source. In the marriage process, random interferential discrete quantum crossover is performed between selected drones and the queen. Gaussian quantum mutation is used to maintain the diversity of the whole population. New methods of computing quantum rotation angles are designed based on gradients. A proof of convergence for CQHBA is developed and a theoretical analysis of the computational overhead of the algorithm is presented. Numerical examples demonstrate its superiority in robustness and stability, computational efficiency, success rate, and accuracy of solution quality. CQHBA is shown to be highly robust under various conditions and capable of handling most random fuzzy programming problems with arbitrary parameter settings, variable initializations, system tolerances, confidence levels, perturbations, and noise.
Precise recovery of Coalbed Methane (CBM) based on transparent reconstruction of geological conditions is a branch of intelligent mining. The process of permeability reconstruction, ranging from data perception to real-time data visualization, is applicable to disaster risk warning and intelligent decision-making on gas drainage. In this study, a machine learning method integrating the Random Forest (RF) and the Genetic Algorithm (GA) was established for permeability prediction in the Xishan Coalfield based on Uniaxial Compressive Strength (UCS), effective stress, temperature and gas pressure. A total of 50 sets of data collected by a self-developed apparatus were used to generate datasets for training and validating models. Statistical measures including the coefficient of determination (R²) and Root Mean Square Error (RMSE) were selected to validate and compare the predictive performances of the single RF model and the hybrid RF–GA model. Furthermore, sensitivity studies were conducted to evaluate the importance of input parameters. The results show that the proposed RF–GA model is robust in predicting the permeability; UCS is directly correlated to permeability, while all other inputs are inversely related to permeability; the effective stress exerts the greatest impact on permeability based on importance score, followed by the temperature (or gas pressure) and UCS. The partial dependence plots, indicative of the marginal utility of each feature in permeability prediction, are in line with experimental results. Thus, the proposed hybrid model (RF–GA) is capable of predicting permeability and thus beneficial to precise CBM recovery.
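The importance-ranking part of such a workflow can be sketched with a plain random forest regressor: impurity-based and permutation importances for four inputs standing in for UCS, effective stress, temperature, and gas pressure. The data below are synthetic, not the Xishan Coalfield measurements, and the GA tuning stage is sketched separately after the intrusion-detection abstract further down.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)
n = 200
# Synthetic stand-ins for the four inputs (NOT the paper's laboratory data).
ucs, stress, temp, gas_p = (rng.uniform(0, 1, n) for _ in range(4))
# Illustrative response: permeability rises with UCS, falls with the others.
perm = 0.2 * ucs - 0.6 * stress - 0.3 * temp - 0.3 * gas_p + 0.05 * rng.standard_normal(n)

X = np.column_stack([ucs, stress, temp, gas_p])
names = ["UCS", "effective stress", "temperature", "gas pressure"]
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, perm)

perm_imp = permutation_importance(rf, X, perm, n_repeats=20, random_state=0)
for name, imp_tree, imp_perm in zip(names, rf.feature_importances_, perm_imp.importances_mean):
    print(f"{name:18s} impurity={imp_tree:.3f} permutation={imp_perm:.3f}")
```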
Anomaly classification based on network traffic features is an important task for monitoring and detecting network intrusion attacks. Network-based intrusion detection systems (NIDSs) using machine learning (ML) methods are effective tools for protecting network infrastructures and services from unpredictable and unseen attacks. Among several ML methods, random forest (RF) is a robust method that can be used in ML-based network intrusion detection solutions. However, the minimum number of instances required for each split and the number of trees in the forest are two key RF parameters that affect classification accuracy. Therefore, optimal parameter selection is a real problem in RF-based anomaly classification for intrusion detection systems. In this paper, we propose using the genetic algorithm (GA) to select appropriate values of these two parameters, optimizing the RF classifier and improving the classification accuracy of normal and abnormal network traffic. To validate the proposed GA-based RF model, a number of experiments are conducted on two public datasets and evaluated using a set of performance evaluation measures. In these experiments, the accuracy is compared with the accuracies of baseline ML classifiers from recent works. Experimental results reveal that the proposed model can avert the uncertainty in selecting the values of RF's parameters, improving the accuracy of anomaly classification in NIDSs without incurring excessive time.
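A compact sketch of the GA described above, evolving exactly the two RF parameters named in the abstract (number of trees and minimum samples per split) with cross-validation accuracy as the fitness. The dataset, population size, operators, and rates are illustrative assumptions rather than the paper's settings.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=800, n_features=20, weights=[0.8, 0.2], random_state=0)

def fitness(genes):
    n_trees, min_split = genes
    clf = RandomForestClassifier(n_estimators=int(n_trees),
                                 min_samples_split=int(min_split),
                                 random_state=0)
    return cross_val_score(clf, X, y, cv=3).mean()

def random_individual():
    return np.array([rng.integers(20, 200), rng.integers(2, 20)])

pop = [random_individual() for _ in range(6)]
for gen in range(4):                                  # tiny GA: selection, crossover, mutation
    scored = sorted(pop, key=fitness, reverse=True)
    parents = scored[:3]                              # truncation selection
    children = []
    while len(children) < len(pop) - len(parents):
        a, b = rng.choice(len(parents), size=2, replace=False)
        child = np.array([parents[a][0], parents[b][1]])   # one-point crossover
        if rng.random() < 0.3:                              # mutation of one gene
            idx = rng.integers(0, 2)
            child[idx] = random_individual()[idx]
        children.append(child)
    pop = parents + children

best = max(pop, key=fitness)
print("best (n_estimators, min_samples_split):", best, "cv accuracy: %.3f" % fitness(best))
```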
This paper reviews sixty-eight research articles published between 2000 and 2017, as well as textbooks, that employed four classification algorithms as their main statistical tools: K-Nearest-Neighbor (KNN), Support Vector Machines (SVM), Random Forest (RF), and Neural Network (NN). The aim was to examine and compare these nonparametric classification methods on the following attributes: robustness to training data, sensitivity to changes, data fitting, stability, ability to handle large data sizes, sensitivity to noise, time invested in parameter tuning, and accuracy. The performances, strengths and shortcomings of each of the algorithms were examined, and finally a conclusion was reached on which one performs best. It was evident from the literature reviewed that RF is overly sensitive to small changes in the training dataset, is occasionally unstable, and tends to overfit. KNN is easy to implement and understand but has the major drawback of becoming significantly slow as the size of the data grows, and the ideal value of K for the KNN classifier is difficult to set. SVM and RF are insensitive to noise or overtraining, which shows their ability to deal with unbalanced data. Larger input datasets lengthen classification times for NN and KNN more than for SVM and RF. Among these nonparametric classification methods, NN has the potential to become a more widely used classification algorithm, but because of the time-consuming parameter tuning procedure, the high level of computational complexity, the numerous NN architectures to choose from, and the many algorithms used for training, most researchers recommend SVM and RF as easier and more readily usable methods that repeatedly achieve results with high accuracy and are often faster to implement.
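This kind of comparison can be reproduced in miniature with scikit-learn by training the four classifiers on one dataset and recording accuracy and training time; the dataset and hyperparameters below are arbitrary stand-ins, not those of the reviewed studies.

```python
import time
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=5000, n_features=30, n_informative=15, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "KNN": make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5)),
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)),
    "RF":  RandomForestClassifier(n_estimators=200, random_state=0),
    "NN":  make_pipeline(StandardScaler(), MLPClassifier(hidden_layer_sizes=(64,),
                                                         max_iter=500, random_state=0)),
}
for name, model in models.items():
    t0 = time.perf_counter()
    model.fit(X_tr, y_tr)
    print(f"{name:4s} accuracy={model.score(X_te, y_te):.3f} "
          f"train time={time.perf_counter() - t0:.2f}s")
```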
To solve the complex weight-matrix derivative problem that arises when the weighted least squares method is used to estimate the parameters of the mixed additive and multiplicative random error model (MAM error model), we use a derivative-free improved artificial bee colony algorithm and the bootstrap method to estimate the parameters and evaluate the accuracy of the MAM error model. The improved artificial bee colony algorithm can update individuals in multiple dimensions and improve the cooperation between individuals by constructing a new search equation based on the idea of quasi-affine transformation. The experimental results show that, based on the weighted least squares criterion, the algorithm obtains results consistent with the weighted least squares method without lengthy formula derivation. The parameter estimation and accuracy evaluation method based on the bootstrap yields better parameter estimates and more reasonable accuracy information than existing methods, which provides a new idea for the theory of parameter estimation and accuracy evaluation of the MAM error model.
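The bootstrap accuracy-evaluation step can be illustrated independently of the MAM error model: refit an estimator on resampled observations and read parameter uncertainty off the bootstrap distribution. The simple linear model and ordinary least squares below are stand-ins for the MAM model and the derivative-free estimator.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 120
x = np.linspace(0, 10, n)
A = np.column_stack([np.ones(n), x])          # design matrix of a simple stand-in model
true_beta = np.array([2.0, 0.7])
y = A @ true_beta + 0.4 * rng.standard_normal(n)

def estimate(A, y):
    # ordinary least squares as a stand-in for the derivative-free estimator
    return np.linalg.lstsq(A, y, rcond=None)[0]

beta_hat = estimate(A, y)
boot = np.empty((1000, 2))
for b in range(1000):                          # pairs bootstrap: resample observations
    idx = rng.integers(0, n, size=n)
    boot[b] = estimate(A[idx], y[idx])

print("estimate:", beta_hat)
print("bootstrap std:", boot.std(axis=0))
print("95% CI lower/upper:\n", np.percentile(boot, [2.5, 97.5], axis=0))
```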
This study investigates the multi-solution search of the optimized quantum random-walk search algorithm on the hypercube. By generalizing the abstract search algorithm, a general tool for analyzing search on a graph, to the multi-solution case, it can be applied directly to analyze the multi-solution case of quantum random-walk search on a graph. Thus, the computational complexity of the optimized quantum random-walk search algorithm for the multi-solution search is obtained. Through numerical simulations and analysis, we obtain a critical value of the proportion of solutions q. For a given q, we derive the relationship between the success rate of the algorithm and the number of iterations when q is no larger than the critical value.
Evolutionary computation is a kind of adaptive non-numerical computation method designed to simulate the evolution of nature. In this paper, evolutionary algorithm behavior is described in terms of the construction and evolution of the sampling distributions over the space of candidate solutions. Iterative construction of the sampling distributions is based on the idea of the global random search of generational methods. Under this framework, proportional selection is characterized as a global search operator, and recombination is characterized as a search process that exploits similarities. It is shown that, by properly constraining the search breadth of recombination operators, weak convergence of evolutionary algorithms to a global optimum can be ensured.
Aiming at the poor location accuracy caused by the harsh and complex underground environment, long strip roadways, limited wireless transmission, and sparse anchor nodes, an underground location algorithm based on random forest and compensation for environmental factors was proposed. Firstly, the underground wireless access point (AP) network model and tunnel environment were analyzed, and the fingerprint location algorithm was built. The Received Signal Strength (RSS) was then filtered with a Kalman filter in both the offline sampling and real-time positioning stages. Meanwhile, a target speed constraint was introduced to reduce the error caused by environmental factors. The experimental results show that the proposed algorithm solves the problem of insufficient location accuracy and large environment-induced fluctuations when the anchor nodes are sparse. The average location accuracy reaches three meters, which can satisfy the requirements of underground rescue, activity track playback, and disaster monitoring and positioning, and the algorithm has high application value in complex underground environments.
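The RSS filtering step can be sketched with a scalar Kalman filter applied to a noisy signal-strength stream; the process and measurement noise values are illustrative guesses rather than the paper's tuning.

```python
import numpy as np

rng = np.random.default_rng(0)
true_rss = -60.0 + np.cumsum(0.05 * rng.standard_normal(300))  # slowly drifting signal
measured = true_rss + 3.0 * rng.standard_normal(300)           # noisy RSS readings (dBm)

def kalman_smooth(z, q=0.05, r=9.0):
    """Scalar Kalman filter: constant-level model with process noise q, measurement noise r."""
    x, p = z[0], 1.0
    out = np.empty_like(z)
    for k, zk in enumerate(z):
        p = p + q                    # predict
        k_gain = p / (p + r)         # update
        x = x + k_gain * (zk - x)
        p = (1.0 - k_gain) * p
        out[k] = x
    return out

filtered = kalman_smooth(measured)
print("raw error std:      %.2f dB" % np.std(measured - true_rss))
print("filtered error std: %.2f dB" % np.std(filtered - true_rss))
```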
This paper investigates the effects of decoherence generated by broken-link-type noise in the hypercube on an optimized quantum random-walk search algorithm. When random broken links occur in the hypercube, the optimized quantum random-walk search algorithm with decoherence is described by defining a shift operator that includes the possibility of broken links. For a given database size, we obtain the maximum success rate of the algorithm and the required number of iterations through numerical simulations and analysis when the algorithm is subject to decoherence. The computational complexity of the algorithm with decoherence is then obtained. The results show that the ultimate effect of broken-link-type decoherence on the optimized quantum random-walk search algorithm is negative.
To address the issue of field size in random network coding, we propose an Improved Adaptive Random Convolutional Network Coding (IARCNC) algorithm that considerably reduces the amount of occupied memory. The operation of IARCNC is similar to that of Adaptive Random Convolutional Network Coding (ARCNC), with the coefficients of the local encoding kernels chosen uniformly at random over a small finite field. The difference is that the length of the local encoding kernels at the nodes used by IARCNC is constrained by the node depth; meanwhile, this length increases until all the related sink nodes can decode. This restriction makes the code length distribution more reasonable. Therefore, IARCNC retains the advantages of ARCNC, such as a small decoding delay and partial adaptation to an unknown topology without an early estimation of the field size. In addition, it has its own advantage: a greater reduction in memory use. A simulation and an example show the effectiveness of the proposed algorithm.
Polynomial-time randomized algorithms were constructed to approximately solve optimal robust performance controller design problems in a probabilistic sense, and a rigorous mathematical justification of the approach was given. The randomized algorithms here are based on a property from statistical learning theory known as (uniform) convergence of empirical means (UCEM). It is argued that, in order to assess the performance of a controller as the plant varies over a pre-specified family, it is better to use the average performance of the controller as the objective function to be optimized, rather than its worst-case performance. An example illustrates the efficiency of the approach.
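The empirical-mean idea can be illustrated with a deliberately simple, made-up scalar plant family and cost function: sample plants, compute each candidate controller's average cost over the samples, and pick the minimizer (the worst-case choice is shown alongside for contrast). None of this reproduces the paper's problem formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def cost(controller_gain, plant_gain):
    # Illustrative performance index: tracking-error proxy plus control effort.
    return (1.0 - plant_gain * controller_gain) ** 2 + 0.1 * controller_gain ** 2

# Sample the uncertain plant family (gain uniformly distributed, purely illustrative).
plants = rng.uniform(0.6, 1.4, size=2000)
candidates = np.linspace(0.1, 2.0, 60)              # candidate controller gains

# Empirical mean of the cost over the sampled plants, per candidate controller.
avg_cost = np.array([cost(c, plants).mean() for c in candidates])
worst_cost = np.array([cost(c, plants).max() for c in candidates])

print("gain minimizing average cost:    %.2f" % candidates[np.argmin(avg_cost)])
print("gain minimizing worst-case cost: %.2f" % candidates[np.argmin(worst_cost)])
```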
The order of the projections in the algebraic reconstruction technique (ART) method has a great influence on the rate of convergence. Although many scholars have studied the projection order, few theoretical proofs have been given. In 2009, Thomas Strohmer and Roman Vershynin introduced a randomized version of the Kaczmarz method for consistent, overdetermined linear systems and proved that its convergence rate does not depend on the number of equations in the system. In this paper, we apply this method to computed tomography (CT) image reconstruction and compare images generated by the sequential Kaczmarz method and the randomized Kaczmarz method. Experiments demonstrate the feasibility of the randomized Kaczmarz algorithm in CT image reconstruction and its exponential convergence.
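The randomized Kaczmarz iteration itself has a standard form (rows selected with probability proportional to their squared norm, followed by projection onto the selected hyperplane); below it is run on a small random consistent system standing in for a CT projection matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 400, 60                         # overdetermined, consistent system (CT stand-in)
A = rng.standard_normal((m, n))
x_true = rng.standard_normal(n)
b = A @ x_true

row_norms_sq = np.sum(A ** 2, axis=1)
probs = row_norms_sq / row_norms_sq.sum()   # Strohmer-Vershynin row-selection rule

x = np.zeros(n)
for k in range(5000):
    i = rng.choice(m, p=probs)
    # project the current iterate onto the hyperplane of equation i
    x += (b[i] - A[i] @ x) / row_norms_sq[i] * A[i]
    if k % 1000 == 0:
        print(f"iter {k:5d}  error {np.linalg.norm(x - x_true):.2e}")

print("final error:", np.linalg.norm(x - x_true))
```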
The staggered distribution of joints and fissures in space constitutes the weak part of any rock mass. The identification of rock mass structural planes and the extraction of their characteristic parameters are the basis of rock-mass integrity evaluation, which is very important for the analysis of slope stability. The laser scanning technique can acquire the coordinate information of each point of a structural plane, but the large amount of point cloud data, the uneven density distribution, and noise-point interference limit the efficiency and accuracy with which different types of structural planes can be identified by point cloud analysis techniques. A new point cloud identification and segmentation algorithm for rock mass structural surfaces is proposed. Based on the distribution of the original point cloud in different spatial neighborhoods, the points are characterized by multi-dimensional eigenvalues and processed with the robust randomized Hough transform (RRHT). The normal vector difference and the final eigenvalue are proposed for characteristic distinction, and the identification of rock mass structural surfaces is completed through region growing, which strengthens the expression of differences between point clouds. In addition, nearest-voxel downsampling is introduced into the RRHT calculation, which further reduces neighborhood noise sources and thereby improves the accuracy and stability of the calculation. The advantages of the method have been verified with laboratory models. The results showed that the proposed method can better achieve the segmentation and statistics of structural planes with interfaces and sharp boundaries. The method works well in the identification of joints, fissures, and other structural planes on the Mangshezhai slope in the Three Gorges Reservoir area, China. It provides a stable and effective technique for the identification and segmentation of rock mass structural planes, which is beneficial in engineering practice.
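The neighborhood eigenvalue step can be sketched with k-nearest-neighbor PCA: the covariance eigenvalues of each point's neighborhood give a local normal vector and a planarity measure. The point cloud below is synthetic, and the RRHT and region-growing stages of the paper are not reproduced.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
# Synthetic cloud: a noisy planar patch (a stand-in for a scanned structural plane).
u, v = rng.uniform(-1, 1, (2, 2000))
pts = np.column_stack([u, v, 0.3 * u + 0.1 * v]) + 0.01 * rng.standard_normal((2000, 3))

tree = cKDTree(pts)
k = 20
_, nbr_idx = tree.query(pts, k=k)          # k nearest neighbors of every point

normals = np.empty_like(pts)
planarity = np.empty(len(pts))
for i, idx in enumerate(nbr_idx):
    nbrs = pts[idx] - pts[idx].mean(axis=0)
    # eigen-decomposition of the local covariance (eigenvalues in ascending order)
    w, vecs = np.linalg.eigh(nbrs.T @ nbrs / k)
    normals[i] = vecs[:, 0]                # eigenvector of smallest eigenvalue = local normal
    planarity[i] = (w[1] - w[0]) / w[2]    # close to 1 for flat, plane-like neighborhoods

print("mean planarity:", planarity.mean())
print("mean |normal z-component|:", np.abs(normals[:, 2]).mean())
```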
This paper presents an improved Randomized Circle Detection (RCD) algorithm that uses a circularity measure to detect circles in images with complex backgrounds, and it is not based on the Hough Transform. The experimental results show that this algorithm can locate the circular marks on a Printed Circuit Board (PCB).
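A minimal sketch of the core RCD idea (sample three edge points, compute their circumcircle, and keep the circle supported by the most remaining points), run on synthetic 2D edge points; the paper's circularity test and image preprocessing are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic edge points: a circle of radius 40 centred at (100, 80) plus clutter.
theta = rng.uniform(0, 2 * np.pi, 200)
circle_pts = np.column_stack([100 + 40 * np.cos(theta), 80 + 40 * np.sin(theta)])
clutter = rng.uniform(0, 200, (300, 2))
pts = np.vstack([circle_pts + 0.5 * rng.standard_normal(circle_pts.shape), clutter])

def circumcircle(p1, p2, p3):
    # Centre c satisfies 2*(p2-p1)@c = |p2|^2-|p1|^2 and 2*(p3-p1)@c = |p3|^2-|p1|^2.
    A = 2 * np.array([p2 - p1, p3 - p1])
    b = np.array([p2 @ p2 - p1 @ p1, p3 @ p3 - p1 @ p1])
    if abs(np.linalg.det(A)) < 1e-9:
        return None                      # nearly collinear sample, reject
    c = np.linalg.solve(A, b)
    return c, np.linalg.norm(p1 - c)

best = (0, None)
for _ in range(500):                     # randomized sampling of point triples
    i, j, k = rng.choice(len(pts), size=3, replace=False)
    res = circumcircle(pts[i], pts[j], pts[k])
    if res is None:
        continue
    c, r = res
    inliers = np.sum(np.abs(np.linalg.norm(pts - c, axis=1) - r) < 1.5)
    if inliers > best[0]:
        best = (inliers, (c, r))

inliers, (c, r) = best
print(f"detected circle: centre=({c[0]:.1f}, {c[1]:.1f}) radius={r:.1f} inliers={inliers}")
```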
The traditional collaborative filtering recommendation technology has some shortcomings in a large-data environment. To solve this problem, a personalized recommendation method based on cloud computing technology is proposed. The large data set and the recommendation computation are decomposed into parallel tasks processed on multiple computers. A parallel recommendation engine based on the open-source Hadoop framework is established, and the effectiveness of the system is validated through learning recommendation on an English training platform. The experimental results show that the scalability of the recommender system can be greatly improved by using cloud computing technology to handle massive data in the cluster. On the basis of a comparison with traditional recommendation algorithms, and combining the advantages of cloud computing, a personalized recommendation system based on cloud computing is proposed.
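The decomposition idea can be sketched locally with a map step per user and a reduce step that merges partial results, using Python's multiprocessing as a stand-in for Hadoop workers and item co-occurrence counting as a stand-in for the collaborative filtering computation; the user data are made up.

```python
from collections import Counter
from itertools import combinations
from multiprocessing import Pool

# Toy user -> studied-item lists (made up for illustration).
USER_ITEMS = {
    "u1": ["grammar", "listening", "vocab"],
    "u2": ["grammar", "vocab", "writing"],
    "u3": ["listening", "speaking"],
    "u4": ["vocab", "writing", "speaking"],
}

def map_user(items):
    """Map step: emit co-occurrence counts for one user's item set."""
    counts = Counter()
    for a, b in combinations(sorted(set(items)), 2):
        counts[(a, b)] += 1
        counts[(b, a)] += 1
    return counts

def reduce_counts(partials):
    """Reduce step: merge per-user counters into a global co-occurrence table."""
    total = Counter()
    for c in partials:
        total.update(c)
    return total

if __name__ == "__main__":
    with Pool(processes=2) as pool:                  # stand-in for cluster workers
        partials = pool.map(map_user, USER_ITEMS.values())
    cooc = reduce_counts(partials)
    # Recommend for u3: items that co-occur most with what u3 already has.
    seen = set(USER_ITEMS["u3"])
    scores = Counter()
    for (a, b), n in cooc.items():
        if a in seen and b not in seen:
            scores[b] += n
    print("recommendations for u3:", scores.most_common())
```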
This paper introduces the principle of the genetic algorithm and the basic method of solving Markov random field parameters. Focusing on the shortcomings of present methods, a new method based on genetic algorithms is proposed to solve the parameters of the Markov random field. The detailed procedure is discussed. On the basis of the parameters solved by genetic algorithms, some experiments on the classification of aerial images are given. Experimental results show that the proposed method is effective and the classification results are satisfactory.