Under the scenario of dense targets in clutter, a multi-layer optimal data correlation algorithm is proposed. The algorithm eliminates a large number of false location points from the assignment process by rough correlation before the correlation cost is calculated, so it avoids the target state estimation and the correlation-cost computation for false correlation sets. Meanwhile, with these points eliminated in the rough correlation, the disturbance from false correlations in the assignment process is reduced, so the data correlation accuracy improves correspondingly. Complexity analyses of the new multi-layer optimal algorithm and the traditional optimal assignment algorithm are given. Simulation results show that the new algorithm is feasible and effective.
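The gate-before-assign idea can be sketched as follows. This is a minimal illustration, not the paper's algorithm: it uses a plain Euclidean distance gate and SciPy's optimal assignment solver, and the `gated_assignment` function, its `gate_radius` parameter, and the toy coordinates are all hypothetical.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def gated_assignment(tracks, measurements, gate_radius):
    """Associate measurements to tracks, discarding pairs that fail a coarse
    distance gate before running the (expensive) optimal assignment."""
    BIG = 1e9  # sentinel cost for gated-out (infeasible) pairs
    cost = np.full((len(tracks), len(measurements)), BIG)
    for i, t in enumerate(tracks):
        for j, m in enumerate(measurements):
            d = np.linalg.norm(np.asarray(t, float) - np.asarray(m, float))
            if d <= gate_radius:   # rough correlation: keep only nearby pairs
                cost[i, j] = d     # the fine cost is computed only for survivors
    rows, cols = linear_sum_assignment(cost)
    # drop any assignment that was forced onto a gated-out pair
    return [(int(i), int(j)) for i, j in zip(rows, cols) if cost[i, j] < BIG]

pairs = gated_assignment([(0, 0), (10, 10)],
                         [(0.5, 0.2), (9.6, 10.3), (50, 50)],
                         gate_radius=2.0)  # the far point (50, 50) is rejected
```

Because the fine cost is only evaluated for pairs that survive the gate, the expensive per-pair work scales with the number of plausible correlations rather than with all track-measurement pairs.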
To address the problem that data-driven automatic correlation methods are difficult to adapt to oil-bearing strata with large lateral changes in sedimentary facies and strata thickness, an intelligent, pattern-constrained automatic correlation method for oil-bearing strata is developed. We introduce knowledge-driven constraints into the automatic correlation of oil-bearing strata: stratigraphic sedimentary patterns constrain the correlation process, and an improved similarity measure together with a conditional-constraint dynamic time warping algorithm automates the correlation of marker layers and stratum interfaces. Application in the Shishen 100 block of the Shinan Oilfield, Bohai Bay Basin, shows that the coincidence rate of the marker layers identified by this method exceeds 95.00%, and the average coincidence rate of identified oil-bearing strata reaches 90.02% against manual correlation results, about 17 percentage points higher than that of existing automatic correlation methods. The accuracy of automatic correlation of oil-bearing strata is thus effectively improved.
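The alignment core can be pictured with a band-constrained dynamic time warping. The sketch below is a stand-in for the paper's conditional-constraint DTW: the `banded_dtw` function and the Sakoe-Chiba-style band (a crude proxy for geological constraints on the warping path) are illustrative assumptions.

```python
import numpy as np

def banded_dtw(a, b, band=3):
    """DTW distance under a Sakoe-Chiba band: warping paths may stray at most
    `band` samples from the diagonal, a crude stand-in for pattern
    constraints on the alignment of two well-log curves."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(max(1, i - band), min(m, i + band) + 1):
            step = abs(a[i - 1] - b[j - 1])
            D[i, j] = step + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])

# identical log curves align with zero cost; a shifted copy costs more
flat = banded_dtw([0.0, 1.0, 2.0, 3.0], [0.0, 1.0, 2.0, 3.0])
shifted = banded_dtw([0.0, 1.0, 2.0, 3.0], [1.0, 2.0, 3.0, 4.0])
```

Tightening the band forces the alignment to respect the expected stratigraphic ordering, which is the spirit of the conditional constraint described in the abstract.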
The safety factor is a crucial quantitative index for evaluating slope stability. However, traditional calculation methods suffer from unreasonable assumptions, complex soil composition, and inadequate consideration of the influencing factors, leading to large errors. Therefore, a stacking ensemble learning model (stacking-SSAOP) based on multi-layer regression algorithm fusion and optimized by the sparrow search algorithm is proposed for predicting the slope safety factor. Density, cohesion, friction angle, slope angle, slope height, and pore pressure ratio are selected as characteristic parameters from 210 sets of slope sample data. Random Forest, Extra Trees, AdaBoost, Bagging, and Support Vector regression serve as the base models (inner loop) in the first-level regression layer, and XGBoost serves as the meta-model (outer loop) in the second-level regression layer, completing the stacked learning model and improving prediction accuracy. The sparrow search algorithm optimizes the hyperparameters of these six regression models and corrects the over- and underfitting of the single regression models to further improve prediction accuracy. The mean square error (MSE) between predicted and true values and the fit to the data are compared and analyzed. The MSE of the stacking-SSAOP model is smaller than that of any single regression model (MSE = 0.03917), so it has higher prediction accuracy and better data fitting. This study innovatively applies the sparrow search algorithm to slope safety factor prediction, showcasing its advantages over traditional methods. The proposed stacking-SSAOP model integrates multiple regression algorithms to enhance prediction accuracy; it not only refines the prediction of the slope safety factor but also offers a fresh approach to handling intricate soil compositions and other influencing factors, making it a precise and reliable method for slope stability evaluation. This research supports the modernization and digitalization of slope safety assessments.
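A two-level stacking model in this spirit can be assembled with scikit-learn. This sketch is not the authors' implementation: it swaps in `GradientBoostingRegressor` as a stand-in for the XGBoost meta-model, uses synthetic data in place of the 210 slope samples, and omits the sparrow search hyperparameter optimization entirely.

```python
import numpy as np
from sklearn.ensemble import (RandomForestRegressor, ExtraTreesRegressor,
                              AdaBoostRegressor, BaggingRegressor,
                              GradientBoostingRegressor, StackingRegressor)
from sklearn.svm import SVR

# synthetic stand-in for the slope dataset: columns mimic density, cohesion,
# friction angle, slope angle, slope height, pore pressure ratio
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(210, 6))
y = X @ np.array([0.2, 0.5, 0.4, -0.3, -0.2, -0.4]) + 1.0  # mock safety factor

base = [("rf", RandomForestRegressor(n_estimators=50, random_state=0)),
        ("et", ExtraTreesRegressor(n_estimators=50, random_state=0)),
        ("ada", AdaBoostRegressor(random_state=0)),
        ("bag", BaggingRegressor(random_state=0)),
        ("svr", SVR())]
# first level: five base regressors; second level: a boosted-tree meta-model
# trained on their cross-validated predictions
model = StackingRegressor(estimators=base,
                          final_estimator=GradientBoostingRegressor(random_state=0),
                          cv=5)
model.fit(X, y)
pred = model.predict(X[:5])
```

The `cv=5` argument makes the meta-model learn from out-of-fold base predictions, which is what protects a stack against simply memorizing the base models' training fit.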
Improved picture quality is critical to the effectiveness of object recognition and tracking. The consistency of such images is degraded in night-video systems by the contrast between high-profile items and by different atmospheric conditions such as mist, fog and dust; the pictures then shift in intensity, colour, polarity and consistency. A general challenge for computer vision lies in the poor appearance of night images under arbitrary illumination and ambient environments. In recent years, with the exponential growth of computing performance, target recognition techniques based on deep learning and machine learning have become standard algorithms for object detection. However, identifying objects at night poses further problems because of the distorted backdrop and dim light. The correlation-aware LSTM-based YOLO (You Only Look Once) classifier method for exact object recognition and determination of object properties under night vision was a major inspiration for this work. To create virtual target sets similar to daytime environments, we employ night images as inputs, obtain enhanced images using histogram-based enhancement, and remove noise with an iterative Wiener filter. Feature extraction and feature selection elect the potential features using adaptive internal linear embedding (AILE) and uplift linear discriminant analysis (ULDA). The region-of-interest mask is segmented using recurrent phase-level set segmentation. Finally, we use deep convolutional feature fusion and region-of-interest pooling to integrate a long short-term memory (LSTM) network with the YOLO method for object tracking. A range of experimental findings demonstrate that our technique achieves high average accuracy, with a precision of 99.7% for object detection on the SSAN dataset, considerably higher than other standard object detection mechanisms. Our approach may therefore satisfy the real demands of night-scene target detection applications, and we believe it will aid future research.
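The histogram-based enhancement step could resemble plain histogram equalization; the sketch below is a minimal illustration under that assumption (the `equalize_hist` helper and the toy low-contrast patch are invented, and the iterative Wiener filtering stage is not shown).

```python
import numpy as np

def equalize_hist(img):
    """Histogram equalization of a uint8 grayscale image: remap intensities so
    that their cumulative distribution becomes roughly uniform."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]                      # first occupied grey level
    lut = np.clip(np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255.0),
                  0, 255).astype(np.uint8)         # intensity look-up table
    return lut[img]

# a low-contrast "night" patch using only grey levels 0..50
dark = np.clip(np.arange(64, dtype=np.uint8).reshape(8, 8), 0, 50)
out = equalize_hist(dark)   # stretched to span the full 0..255 range
```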
Traditional distribution network planning relies on the professional knowledge of planners, especially when analyzing the correlations between the problems existing in the network and the crucial influencing factors. The inherent laws reflected by the historical data of the distribution network are ignored, which affects the objectivity of the planning scheme. In this study, to improve the efficiency and accuracy of distribution network planning, the characteristics of distribution network data were extracted using a data-mining technique, and correlation knowledge of existing problems in the network was obtained. A data-mining model based on correlation rules was established. The inputs of the model were the electrical characteristic indices screened using the grey correlation method. The Apriori algorithm was used to extract correlation knowledge from the operational data of the distribution network and obtain strong correlation rules. Lift (degree of promotion) and chi-square tests were used to verify the rationality of the strong correlation rules output by the model. The correlation between heavy-load or overload problems of distribution network feeders in different regions and the related characteristic indices was determined, and the confidence of the correlation rules was obtained. These results can provide an effective basis for formulating a distribution network planning scheme.
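The rule-quality measures involved, support, confidence, and lift, can be computed directly. The toy sketch below uses mock feeder "transactions"; the item names are invented for illustration.

```python
# support, confidence and lift for one candidate rule
# {heavy_load} -> {high_density}, over mock feeder records
transactions = [
    {"heavy_load", "high_density"},
    {"heavy_load", "high_density"},
    {"heavy_load"},
    {"high_density"},
    {"normal"},
]
n = len(transactions)
sup_a = sum("heavy_load" in t for t in transactions) / n                     # P(A)
sup_b = sum("high_density" in t for t in transactions) / n                   # P(B)
sup_ab = sum({"heavy_load", "high_density"} <= t for t in transactions) / n  # P(A,B)
confidence = sup_ab / sup_a       # P(B | A)
lift = confidence / sup_b         # lift > 1: the antecedent promotes the consequent
```

Apriori first finds itemsets whose support clears a threshold; only then are confidence and lift evaluated on the surviving candidate rules, which is why a lift (promotion) check can be applied as a final rationality filter.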
The complexity of the actual operating environment of EMU trains and the interaction between the reliability of system components pose a huge challenge for the maintenance scheduling of EMU trains. In response, the evolution of reliability and failure rate under the influence of environmental factors, failure correlations and economic correlations is analyzed. The bogie systems are assumed to form the EMU train in series, and their failure correlation matrix is modeled. With the lowest total maintenance cost as the optimization objective, a decision-making model for EMU train maintenance is established, and a dynamic maintenance strategy that can improve maintenance plans efficiently is proposed for it. An artificial bee colony algorithm is applied to further iteratively optimize the threshold parameters of the strategy. The results are calculated and verified on a numerical example and show the effectiveness of the maintenance decision model; in that example, the proposed dynamic maintenance strategy outperforms the traditional opportunistic maintenance strategy.
We investigate the correlations between two qubits in the Grover search algorithm with arbitrary initial states by numerical simulation. Using a set of suitable bases, we construct the reduced density matrix and give the numerical expression of the correlations as functions of the iterations. For different initial states, we obtain the concurrence and quantum discord and compare them with the success probability of the algorithm. The results show that the initial states affect the correlations and the limit point of the correlations in the searching process; however, they do not influence the overall cyclical trend.
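For a two-qubit pure state the concurrence has the closed form C = 2|ad - bc| in the computational basis, which is the kind of quantity tracked above; a small numeric check (the example states are standard textbook states, not taken from the paper):

```python
import numpy as np

def concurrence(state):
    """Concurrence of a two-qubit pure state with amplitudes ordered
    |00>, |01>, |10>, |11>: C = 2 |a*d - b*c|."""
    a, b, c, d = state
    return abs(2 * (a * d - b * c))

bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)  # maximally entangled: C = 1
product = np.array([1.0, 0.0, 0.0, 0.0])            # separable: C = 0
```

For the mixed reduced states that arise mid-algorithm, the general Wootters formula over the spin-flipped density matrix would be needed instead; the pure-state form above is the simplest special case.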
The digital speckle correlation method is an important optical metrology technique for surface displacement and strain measurement. With it, whole-field deformation information can be obtained by tracking geometric points on the speckle images via a correlation-matching search. However, general search techniques suffer from great computational complexity when processing speckle images with large deformation, and from large random errors when processing poor-quality images. In this paper, an advanced correlation-matching search approach based on genetic algorithms (GA) is developed. Benefiting from the global-optimum and parallel search abilities of GA, the new approach completes the correlation-matching search with less computation and at high accuracy. Two experiments on simulated speckle images prove the efficiency of the new approach.
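The matching criterion such a search maximizes is typically a zero-normalized cross-correlation between a reference subset and a candidate subset. The `zncc` helper and the toy subsets below are illustrative; the GA search itself is not shown.

```python
import numpy as np

def zncc(f, g):
    """Zero-normalised cross-correlation between a reference subset and a
    deformed subset: invariant to affine intensity changes, equal to 1 at a
    perfect match and near 0 for unrelated patterns."""
    f = f - f.mean()
    g = g - g.mean()
    return float((f * g).sum() / np.sqrt((f ** 2).sum() * (g ** 2).sum()))

ref = np.array([[1.0, 2.0], [3.0, 4.0]])
same = 2.0 * ref + 5.0        # brightness/contrast change only
score = zncc(ref, same)       # still a perfect match under ZNCC
```

A GA treats the subset's candidate displacement (and possibly deformation) parameters as a chromosome and uses this score as the fitness, which is what lets it search large deformations without exhaustively scanning every pixel offset.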
In this work, Kendall correlation based collaborative filtering algorithms for recommender systems are proposed. The Kendall correlation method measures the correlation among users by considering the relative order of their ratings. The Kendall-based algorithm rests on a more general model and could thus be more widely applied in e-commerce. Another finding of this work is that considering only positively correlated neighbors in prediction, in both the Pearson and Kendall algorithms, achieves higher accuracy than considering all neighbors, at only a small loss of coverage.
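User similarity by relative rating order can be computed with `scipy.stats.kendalltau`; a toy sketch of the positive-neighbour filtering (the user names and ratings are invented):

```python
from scipy.stats import kendalltau

# ratings of three users over five common items; Kendall tau depends only on
# the relative order of the ratings, not on their scale
alice = [5, 4, 3, 2, 1]
bob   = [9, 8, 7, 6, 5]    # different scale, identical ranking -> tau = +1
carol = [1, 2, 3, 4, 5]    # reversed ranking -> tau = -1

tau_ab, _ = kendalltau(alice, bob)
tau_ac, _ = kendalltau(alice, carol)
# keep only positively correlated users as prediction neighbours
neighbours = [name for name, tau in [("bob", tau_ab), ("carol", tau_ac)] if tau > 0]
```

Note that Pearson correlation would rate `bob` identically here, but Kendall stays meaningful when users rate on incompatible or non-linear scales, which is the generality the abstract refers to.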
A predictive search algorithm to estimate the size and direction of displacement vectors is presented. The algorithm decreases the time needed to calculate the displacement of each pixel. In addition, an updating-reference-image scheme refreshes the reference image and reduces the computation time when the displacement exceeds a certain value. In this way, the search range and computational complexity are cut down, and less memory is occupied. The capability of the proposed search algorithm is verified by both computer simulation and experiments; the results show that it improves the efficiency of the correlation method and satisfies the accuracy requirements of practical displacement measurement.
When the chaotic characteristics of manufacturing quality level are studied, chaotic methods are impractical because of the low speed of calculating the correlation integral. The original algorithm for calculating the correlation integral is studied after a computer hardware upgrade, with the conclusion that the calculation can be sped up only by improving the algorithm. This is accomplished by changing the original algorithm, in which a single distance-threshold-related correlation integral is obtained from one traversal of all distances between different vectors, into a high-efficiency algorithm in which all of the distance-threshold-related correlation integrals are obtained from one traversal of those distances. For a time series with 3000 data points, the high-efficiency algorithm offers a 3.7-fold speed-up over the original algorithm. Further study leads to a super-high-efficiency algorithm: whereas the original and high-efficiency algorithms execute the add-one operation of the Heaviside function n times, the super-high-efficiency algorithm executes it only once. This increases the calculation speed by up to 109 times compared with the high-efficiency algorithm and by approximately 404 times compared with the original algorithm, making the calculation fast enough for practical use of the chaotic method.
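The one-traversal idea can be sketched with NumPy: each pairwise distance performs a single "add-one" into a threshold bucket, and a cumulative sum then yields every correlation integral at once. The `correlation_integrals` helper is an illustrative reconstruction, not the authors' code, and assumes the thresholds are sorted ascending.

```python
import numpy as np

def correlation_integrals(vectors, thresholds):
    """All distance-threshold correlation integrals C(r) from ONE traversal of
    the pairwise distances: each distance does a single add-one into a bucket,
    and a cumulative sum then covers every threshold simultaneously."""
    v = np.asarray(vectors, dtype=float)
    r = np.asarray(thresholds, dtype=float)     # must be sorted ascending
    n = len(v)
    iu, ju = np.triu_indices(n, k=1)
    d = np.linalg.norm(v[iu] - v[ju], axis=1)   # every pairwise distance, once
    idx = np.searchsorted(r, d, side="left")    # smallest threshold with d <= r
    hist = np.bincount(idx[idx < len(r)], minlength=len(r))  # one add-one each
    counts = np.cumsum(hist)                    # Heaviside counts for all r
    return 2.0 * counts / (n * (n - 1))

C = correlation_integrals([[0.0], [1.0], [2.0]], thresholds=[0.5, 1.5, 2.5])
```

The per-threshold Heaviside test of the naive algorithm is replaced by one `searchsorted` per distance plus a single cumulative sum, which mirrors the "add-one executed only once" trick described above.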
A differential evolution algorithm based on covariance matrix learning can adjust the coordinate system according to the characteristics of the population, which makes the search move in a more favorable direction. To obtain more accurate information about the function shape, this paper proposes a correlation-based covariance matrix learning differential evolution algorithm (denoted RCLDE) to improve search efficiency. First, a hybrid mutation strategy is designed to balance the diversity and convergence of the population; second, the covariance learning matrix is constructed by selecting the individuals with the least correlation; then, a comprehensive learning mechanism is designed from two covariance matrix learning mechanisms based on the principle of probability. Finally, the algorithm is tested on the CEC2005 benchmark, and the experimental results are compared with other effective differential evolution algorithms. The results show that the proposed algorithm is effective.
The global look-up table strategy proposed recently has been proven to be an efficient way to accelerate interpolation, which is the most time-consuming part of iterative sub-pixel digital image correlation (DIC) algorithms. In this paper, a global look-up table strategy with cubic B-spline interpolation is developed for the DIC method based on the inverse compositional Gauss-Newton (IC-GN) algorithm. The performance of this strategy, including accuracy, precision, and computational efficiency, is evaluated through a theoretical and experimental study, using the widely employed bicubic interpolation as a benchmark. The global look-up table strategy with cubic B-spline interpolation significantly improves the accuracy of the IC-GN-based DIC method compared with bicubic interpolation, at a trivial cost in computational efficiency.
In mining and construction projects, blasting is frequently applied to break or move hard rock of high strength using explosive energy. However, the use of explosives may lead to the flyrock phenomenon. Flyrock can damage structures or nearby equipment in the surrounding areas and harm humans, especially workers on site, so its prediction is of high importance. In this investigation, the flyrock distance induced by blasting was examined and estimated with five artificial intelligence algorithms. One hundred and fifty-two blasting events in three open-pit granite mines in Johor, Malaysia, were monitored to collect field data, comprising blasting parameters and rock mass properties. The rock mass properties are the site-specific weathering index (WI), geological strength index (GSI) and rock quality designation (RQD). Multi-layer perceptron (MLP), random forest (RF), support vector machine (SVM), and hybrid models, namely a Harris Hawks optimization-based MLP (HHO-MLP) and a whale optimization algorithm-based MLP (WOA-MLP), were developed. Model performance was assessed through various indices, including the a10-index, coefficient of determination (R^2), root mean squared error (RMSE), mean absolute percentage error (MAPE), variance accounted for (VAF), and root squared error (RSE). The a10-index values for MLP, RF, SVM, HHO-MLP and WOA-MLP are 0.953, 0.933, 0.937, 0.991 and 0.972, respectively. HHO-MLP achieved the best performance among the five machine learning (ML) models, with R^2 = 0.998.
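The a10-index is commonly defined as the fraction of predictions falling within plus or minus 10% of the measured value; the sketch below computes it under that assumption (the measured and predicted numbers are invented, not taken from the study).

```python
import numpy as np

def a10_index(measured, predicted):
    """Fraction of predictions whose ratio to the measured value lies in
    [0.9, 1.1], i.e. within +/-10 percent of the true value."""
    measured = np.asarray(measured, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    ratio = predicted / measured
    return float(np.mean((ratio >= 0.9) & (ratio <= 1.1)))

score = a10_index([100.0, 200.0, 150.0, 80.0],
                  [105.0, 190.0, 180.0, 79.0])  # 180/150 = 1.2 falls outside
```

Unlike R^2 or RMSE, this index is insensitive to a few large outliers and reads directly as "what share of shots were predicted to within 10%", which is why it is popular for engineering-distance models such as flyrock.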
Finding effective cancer treatment is a challenge because the sensitivity of a cancer stems both from intrinsic cellular properties and from resistances acquired through prior treatment. Previous research has revealed individual protein markers that are significant for chemosensitivity prediction. Our goal is to find correlated protein markers that are collectively significant for chemosensitivity prediction, complementing the individual markers already reported. To do this, we used the D' correlation measure to study feature-selection correlations for chemosensitivity prediction of 118 anticancer agents with putatively known mechanisms of action. Three datasets on the NCI-60 were utilized in this study: two protein datasets, one previously studied for chemosensitivity prediction and another novel to this topic, and one DNA copy number dataset. To validate our approach, we identified the protein markers that our analysis found strongly correlated with the individual protein markers reported in previous studies. Our feature analysis discovered highly correlated protein marker pairs, from which we found individual protein markers with medical significance. While some of the markers uncovered were consistent with those previously reported, others are original to this work. Using these marker pairs, we further correlated the cellular functions associated with them. As an exploratory analysis, we discovered feature-selection correlation patterns between and within different drug mechanisms of action for each of our datasets. In conclusion, the highly correlated protein marker pairs and their functions found by our feature analysis are validated by previous studies and shown to be medically significant, demonstrating for the first time that D' is an effective measure of correlation in the context of feature selection.
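If D' here follows the usual normalized form from linkage-disequilibrium analysis, D = P(AB) - P(A)P(B) scaled by the largest magnitude D could take given the marginals, it can be computed as below. This interpretation is an assumption on our part, and the probabilities are invented for illustration.

```python
def d_prime(p_a, p_b, p_ab):
    """Normalised D' for two binary features: D = P(AB) - P(A)P(B), divided
    by the maximum magnitude D can attain given the marginal probabilities,
    so the result lies in [-1, 1]."""
    d = p_ab - p_a * p_b
    if d >= 0:
        d_max = min(p_a * (1 - p_b), (1 - p_a) * p_b)
    else:
        d_max = min(p_a * p_b, (1 - p_a) * (1 - p_b))
    return 0.0 if d_max == 0 else d / d_max

perfect = d_prime(0.4, 0.4, 0.4)        # the two markers always co-occur
independent = d_prime(0.5, 0.5, 0.25)   # P(AB) = P(A)P(B)
```

The normalization is what makes D' comparable across marker pairs with very different selection frequencies, which matters when ranking feature-selection correlations.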
In the early exploration of many oilfields, low-resistivity low-contrast (LRLC) pay zones are easily overlooked because their resistivity is similar to that of water zones. Existing identification methods are model-driven and cannot yield satisfactory results when the causes of LRLC pay zones are complicated. In this study, after analyzing a large number of core samples, the main causes of LRLC pay zones in the study area are discerned: complex distribution of formation water salinity, high irreducible water saturation due to micropores, and high shale volume. Moreover, different oil-testing layers may have different causes of LRLC pay zones. Therefore, in addition to the well-log data of the oil-testing layers, well-log data of adjacent shale layers are added to the original dataset as reference data. The density-based spatial clustering algorithm with noise (DBSCAN) is used to cluster the original dataset into 49 clusters, and the dataset is ultimately projected into a 49-dimensional feature space. The new dataset and the oil-testing results are treated as input and output, respectively, to train a multi-layer perceptron (MLP). A total of 3192 samples are used for stratified 8-fold cross-validation, and the accuracy of the MLP is found to be 85.53%.
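The cluster-then-classify pipeline can be sketched with scikit-learn. Everything below is illustrative: the blob data stands in for the well-log samples, the mock oil-test labels are invented, and where the paper projects into 49 cluster dimensions, this toy DBSCAN finds only two.

```python
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
# two well-separated blobs of mock well-log feature vectors
logs = np.vstack([rng.normal(0.0, 0.3, size=(150, 4)),
                  rng.normal(3.0, 0.3, size=(150, 4))])
labels = DBSCAN(eps=1.0, min_samples=5).fit_predict(logs)
k = int(labels.max()) + 1            # number of clusters found

# project each sample into a k-dimensional one-hot "cluster feature" space
features = np.zeros((len(logs), k))
clustered = labels >= 0              # label -1 marks DBSCAN noise points
features[np.where(clustered)[0], labels[clustered]] = 1.0

oil = np.r_[np.ones(150), np.zeros(150)].astype(int)  # mock oil-test outcome
clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=500, random_state=0)
clf.fit(features, oil)               # MLP trained on the cluster features
```

The clustering acts as an unsupervised re-encoding of the logs; the MLP then learns the mapping from cluster membership to the oil-test outcome.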
Funding: This project was supported by the National Natural Science Foundation of China (60672139, 60672140), the Excellent Ph.D. Paper Author Foundation of China (200237), and the Natural Science Foundation of Shandong (2005ZX01).
Funding: Supported by the National Natural Science Foundation of China (42272110) and the CNPC-China University of Petroleum (Beijing) Strategic Cooperation Project (ZLZX2020-02).
Funding: Supported by the Basic Research Special Plan of the Yunnan Provincial Department of Science and Technology, General Project (Grant No. 202101AT070094).
Funding: Supported by the Science and Technology Project of China Southern Power Grid (GZHKJXM20210043-080041KK52210002).
Funding: Sponsored by the National Natural Science Foundation of China (Grant No. 72061022).
Abstract: The complexity of the actual operating environment of EMU trains and the interaction between the reliability of system components pose a huge challenge for the maintenance scheduling of EMU trains. In response to these problems, the evolution of reliability and failure rate under the influence of environmental factors, failure correlations and economic correlations is analyzed. We assume the bogie systems form the EMU train in series, and the failure correlation matrix of the bogie systems is modeled. With the lowest total maintenance cost as the optimization objective, a decision-making model for EMU train maintenance is established. A dynamic maintenance strategy is proposed for the model, which can improve maintenance plans efficiently. An artificial bee colony algorithm is applied to further iteratively optimize the threshold parameters in the strategy. The results are calculated and verified by a numerical example, which shows the effectiveness of the maintenance decision model. The dynamic maintenance strategy in this paper is compared with the traditional opportunistic maintenance strategy and outperforms it in the numerical example.
Funding: Project supported by the National Natural Science Foundation of China (Grant Nos. 11975132 and 61772295), the Natural Science Foundation of Shandong Province, China (Grant No. ZR2019YQ01), and the Shandong Province Higher Educational Science and Technology Program, China (Grant No. J18KZ012).
Abstract: We investigate the correlations between two qubits in the Grover search algorithm with arbitrary initial states by numerical simulation. Using a set of suitable bases, we construct the reduced density matrix and give the numerical expression of the correlations as a function of the iterations. For different initial states, we obtain the concurrence and quantum discord compared with the success probability in the algorithm. The results show that the initial states affect the correlations and the limit point of the correlations in the searching process. However, the initial states do not influence the overall cyclical trend.
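A two-qubit instance of the quantities discussed above can be reproduced in a few lines: one Grover iteration (phase oracle plus diffusion) on real amplitudes, with the concurrence of the pure state read off as 2|ad − bc|. This is a minimal illustration starting from the uniform state, not the paper's full simulation over arbitrary initial states and quantum discord:

```python
def oracle(state, marked):
    """Phase oracle: flip the sign of the marked basis amplitude."""
    return [-a if i == marked else a for i, a in enumerate(state)]

def diffusion(state):
    """Inversion about the mean (the Grover diffusion operator)."""
    mean = sum(state) / len(state)
    return [2 * mean - a for a in state]

def concurrence(state):
    """Concurrence of a pure two-qubit state a|00>+b|01>+c|10>+d|11>."""
    a, b, c, d = state
    return abs(2 * (a * d - b * c))

# One Grover iteration on the uniform two-qubit state, marking |11>:
state = [0.5, 0.5, 0.5, 0.5]      # product state, concurrence 0
after_oracle = oracle(state, 3)   # maximally entangled mid-iteration
final = diffusion(after_oracle)   # success probability is |final[3]|^2
```

For two qubits a single iteration already succeeds with probability 1, and the entanglement peaks between the oracle and the diffusion, then returns to zero, a small-scale version of the cyclical trend the study reports.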
Funding: Project supported by the National Natural Science Foundation of China (No. 19772033) and the Research Innovation Fund of Tsinghua University for Ph.D. Candidates (No. 092410048).
Abstract: The digital speckle correlation method is an important optical metrology technique for surface displacement and strain measurement. With this technique, whole-field deformation information can be obtained by tracking geometric points on the speckle images using a correlation-matching search. However, general search techniques suffer from great computational complexity when processing speckle images with large deformation, and from large random errors when processing images of bad quality. In this paper, an advanced approach to correlation-matching search based on genetic algorithms (GAs) is developed. Benefiting from the global optimization and parallel search abilities of GAs, this new approach can complete the correlation-matching search with less computational cost and at high accuracy. Two experimental results from simulated speckle images have proved the efficiency of the new approach.
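The GA-based correlation matching can be sketched in one dimension: a reference template is matched against a deformed signal by evolving a population of candidate integer shifts toward the one maximizing zero-normalized cross-correlation (NCC). The GA details (elitist selection, ±1 mutation) and all names here are our illustrative choices, not the paper's operators, and a real DIC search would evolve 2-D subset parameters:

```python
import math
import random

def ncc(f, g):
    """Zero-normalized cross-correlation of two equal-length windows."""
    n = len(f)
    mf, mg = sum(f) / n, sum(g) / n
    num = sum((a - mf) * (b - mg) for a, b in zip(f, g))
    den = math.sqrt(sum((a - mf) ** 2 for a in f)
                    * sum((b - mg) ** 2 for b in g))
    return num / den if den else 0.0

def ga_match(template, deformed, start, max_shift,
             pop_size=16, gens=25, seed=7):
    """GA search for the integer shift maximizing NCC between a reference
    template and the deformed signal (1-D stand-in for speckle subsets)."""
    rng = random.Random(seed)
    w = len(template)

    def fitness(s):
        lo = start + s
        if lo < 0:
            return -1.0
        window = deformed[lo:lo + w]
        return ncc(template, window) if len(window) == w else -1.0

    population = [rng.randint(-max_shift, max_shift) for _ in range(pop_size)]
    for _ in range(gens):
        population.sort(key=fitness, reverse=True)
        elite = population[:pop_size // 2]           # elitist selection
        children = [max(-max_shift, min(max_shift, s + rng.choice([-1, 0, 1])))
                    for s in elite]                  # +-1 mutation
        population = elite + children
    return max(population, key=fitness)
```

Because the whole population probes the search space at once, a good shift is usually found without exhaustively scanning every candidate, which is the cost saving the paper attributes to the GA.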
Funding: Supported by the National Natural Science Foundation of China (60573095).
Abstract: In this work, Kendall correlation based collaborative filtering algorithms for recommender systems are proposed. The Kendall correlation method measures the correlation among users by considering the relative order of their ratings. The Kendall-based algorithm rests upon a more general model and thus can be more widely applied in e-commerce. Another discovery of this work is that considering only positively correlated neighbors in prediction, in both the Pearson and Kendall algorithms, achieves higher accuracy than considering all neighbors, with only a small loss of coverage.
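The two ideas above — rank-based similarity and keeping only positively correlated neighbors — can be sketched as follows. The Kendall tau is the standard concordant-minus-discordant form; the mean-centered weighted prediction is the classic neighborhood-CF scheme, assumed here rather than taken from the paper, and the function names are ours:

```python
def kendall_tau(xs, ys):
    """Kendall rank correlation over the co-rated item pairs."""
    concordant = discordant = 0
    n = len(xs)
    for i in range(n):
        for j in range(i + 1, n):
            s = (xs[i] - xs[j]) * (ys[i] - ys[j])
            if s > 0:
                concordant += 1
            elif s < 0:
                discordant += 1
    pairs = n * (n - 1) / 2
    return (concordant - discordant) / pairs if pairs else 0.0

def predict(target_ratings, neighbors, item):
    """Predict the target user's rating of `item` from positively
    correlated neighbors only (each neighbor is a dict item->rating)."""
    mean_t = sum(target_ratings.values()) / len(target_ratings)
    num = den = 0.0
    for nb in neighbors:
        common = sorted(set(target_ratings) & set(nb) - {item})
        if len(common) < 2 or item not in nb:
            continue
        tau = kendall_tau([target_ratings[i] for i in common],
                          [nb[i] for i in common])
        if tau <= 0:   # drop non-positively-correlated neighbors
            continue
        mean_n = sum(nb.values()) / len(nb)
        num += tau * (nb[item] - mean_n)
        den += tau
    return mean_t + num / den if den else mean_t
```

Because tau only looks at the relative order of ratings, users who rate on different numeric scales but rank items alike still come out as similar, which is the generality claim made above.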
Abstract: A predictive search algorithm to estimate the size and direction of displacement vectors is presented. The algorithm decreases the time needed to calculate the displacement of each pixel. In addition, an updating reference image scheme is used to refresh the reference image and to reduce the computation time when the displacement is larger than a certain value. In this way, the search range and computational complexity are cut down, and less memory is occupied. The capability of the proposed search algorithm was verified by the results of both computer simulation and experiments. The results showed that the algorithm can improve the efficiency of the correlation method and satisfy the accuracy requirement of practical displacement measuring.
Abstract: When the chaotic characteristics of manufacturing quality level are studied, chaotic methods are impractical because of the low speed of calculating the correlation integral. The original algorithm used to calculate the correlation integral is studied here after a computer hardware upgrade; the conclusion is that the calculation can be sped up only by improving the algorithm. This is accomplished by changing the original algorithm, in which a single distance-threshold-related correlation integral is obtained from one traversal of all distances between different vectors, into a high-efficiency algorithm in which all of the distance-threshold-related correlation integrals are obtained from one traversal of all of the distances between different vectors. For a time series with 3000 data points, this high-efficiency algorithm offers a 3.7-fold increase in speed over the original algorithm. Further study of the high-efficiency algorithm leads to a super-high-efficiency algorithm, which changes the original and high-efficiency algorithms, in which the add-one operation of the Heaviside function is executed n times, so that the add-one operation is executed only once. The super-high-efficiency algorithm increases the calculation speed by up to 109 times compared with the high-efficiency algorithm and by approximately 404 times compared with the original algorithm. The calculation speed of the super-high-efficiency algorithm is suitable for practical use with the chaotic method.
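The high-efficiency idea — all thresholds served by one traversal of the pairwise distances — can be sketched like this. The Chebyshev (maximum-coordinate) distance is the usual choice for correlation integrals, and sorting plus binary search stands in for the paper's counting scheme; both are our illustrative choices:

```python
from bisect import bisect_right

def correlation_integrals(vectors, thresholds):
    """All threshold-related correlation integrals C(r) from ONE traversal
    of the pairwise distances: collect every pairwise distance once, sort,
    then read each C(r) off the sorted list instead of re-scanning the
    distances for every threshold r."""
    dists = []
    n = len(vectors)
    for i in range(n):
        for j in range(i + 1, n):
            dists.append(max(abs(a - b)
                             for a, b in zip(vectors[i], vectors[j])))
    dists.sort()
    pairs = len(dists)
    # C(r) = fraction of vector pairs closer than the threshold r
    return [bisect_right(dists, r) / pairs for r in thresholds]
```

The original algorithm repeats the O(n²) distance traversal for every threshold; here that traversal happens once and each additional threshold costs only a binary search, which is the source of the speed-up described above.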
Abstract: A differential evolution algorithm based on covariance matrix learning can adjust the coordinate system according to the characteristics of the population, which makes the search move in a more favorable direction. In order to obtain more accurate information about the function shape, this paper proposes a correlation-based covariance matrix learning differential evolution algorithm (denoted RCLDE) to improve the search efficiency of the algorithm. First, a hybrid mutation strategy is designed to balance the diversity and convergence of the population; secondly, the covariance learning matrix is constructed by selecting the individuals with less correlation; then, a comprehensive learning mechanism is designed from two covariance matrix learning mechanisms based on the principle of probability. Finally, the algorithm is tested on the CEC2005 benchmark, and the experimental results are compared with those of other effective differential evolution algorithms. The experimental results show that the proposed algorithm is effective.
Funding: Financially supported by the National Natural Science Foundation of China (11202081, 11272124, and 11472109) and the State Key Lab of Subtropical Building Science, South China University of Technology (2014ZC17).
Abstract: The global look-up table strategy proposed recently has been proven to be an efficient method to accelerate the interpolation, which is the most time-consuming part of iterative sub-pixel digital image correlation (DIC) algorithms. In this paper, a global look-up table strategy with cubic B-spline interpolation is developed for the DIC method based on the inverse compositional Gauss–Newton (IC-GN) algorithm. The performance of this strategy, including accuracy, precision, and computational efficiency, is evaluated through a theoretical and experimental study, using the widely employed bicubic interpolation as a benchmark. The global look-up table strategy with cubic B-spline interpolation significantly improves the accuracy of the IC-GN algorithm-based DIC method compared with bicubic interpolation, at a trivial cost in computational efficiency.
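The look-up table idea can be shown in a 1-D reduction: the cubic B-spline basis weights are precomputed once on a fine sub-pixel grid, so every interpolation query becomes a table read plus four multiply-adds. This sketch assumes the input samples already hold B-spline coefficients (the prefilter step is omitted), and the grid resolution and names are our choices, not the paper's implementation:

```python
def bspline_weights(t):
    """Cubic B-spline basis weights for fractional position t in [0, 1)."""
    return [
        (1 - t) ** 3 / 6,
        (3 * t ** 3 - 6 * t ** 2 + 4) / 6,
        (-3 * t ** 3 + 3 * t ** 2 + 3 * t + 1) / 6,
        t ** 3 / 6,
    ]

# Global look-up table: weights tabulated once on a fine sub-pixel grid,
# then reused for every interpolation query during the IC-GN iterations.
STEPS = 1000
LUT = [bspline_weights(k / STEPS) for k in range(STEPS)]

def interp(coeffs, x):
    """Interpolate a 1-D array of B-spline COEFFICIENTS at position x
    using the precomputed weight table (1-D reduction of the 2-D case;
    requires 1 <= x < len(coeffs) - 2)."""
    i = int(x)
    w = LUT[int((x - i) * STEPS)]
    return sum(w[k] * coeffs[i - 1 + k] for k in range(4))
```

Evaluating the cubic polynomials dominates naive interpolation; moving that work into a one-time table is what makes the per-iteration cost of the B-spline variant comparable to bicubic.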
Funding: Supported by the Center for Mining, Electro-Mechanical Research of Hanoi University of Mining and Geology (HUMG), Hanoi, Vietnam.
Abstract: In mining or construction projects, blasting is frequently applied to break or move hard rock with high strength properties using high explosive energy. However, the use of explosives may lead to the flyrock phenomenon. Flyrock can damage structures or nearby equipment in the surrounding areas and inflict harm on humans, especially workers on site. Thus, prediction of flyrock is of high importance. In this investigation, estimation of the flyrock distance induced by blasting was carried out through the application of five artificial intelligence algorithms. One hundred and fifty-two blasting events in three open-pit granite mines in Johor, Malaysia, were monitored to collect field data. The collected data include blasting parameters and rock mass properties: a site-specific weathering index (WI), the geological strength index (GSI), and the rock quality designation (RQD). A multi-layer perceptron (MLP), random forest (RF), support vector machine (SVM), and hybrid models, namely a Harris Hawks optimization-based MLP (HHO-MLP) and a whale optimization algorithm-based MLP (WOA-MLP), were developed. The performance of the models was assessed through various performance indices, including the a10-index, coefficient of determination (R^(2)), root mean squared error (RMSE), mean absolute percentage error (MAPE), variance accounted for (VAF), and root squared error (RSE). The a10-index values for MLP, RF, SVM, HHO-MLP and WOA-MLP are 0.953, 0.933, 0.937, 0.991 and 0.972, respectively. The R^(2) of HHO-MLP is 0.998, the best performance among all five machine learning (ML) models.
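Three of the performance indices above can be computed as follows, using their standard definitions (a prediction counts toward the a10-index when it lies within ±10% of the measured value; the function names are ours):

```python
import math

def a10_index(actual, predicted):
    """Fraction of predictions within +-10% of the measured value."""
    hits = sum(1 for a, p in zip(actual, predicted)
               if a and 0.9 <= p / a <= 1.1)
    return hits / len(actual)

def rmse(actual, predicted):
    """Root mean squared error."""
    return math.sqrt(sum((a - p) ** 2
                         for a, p in zip(actual, predicted)) / len(actual))

def r2(actual, predicted):
    """Coefficient of determination (1 - SS_res / SS_tot)."""
    mean = sum(actual) / len(actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    ss_tot = sum((a - mean) ** 2 for a in actual)
    return 1 - ss_res / ss_tot
```

Unlike RMSE, the a10-index is scale-free and directly interpretable for flyrock safety: an a10-index of 0.991 means 99.1% of the HHO-MLP predictions fell within 10% of the measured distances.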
Abstract: Finding effective cancer treatment is a challenge, because the sensitivity of a cancer stems from both intrinsic cellular properties and resistances acquired from prior treatment. Previous research has revealed individual protein markers that are significant to chemosensitivity prediction. Our goal is to find correlated protein markers which are collectively significant to chemosensitivity prediction, to complement the individual markers already reported. To do this, we used the D' correlation measure to study feature-selection correlations for chemosensitivity prediction of 118 anticancer agents with putatively known mechanisms of action. Three datasets on the NCI-60 were utilized in this study: two protein datasets, one previously studied for chemosensitivity prediction and another novel to this topic, and one DNA copy number dataset. To validate our approach, we identified the protein markers that our analysis found strongly correlated with the individual protein markers reported in previous studies. Our feature analysis discovered highly correlated protein marker pairs, from which we found individual protein markers with medical significance. While some of the markers uncovered were consistent with those previously reported, others were original to this work. Using these marker pairs, we were able to further correlate the cellular functions associated with them. As an exploratory analysis, we discovered feature-selection correlation patterns between and within different drug mechanisms of action for each of our datasets. In conclusion, the highly correlated protein marker pairs and their functions found by our feature analysis are validated by previous studies and shown to be medically significant, demonstrating D' as an effective measure of correlation in the context of feature selection for the first time.
Funding: Funded by the Strategic Cooperation Technology Projects of CNPC and CUPB (ZLZX2020-03).
Abstract: In the early exploration of many oilfields, low-resistivity-low-contrast (LRLC) pay zones are easily overlooked due to their resistivity similarity to water zones. Existing identification methods are model-driven and cannot yield satisfactory results when the causes of LRLC pay zones are complicated. In this study, after analyzing a large number of core samples, the main causes of LRLC pay zones in the study area are discerned: a complex distribution of formation water salinity, high irreducible water saturation due to micropores, and high shale volume. Moreover, different oil testing layers may have different causes of LRLC pay zones. As a result, in addition to the well log data of the oil testing layers, well log data of adjacent shale layers are added to the original dataset as reference data. The density-based spatial clustering algorithm with noise (DBSCAN) is used to cluster the original dataset into 49 clusters, and the new dataset is thus projected into a feature space with 49 dimensions. The new dataset and oil testing results are respectively treated as input and output to train a multi-layer perceptron (MLP). A total of 3192 samples are used for stratified 8-fold cross-validation, and the accuracy of the MLP is found to be 85.53%.
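The clustering-then-projection step can be sketched with a minimal DBSCAN on 1-D points followed by a one-hot cluster-membership encoding. This is a toy O(n²) version on scalar data, whereas the study clusters multi-dimensional well-log samples into 49 clusters; the function names and the all-zeros encoding of noise points are our illustrative choices:

```python
def region(points, i, eps):
    """Indices of all points within eps of point i (O(n) scan)."""
    return [j for j, p in enumerate(points) if abs(p - points[i]) <= eps]

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: expand clusters from core points; -1 marks noise."""
    labels = [None] * len(points)   # None = unvisited
    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        neighbors = region(points, i, eps)
        if len(neighbors) < min_pts:
            labels[i] = -1          # not a core point -> noise (for now)
            continue
        cluster += 1
        labels[i] = cluster
        queue = [j for j in neighbors if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster  # noise reachable from a core: border
            if labels[j] is not None:
                continue
            labels[j] = cluster
            nb = region(points, j, eps)
            if len(nb) >= min_pts:   # core point: keep expanding
                queue.extend(nb)
    return labels

def one_hot(labels, k):
    """Project each sample into a k-dimensional cluster-membership vector
    (noise -> all zeros), the kind of feature space fed to the MLP."""
    return [[1.0 if l == c else 0.0 for c in range(k)] for l in labels]
```

Replacing raw log curves with cluster-membership features lets the downstream MLP reason about which data regime a sample belongs to, which is how the reference shale-layer data is made comparable across oil testing layers.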