Offboard active decoys (OADs) can effectively jam monopulse radars. However, for missiles approaching from a particular direction and distance, the OAD should be placed at a specific location, posing high requirements for timing and deployment. To improve the response speed and jamming effect, a cluster of OADs based on an unmanned surface vehicle (USV) is proposed. The formation of the cluster determines the effectiveness of jamming. First, based on the mechanism of OAD jamming, critical conditions are identified, and a method for assessing the jamming effect is proposed. Then, for the optimization of the cluster formation, a mathematical model is built, and a multi-tribe adaptive particle swarm optimization algorithm based on a mutation strategy and the Metropolis criterion (3M-APSO) is designed. Finally, the formation optimization problem is solved and analyzed using the 3M-APSO algorithm under specific scenarios. The results show that the improved algorithm has a faster convergence rate and superior performance compared with the standard Adaptive-PSO algorithm. Compared with a single OAD, the optimal formation of the USV-OAD cluster effectively fills the blind area and maximizes the use of jamming resources. Funding: the National Natural Science Foundation of China (Grant No. 62101579).
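As a rough illustration of the Metropolis ingredient in 3M-APSO, the following hedged sketch grafts a Metropolis acceptance rule onto a plain PSO personal-best update; the sphere objective, cooling schedule, and swarm coefficients are illustrative assumptions, not the paper's settings.

```python
# Hedged sketch: a Metropolis criterion grafted onto a PSO personal-best
# update. Objective, cooling schedule, and coefficients are stand-ins.
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):                       # placeholder objective
    return float(np.sum(x ** 2))

dim, n_particles, iters = 5, 20, 100
pos = rng.uniform(-5, 5, (n_particles, dim))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([sphere(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)].copy()

for t in range(iters):
    temp = 1.0 * (0.95 ** t)         # assumed geometric cooling schedule
    w = 0.9 - 0.5 * t / iters        # linearly decreasing inertia weight
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + 2.0 * r1 * (pbest - pos) + 2.0 * r2 * (gbest - pos)
    pos = pos + vel
    for i, p in enumerate(pos):
        f = sphere(p)
        delta = f - pbest_val[i]
        # Metropolis criterion: accept a worse pbest with prob exp(-delta/T),
        # letting particles escape local optima early in the run.
        if delta < 0 or rng.random() < np.exp(-delta / max(temp, 1e-12)):
            pbest[i], pbest_val[i] = p.copy(), f
    gbest = pbest[np.argmin(pbest_val)].copy()

print("best value:", pbest_val.min())
```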
Effective path planning is crucial for mobile robots to quickly reach rescue destinations and complete rescue tasks in a post-disaster scenario. In this study, we investigated the post-disaster rescue path planning problem and modeled it as a variant of the traveling salesman problem (TSP) with life-strength constraints. To address this problem, we proposed an improved iterated greedy (IIG) algorithm. First, a push-forward insertion heuristic (PFIH) strategy was employed to generate a high-quality initial solution. Second, a greedy-based insertion strategy was designed and used in the destruction-construction stage to increase the algorithm's exploration ability. Furthermore, three problem-specific swap operators were developed to improve the algorithm's exploitation ability. Additionally, an improved simulated annealing (SA) strategy was used as the acceptance criterion to effectively prevent the algorithm from falling into local optima. To verify the effectiveness of the proposed algorithm, the Solomon dataset was extended to generate 27 instances for simulation, and the proposed IIG was compared with five state-of-the-art algorithms. The parameters were tuned using the design-of-experiments (DOE) Taguchi method, and the effectiveness of each component was verified one by one. Simulation results indicate that IIG outperforms the compared algorithms in terms of the number of rescued survivors and convergence speed, proving the effectiveness of the proposed algorithm. Funding: the Opening Fund of the Shandong Provincial Key Laboratory of Network-based Intelligent Computing; the National Natural Science Foundation of China (52205529, 61803192); the Natural Science Foundation of Shandong Province (ZR2021QE195); the Youth Innovation Team Program of Shandong Higher Education Institutions (2023KJ206); and the Guangyue Youth Scholar Innovation Talent Program of Liaocheng University (LCUGYTD2022-03).
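The destruction-construction loop with SA acceptance that the abstract outlines can be sketched on a toy TSP as follows; the distance matrix, destruction size, and cooling rate are assumptions, and the paper's PFIH construction and life-strength constraints are not reproduced here.

```python
# Hedged sketch of an iterated-greedy loop: destroy part of a tour,
# greedily reinsert, accept with a simulated-annealing rule.
import numpy as np

rng = np.random.default_rng(1)
n = 15
pts = rng.random((n, 2))
D = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)

def tour_len(tour):
    return sum(D[tour[i], tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def greedy_insert(partial, removed):
    # construction step: reinsert each removed city at its cheapest position
    for c in removed:
        best_pos, best_cost = 0, float("inf")
        for i in range(len(partial) + 1):
            cost = tour_len(partial[:i] + [c] + partial[i:])
            if cost < best_cost:
                best_pos, best_cost = i, cost
        partial.insert(best_pos, c)
    return partial

tour = list(rng.permutation(n))
best_len, temp = tour_len(tour), 1.0
for it in range(300):
    d = 3                                    # assumed destruction size
    removed = rng.choice(n, size=d, replace=False).tolist()
    partial = [c for c in tour if c not in removed]
    cand = greedy_insert(partial, removed)
    delta = tour_len(cand) - tour_len(tour)
    if delta < 0 or rng.random() < np.exp(-delta / temp):  # SA acceptance
        tour = cand
        best_len = min(best_len, tour_len(tour))
    temp *= 0.99                             # assumed cooling rate

print("best tour length:", round(best_len, 4))
```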
Reducing casualties and property losses through effective evacuation route planning has been a key focus for researchers in recent years. As part of this effort, an enhanced sparrow search algorithm (MSSA) was proposed. Firstly, the Golden Sine algorithm and a nonlinear weight factor optimization strategy were added in the discoverer position update stage of the SSA. Secondly, a Cauchy-Gaussian perturbation was applied to the optimal position of the SSA to improve its ability to jump out of local optima. Finally, a local search mechanism based on the hill-climbing method was incorporated into the local search stage of the SSA, improving its local search ability. To evaluate the effectiveness of the proposed algorithm, the Whale Algorithm, Gray Wolf Algorithm, Improved Gray Wolf Algorithm, Sparrow Search Algorithm, and MSSA were employed to solve various test functions, and the accuracy and convergence speed of each algorithm were compared and analyzed. The results indicate that the MSSA has superior solving ability and stability compared with the other algorithms. To further validate the enhanced algorithm's capability for path planning, evacuation experiments were conducted on different maps featuring various obstacle types. Additionally, a multi-exit evacuation scenario was constructed according to the actual layout of a teaching building, and both the sparrow search algorithm and the MSSA were employed in the simulation experiment for multi-exit evacuation path planning. The findings demonstrate that the MSSA outperforms the comparison algorithm, showcasing greater advantages and higher application potential. Funding: the National Natural Science Foundation of China (71904006); the Henan Province Key R&D Special Project (231111322200); the Science and Technology Research Plan of Henan Province (232102320043, 232102320232, 232102320046); and the Natural Science Foundation of Henan (232300420317, 232300420314).
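A common form of the Cauchy-Gaussian perturbation applied to the current best position is sketched below; the linear mixing schedule between the two distributions is an assumption rather than the paper's exact formula.

```python
# Hedged sketch of a Cauchy-Gaussian perturbation of the best position,
# a common way to help SSA-style optimizers escape local optima.
import numpy as np

rng = np.random.default_rng(2)

def cauchy_gaussian_perturb(x_best, t, t_max):
    # early iterations favor heavy-tailed Cauchy jumps (exploration),
    # later iterations favor small Gaussian moves (exploitation)
    lam = t / t_max                    # assumed linear schedule
    cauchy = rng.standard_cauchy(x_best.shape)
    gauss = rng.standard_normal(x_best.shape)
    return x_best * (1.0 + (1.0 - lam) * cauchy + lam * gauss)

x_best = np.array([0.5, -1.2, 3.0])
for t in [1, 50, 100]:
    print(t, cauchy_gaussian_perturb(x_best, t, 100))
```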
Cloud computing technology harnesses remote virtual computing resources to provide consumers with rapid and accurate massive-data services. It relies on on-demand resource provisioning, but the constraints of rapid turnaround time, minimal execution cost, a high rate of resource utilization, and limited makespan turn the load balancing (LB)-based task scheduling (TS) problem into an NP-hard optimization issue. In this paper, a Hybrid Prairie Dog and Beluga Whale Optimization Algorithm (HPDBWOA) is proposed for precisely mapping tasks to virtual machines while addressing the dynamic nature of the cloud environment. This capability of HPDBWOA helps decrease SLA violations and makespan with optimal resource management. It is modeled as a scheduling strategy that combines the merits of PDOA and BWOA to make reactive decisions when assigning tasks to virtual resources, taking their priorities into account. It addresses premature convergence through well-balanced exploration and exploitation to attain the necessary Quality of Service (QoS) and minimize the waiting time incurred during the TS process, and it further balances exploration and exploitation rates to reduce the makespan during task allocation with complete awareness of VM state. The results confirmed that the proposed HPDBWOA reduced energy utilization by 32.18% and cost by 28.94% compared with the approaches used for investigation. A statistical investigation conducted using ANOVA confirmed its efficacy over the benchmarked systems in terms of throughput, system time, and response time.
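To make the objective concrete, here is a hedged sketch of the makespan a task-scheduling metaheuristic such as HPDBWOA would minimize; the task lengths and VM speeds are invented placeholders, not data from the paper.

```python
# Hedged sketch of the makespan objective in task-to-VM scheduling:
# a candidate assignment's cost is the heaviest VM's completion time.
import numpy as np

rng = np.random.default_rng(6)
task_len = rng.uniform(100, 1000, 20)     # instructions per task (made up)
vm_speed = np.array([50.0, 75.0, 100.0])  # instructions/second per VM (made up)

def makespan(assign):
    loads = np.zeros(len(vm_speed))
    for t, v in enumerate(assign):
        loads[v] += task_len[t] / vm_speed[v]
    return loads.max()

assign = rng.integers(0, len(vm_speed), task_len.size)  # a candidate solution
print("makespan:", round(makespan(assign), 2), "s")
```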
With the rapid development of digital information technology, images are increasingly used in various fields. To ensure the security of image data, prevent unauthorized tampering and leakage, maintain personal privacy, and protect intellectual property rights, this study proposes an innovative color image encryption algorithm. Initially, the Mersenne Twister algorithm is utilized to generate high-quality pseudo-random numbers, establishing a robust basis for subsequent operations. Subsequently, two distinct chaotic systems, an autonomous non-Hamiltonian chaotic system and the tent-logistic-cosine chaotic map, are employed to produce chaotic random sequences. These chaotic sequences control the DNA encoding and decoding process, effectively scrambling the image pixels. Furthermore, the complexity of the encryption process is enhanced through improved Joseph block scrambling. Through experimental verification, research, and analysis, the average value of the information entropy test data reaches as high as 7.999. Additionally, the average number of pixels change rate (NPCR) is 99.6101%, which closely approaches the ideal value of 99.6094%. This algorithm not only guarantees image quality but also substantially raises the difficulty of decryption. Funding: the Open Fund of the Advanced Cryptography and System Security Key Laboratory of Sichuan Province (Grant No. SKLACSS-202208); the Natural Science Foundation of Chongqing (Grant No. CSTB2023NSCQLZX0139); and the National Natural Science Foundation of China (Grant No. 61772295).
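The quoted NPCR figure can be reproduced in a few lines; for 8-bit pixels the expected value between two independent random cipher images is (1 - 2^-8) * 100, which is approximately 99.6094%, matching the cited ideal.

```python
# Sketch of the NPCR (number of pixels change rate) test quoted above.
# The random images below are stand-ins for two cipher images whose
# plaintexts differ in a single pixel.
import numpy as np

def npcr(c1, c2):
    # fraction of pixel positions whose cipher values differ, in percent
    return 100.0 * np.mean(c1 != c2)

rng = np.random.default_rng(3)
c1 = rng.integers(0, 256, (256, 256), dtype=np.uint8)
c2 = rng.integers(0, 256, (256, 256), dtype=np.uint8)
print(f"NPCR = {npcr(c1, c2):.4f}%")   # close to 99.61% for independent ciphers
```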
Lung cancer is among the most frequent cancers in the world, with over one million deaths per year. Classification is required for lung cancer diagnosis and therapy to be effective, accurate, and reliable. Gene expression microarrays have made it possible to find genetic biomarkers for cancer diagnosis and prediction in a high-throughput manner. Machine learning (ML) has been widely used to diagnose and classify lung cancer, where the performance of ML methods is evaluated to identify the appropriate technique. Identifying and selecting gene expression patterns can help in lung cancer diagnosis and classification, but microarrays normally include a large number of genes and may cause confusion or false prediction. Therefore, the Arithmetic Optimization Algorithm (AOA) is used to identify the optimal gene subset and reduce the number of selected genes, which allows the classifiers to yield the best performance for lung cancer classification. In addition, we proposed a modified version of AOA that works effectively on high-dimensional datasets. In the modified AOA, the features are ranked by their weights, and the ranking is used to initialize the AOA population. The exploitation process of AOA is then enhanced by developing a local search algorithm based on two neighborhood strategies. Finally, the efficiency of the proposed methods was evaluated on gene expression datasets related to lung cancer using stratified 4-fold cross-validation. The method's efficacy in selecting the optimal gene subset is underscored by its ability to keep the proportion of selected features between 10% and 25%. Moreover, the approach significantly enhances lung cancer prediction accuracy: Lung_Harvard1 achieved an accuracy of 97.5%, Lung_Harvard2 and Lung_Michigan both achieved 100%, Lung_Adenocarcinoma obtained 88.2%, and Lung_Ontario achieved 87.5%. In conclusion, the results indicate the promise of the proposed modified AOA approach for classifying microarray cancer data. Funding: the Deanship of Scientific Research at Imam Abdulrahman Bin Faisal University (Grant No. 2019-416-ASCS).
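One plausible reading of the weight-ranked initialization is sketched below: features receive a score, and the initial binary population is biased toward high-ranked features; the scoring function and bias curve are assumptions, not the paper's definitions.

```python
# Hedged sketch of weight-ranked population initialization for binary
# feature selection. Feature scores here are random stand-ins.
import numpy as np

rng = np.random.default_rng(4)
n_features, pop_size = 200, 30
weights = rng.random(n_features)            # stand-in feature scores
ranks = np.argsort(np.argsort(-weights))    # 0 = best-ranked feature
# selection probability decays with rank, so strong features seed the population
p_select = 0.5 * (1 - ranks / n_features) + 0.05
population = (rng.random((pop_size, n_features)) < p_select).astype(int)
print("mean subset size:", population.sum(axis=1).mean())
```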
The escalating deployment of distributed power sources and random loads in DC distribution networks has amplified the potential consequences of faults if left uncontrolled. To expedite the process of achieving an optimal configuration of measurement points, this paper presents an optimal configuration scheme for fault-location measurement points in DC distribution networks based on an improved particle swarm optimization algorithm. Initially, a measurement point distribution optimization model is formulated, leveraging compressive sensing. The model aims to achieve the minimum number of measurement points while attaining the best compressive sensing reconstruction effect, and it incorporates constraints from the compressive sensing algorithm and network-wide observability. Subsequently, the traditional particle swarm algorithm is enhanced by using the Halton sequence for population initialization, generating uniformly distributed individuals. This enhancement reduces individual search blindness and overlap probability, thereby promoting population diversity. Furthermore, an adaptive t-distribution perturbation strategy is introduced during the particle update process to enhance the global search capability and search speed. The established model for the optimal configuration of measurement points is solved, and the results demonstrate the efficacy and practicality of the proposed method: the optimal configuration reduces the number of measurement points, enhances localization accuracy, and improves the convergence speed of the algorithm. Funding: the National Natural Science Foundation of China (52177074).
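The Halton-sequence initialization is a standard low-discrepancy construction and can be sketched directly; the prime bases and search bounds below are conventional choices, not values from the paper.

```python
# Sketch of Halton-sequence population initialization: each dimension uses
# the radical inverse of a distinct prime base, giving a more uniform
# spread than pseudo-random sampling.
import numpy as np

def radical_inverse(i, base):
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def halton(n_points, dims, bases=(2, 3, 5, 7, 11, 13)):
    return np.array([[radical_inverse(i + 1, bases[d]) for d in range(dims)]
                     for i in range(n_points)])

# map unit-cube points to assumed search bounds [lo, hi]
lo, hi = -10.0, 10.0
swarm = lo + (hi - lo) * halton(20, 2)
print(swarm[:5])
```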
The flexible job shop scheduling problem (FJSP) is the core decision-making problem of intelligent manufacturing production management. The Harris hawk optimization (HHO) algorithm, as a typical metaheuristic, has been widely employed to solve scheduling problems. However, HHO suffers from premature convergence when solving NP-hard problems. Therefore, this paper proposes an improved HHO algorithm (GNHHO) to solve the FJSP. GNHHO introduces an elitism strategy, a chaotic mechanism, a nonlinear escaping-energy update strategy, and a Gaussian random walk strategy to prevent premature convergence. A flexible job shop scheduling model is constructed, and both the static and the dynamic FJSP are investigated to minimize the makespan. This paper adopts a two-segment encoding based on the jobs and the machines of the FJSP. To verify the effectiveness of GNHHO, this study tests it on 23 benchmark functions, 10 standard job shop scheduling problems (JSPs), and 5 standard FJSPs. In addition, this study collects data from an agricultural company and uses the GNHHO algorithm to optimize the company's FJSP. The optimized scheduling scheme demonstrates significant improvements in makespan, with reductions of 28.16% for static scheduling and 35.63% for dynamic scheduling, and achieves an average increase of 21.50% in the on-time order delivery rate. The results demonstrate that the performance of the GNHHO algorithm in solving the FJSP is superior to some existing algorithms.
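The escaping-energy idea can be illustrated by contrasting HHO's standard linear decay with a nonlinear schedule of the kind the abstract mentions; the cosine form below is an assumption, since GNHHO's exact update is not given here.

```python
# Hedged sketch: HHO escaping-energy schedules. |E| >= 1 triggers
# exploration, |E| < 1 triggers exploitation; a slower early decay keeps
# hawks exploring longer and delays convergence.
import numpy as np

rng = np.random.default_rng(0)
T = 100                                    # max iterations
E0 = rng.uniform(-1, 1, T)                 # random initial energy each step
t = np.arange(T)
E_linear = 2 * E0 * (1 - t / T)            # standard HHO schedule
E_nonlin = 2 * E0 * np.cos(0.5 * np.pi * t / T)  # assumed nonlinear schedule
print("mean |E| linear   :", np.abs(E_linear).mean())
print("mean |E| nonlinear:", np.abs(E_nonlin).mean())
```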
A hard problem that hinders the movement of waxy crude oil is wax deposition in oil pipelines. To ensure the safe operation of crude oil pipelines, an accurate model must be developed to predict the rate of wax deposition. Aiming at the shortcomings of the ENN prediction model, which easily falls into local minima and has weak generalization ability, an optimized ENN prediction model based on the IRSA is proposed. The validity of the new model was confirmed by the accurate prediction of two sets of experimental data on wax deposition in crude oil pipelines. The prediction results for the two groups of crude oil wax deposition rate cases show that the average absolute percentage errors of the IRSA-ENN prediction model are 0.5476% and 0.7831%, respectively, a higher prediction accuracy than that of the ENN prediction model. By using the IRSA to optimize the ENN, the new model optimizes the initial weights and thresholds in the prediction process, overcoming the ENN model's weak generalization ability and tendency to fall into local minima, so that it offers strong practicability and high prediction accuracy.
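The error metric behind the quoted figures is the mean absolute percentage error; a minimal sketch with placeholder deposition-rate data:

```python
# Sketch of the mean absolute percentage error (MAPE) used to report the
# IRSA-ENN results; y_true/y_pred are placeholder wax-deposition rates.
import numpy as np

def mape(y_true, y_pred):
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))

y_true = np.array([1.20, 0.85, 1.05])
y_pred = np.array([1.21, 0.84, 1.06])
print(f"MAPE = {mape(y_true, y_pred):.4f}%")
```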
When applying Grover's algorithm to an unordered database, the probability of obtaining correct results usually decreases as the number of targets increases. A four-phase improvement of Grover's algorithm is proposed to fix this deficiency, and the corresponding unitary operator and phase-matching condition are also given. With the improved scheme, when the proportion of targets is over 1/3, the probability of obtaining correct results is greater than 97.82% with only one iteration using two phases. When the computational complexity is O(√(N/M)), the algorithm succeeds with a probability no less than 99.63%. Funding: the National Basic Research Program of China (973 Program) (Grant No. 2013CB338002) and the National Natural Science Foundation of China (Grant No. 11504430).
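For context, the standard Grover relations behind these figures are given below in hedged reconstruction; the paper's modified four-phase operator itself is not reproduced.

```latex
% Standard Grover relations underlying the quoted figures: with M marked
% items among N, define theta by
\[
  \sin^2\theta = \frac{M}{N},
\]
% after k standard iterations the success probability and the optimal
% iteration count are
\[
  P_{\mathrm{success}}(k) = \sin^2\!\big((2k+1)\theta\big),
  \qquad
  k_{\mathrm{opt}} \approx \frac{\pi}{4}\sqrt{\frac{N}{M}}
  = O\!\left(\sqrt{N/M}\right).
\]
```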
Dense and accurate measurement of 3D texture is helpful in evaluating pavement function. To form dense mandatory constraints and improve matching accuracy, traditional binocular reconstruction technology was improved in three ways. First, a single moving laser line was introduced to impose global scanning constraints on the target, which overcomes the difficulty of installing and recognizing excessive laser lines. Second, four improved algorithms, namely disparity replacement, superposition synthesis, subregion segmentation, and subregion segmentation centroid enhancement, were established based on different constraint mechanisms. Last, an improved binocular reconstruction test device was developed to realize the dual functions of 3D texture measurement and precision self-evaluation. Results show that, compared with traditional algorithms, the introduction of a single laser line scanning constraint improves measurement accuracy. Among the improved algorithms, the subregion segmentation centroid enhancement method performs best: it works well for both overall measurement and single-point measurement and can be considered for use in pavement function evaluation. Funding: the National Natural Science Foundation of China (52178422); the Doctoral Research Foundation of Hubei University of Arts and Science (2059047); and the National College Students' Innovation and Entrepreneurship Training Program (202210519021).
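The centroid idea behind the best-performing variant can be illustrated with a gray-weighted, sub-pixel laser-line extraction on a synthetic stripe; the synthetic image and threshold are stand-ins for real stripe data.

```python
# Hedged sketch of gray-weighted centroid extraction for a laser line:
# for each image column, the line position is the intensity-weighted
# mean row, giving sub-pixel localization.
import numpy as np

rng = np.random.default_rng(7)
img = rng.uniform(0, 10, (100, 50))      # background noise (stand-in)
for c in range(50):
    img[48 + c % 3, c] += 200            # bright synthetic laser stripe
img[img < 50] = 0                        # suppress background (assumed threshold)
rows = np.arange(100)
centroids = (img * rows[:, None]).sum(axis=0) / img.sum(axis=0)
print(centroids[:5])                     # sub-pixel stripe row per column
```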
Web quality of service (QoS) awareness requires not only the selection of specific services to complete specific tasks, but also consideration of the comprehensive quality of service of the whole web service composition. Selecting the web service composition with the highest comprehensive QoS is an NP-hard problem. In this paper, an improved multi-population genetic algorithm is proposed. A cosine adaptive operator is added to the algorithm to avoid the premature convergence caused by improper genetic operators and the drawback of destroying excellent individuals in later stages. Experimental results show that, compared with the common genetic algorithm and the multi-population genetic algorithm, this algorithm consumes less time and achieves higher accuracy, and it effectively avoids the loss of effective genes in the population.
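A typical cosine adaptive operator varies crossover or mutation rates smoothly with an individual's fitness relative to the population, so strong individuals are disturbed less in later stages; the rate bounds below are assumptions.

```python
# Hedged sketch of a cosine-adaptive crossover/mutation rate (maximizing
# fitness). Below-average individuals get the maximum rate; above-average
# individuals get a rate that falls along a cosine curve.
import numpy as np

def cosine_adaptive_rate(f, f_avg, f_max, r_min=0.4, r_max=0.9):
    if f <= f_avg or f_max == f_avg:
        return r_max
    ratio = (f - f_avg) / (f_max - f_avg)          # in (0, 1]
    return r_min + (r_max - r_min) * 0.5 * (1 + np.cos(np.pi * ratio))

for f in [0.2, 0.5, 0.8, 1.0]:
    print(f, round(cosine_adaptive_rate(f, 0.5, 1.0), 3))
```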
Symmetric workpiece localization algorithms combine alternating optimization and linearization. The iterative variables are partitioned into two groups so that simple optimization approaches can be employed for each subset of variables; in particular, optimization of the configuration variables is simplified to a linear least-squares problem (LSP). The convergence of current symmetric localization algorithms is discussed first, and it is shown that simply taking the solution of the LSP as the start of the next iteration may result in divergence or incorrect convergence. Therefore, in our enhanced algorithms, a line search is performed along the solution of the LSP to find a better point that reduces the value of the objective function, and this point is chosen as the start of the next iteration. Better convergence is verified by numerical simulation. In addition, imposing boundary constraints on the LSP proves to be another efficient improvement. Funding: the "973" National Fundamental Research Program (51332).
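The proposed safeguard, a line search toward the LSP solution instead of jumping straight to it, can be sketched as follows; the stand-in objective adds a small nonlinear penalty to a least-squares term so that the full step can overshoot.

```python
# Hedged sketch: backtracking line search along the step toward the
# least-squares solution, accepting only points that lower the objective.
import numpy as np

rng = np.random.default_rng(5)
A = rng.random((20, 4)); b = rng.random(20)

def obj(x):
    # stand-in objective: least-squares term plus a small nonlinear penalty
    return float(np.sum((A @ x - b) ** 2) + 0.5 * np.sum(x ** 4))

x = np.zeros(4)
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)   # solution of the linearized LSP
for it in range(6):
    d = x_ls - x                               # step toward the LSP solution
    alpha, f0 = 1.0, obj(x)
    # shrink the step until the objective actually drops
    while alpha > 1e-8 and obj(x + alpha * d) >= f0:
        alpha *= 0.5
    if alpha <= 1e-8:
        break                                  # no descent along d; stop
    x = x + alpha * d
    print(it, round(obj(x), 6))
```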
Community detection methods have been used in computer science, sociology, physics, biology, and brain information science. Many methods are based on the optimization of modularity. The algorithm proposed by Blondel et al. (Blondel V D, Guillaume J L, Lambiotte R and Lefebvre E 2008 J. Stat. Mech. 10 10008) is one of the most widely used methods because of its good performance, especially in the big data era. In this paper we improve this algorithm in both correctness and performance. Tests show that different node orders yield different performances and different community structures, and we find that some nodes swing between communities, which influences the performance. We therefore design strategies for the node sweeping order that reduce the computing cost caused by repeated swings. We also introduce a new concept, the overlapping degree (OV), which measures the strength of connection between nodes. Three improvement strategies are proposed, based on constant OV, adaptive OV, and adaptive weighted OV, respectively. Experiments on synthetic and real datasets show that our improved strategies can improve both performance and correctness. Funding: the Major State Basic Research Development Program of China (Grant Nos. 2013CB329602 and 2012CB316303); the Science Research Foundation for the Returned Overseas Chinese Scholars, China (Grant No. 2010-31); the International Collaborative Project of Shanxi Province, China (Grant No. 2011081034); and the National Natural Science Foundation of China (Grant Nos. 61232010 and 61202215).
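The modularity these methods optimize is standard; a minimal sketch on two triangles joined by an edge shows why the natural split scores higher. The node-ordering strategies studied in the paper change the sequence in which such modularity gains are evaluated.

```python
# Sketch of the modularity Q = (1/2m) * sum_ij [A_ij - k_i k_j / 2m] * delta,
# evaluated for two candidate partitions of a small graph.
import numpy as np

def modularity(adj, labels):
    m = adj.sum() / 2.0
    k = adj.sum(axis=1)
    q = 0.0
    n = len(labels)
    for i in range(n):
        for j in range(n):
            if labels[i] == labels[j]:
                q += adj[i, j] - k[i] * k[j] / (2 * m)
    return q / (2 * m)

# two triangles joined by one edge: the natural split scores higher
adj = np.zeros((6, 6))
for a, b in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    adj[a, b] = adj[b, a] = 1
print(modularity(adj, [0, 0, 0, 1, 1, 1]))   # good split, about 0.357
print(modularity(adj, [0, 0, 1, 1, 0, 1]))   # poor split, negative
```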
An improved parallel weighted bit-flipping (PWBF) algorithm is presented. To accelerate the information exchange between check nodes and variable nodes, the bit-flipping step and the check-node updating step of the original algorithm are parallelized. Simulation experiments demonstrate that the improved PWBF algorithm provides about 0.1 to 0.3 dB coding gain over the original PWBF algorithm and achieves a higher convergence rate. The choice of the threshold, which determines whether a bit should be flipped during each iteration, is also discussed: an appropriate threshold ensures that most erroneous bits are flipped while the correct ones are left untouched. The improvement is particularly effective for decoding quasi-cyclic low-density parity-check (QC-LDPC) codes. Funding: the National High Technology Research and Development Program of China (863 Program) (Nos. 2009AA01Z235 and 2006AA01Z263) and the Research Fund of the National Mobile Communications Research Laboratory of Southeast University (No. 2008A10).
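One weighted bit-flipping iteration can be sketched on a toy parity-check matrix; the threshold rule below (flip every bit whose metric is near the maximum) is an assumed stand-in for the threshold the paper discusses, and parallel threshold-based flipping is what distinguishes PWBF from serial, one-bit-at-a-time WBF.

```python
# Hedged sketch of weighted bit-flipping decoding on a toy code. Each
# check's reliability weight is the smallest channel magnitude among its
# bits; unsatisfied weighted checks raise a bit's flip metric.
import numpy as np

H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])                 # toy parity-check matrix
y = np.array([-0.9, 0.3, -1.1, 0.2, 1.4, -0.7])    # received soft values
bits = (y < 0).astype(int)                         # hard decisions

for it in range(10):
    syndrome = H @ bits % 2                        # 1 = unsatisfied check
    if not syndrome.any():
        break
    # weight of each check: least reliable (smallest |y|) participating bit
    w = np.array([np.min(np.abs(y)[H[m] == 1]) for m in range(H.shape[0])])
    # flip metric: unsatisfied checks add their weight, satisfied subtract
    E = ((2 * syndrome - 1) * w) @ H
    theta = 0.9 * E.max()                          # assumed flipping threshold
    bits[E >= theta] ^= 1                          # parallel flip of high-metric bits

print("decoded bits:", bits, "syndrome:", H @ bits % 2)
```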
A fault diagnosis model for FMS based on multi-layer feedforward neural networks is discussed. An improved BP algorithm, a tactic for initial value selection based on a genetic algorithm, and a method of network structure optimization are presented for training this model. An ANN (artificial neural network) fault diagnosis model for the robot in FMS was built with the new algorithm. The result is superior to the traditional algorithm.
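A hedged sketch of GA-based selection of initial BP weights: evolve candidate weight vectors against the network loss, then hand the fittest vector to gradient training. The tiny network, data, and GA settings are placeholders, not the paper's configuration.

```python
# Hedged sketch: a genetic algorithm searches for good initial weights of a
# small feedforward network before BP training would take over.
import numpy as np

rng = np.random.default_rng(8)
X = rng.random((30, 4))
Y = (X.sum(axis=1) > 2).astype(float).reshape(-1, 1)   # toy labels
n_w = 4 * 6 + 6 * 1                                    # weights of a 4-6-1 net

def loss(w):
    W1 = w[:24].reshape(4, 6)
    W2 = w[24:].reshape(6, 1)
    H = np.tanh(X @ W1)
    out = 1 / (1 + np.exp(-(H @ W2)))                  # sigmoid output
    return float(np.mean((out - Y) ** 2))

pop = rng.normal(0, 1, (40, n_w))
for gen in range(30):
    fit = np.array([loss(w) for w in pop])
    elite = pop[np.argsort(fit)[:10]]                  # elitist selection
    children = elite[rng.integers(0, 10, 30)] + rng.normal(0, 0.1, (30, n_w))
    pop = np.vstack([elite, children])                 # elitism + mutation

best = pop[np.argmin([loss(w) for w in pop])]
print("GA-selected initial loss:", round(loss(best), 4))
```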