Funding: Supported by the Hunan Provincial Natural Science Foundation of China (2022JJ30103), the "14th Five-Year" Key Disciplines and Application-Oriented Special Disciplines of Hunan Province (Xiangjiaotong [2022], 351), and the Science and Technology Innovation Program of Hunan Province (2016TP1020).
Abstract: Correlation power analysis (CPA) combined with genetic algorithms (GA) now achieves greater attack efficiency and can recover all subkeys simultaneously. However, two issues in GA-based CPA still need to be addressed: key degeneration and slow evolution within populations. These challenges significantly hinder key recovery efforts. This paper proposes a screening correlation power analysis framework combined with a genetic algorithm, named SFGA-CPA, to address these issues. SFGA-CPA introduces three operations designed to exploit CPA characteristics: a propagative operation, constrained crossover, and constrained mutation. Firstly, the propagative operation accelerates population evolution by maximizing the number of correct bytes in each individual. Secondly, the constrained crossover and mutation operations effectively address key degeneration by preventing the compromise of correct bytes. Finally, an intelligent search method is proposed to identify optimal parameters, further improving attack efficiency. Experiments were conducted on both simulated environments and real power traces collected from the SAKURA-G platform. In the simulated case, SFGA-CPA reduces the number of traces by 27.3% and 60% compared with CPA based on multiple screening methods (MS-CPA) and CPA based on a simple GA (SGA-CPA) when the success rate reaches 90%. Moreover, real experimental results on the SAKURA-G platform demonstrate that our approach outperforms the other methods.
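To make the attack model concrete, the sketch below shows how a correlation-based fitness for one candidate AES subkey byte is commonly computed in GA-based CPA: a Hamming-weight leakage hypothesis for the first-round S-box output is correlated against the measured traces. This is a generic illustration, not SFGA-CPA itself, and the `SBOX` table is a placeholder that would be replaced by the standard AES S-box.

```python
import numpy as np

SBOX = np.arange(256, dtype=np.uint8)  # placeholder; substitute the real AES S-box
HW = np.array([bin(i).count("1") for i in range(256)], dtype=float)  # Hamming weights

def cpa_fitness(key_byte, plaintexts, traces):
    """Max |Pearson correlation| between the leakage hypothesis and the traces.

    plaintexts: (n,) uint8 values of the targeted plaintext byte
    traces:     (n, n_samples) measured power traces
    """
    hyp = HW[SBOX[plaintexts ^ key_byte]]          # hypothetical power consumption
    hyp = hyp - hyp.mean()
    t = traces - traces.mean(axis=0)
    corr = (hyp @ t) / (np.linalg.norm(hyp) * np.linalg.norm(t, axis=0) + 1e-12)
    return np.max(np.abs(corr))

# A GA individual concatenates 16 such bytes; summing the per-byte correlations
# is one plausible fitness that rewards every additional correct byte.
```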
Abstract: When designing solar systems and assessing the effectiveness of their many uses, estimating solar irradiance is a crucial first step. This study examined three approaches (ANN, GA-ANN, and ANFIS) for estimating daily global solar radiation (GSR) in the south of Algeria: Adrar, Ouargla, and Bechar. The proposed hybrid GA-ANN model, based on genetic-algorithm optimization, was developed to improve the ANN model. The GA-ANN and ANFIS models performed better than the standalone ANN-based model, with GA-ANN being better suited for forecasting at all sites; it performed best in the testing phase, with a coefficient of determination (R = 0.9005), mean absolute percentage error (MAPE = 8.40%), and relative root mean square error (rRMSE = 12.56%). Nevertheless, the ANFIS model outperformed the GA-ANN model in forecasting daily GSR, with the best indicator values when testing the model being R = 0.9374, MAPE = 7.78%, and rRMSE = 10.54%. Generally, we may conclude that the performance of the initial standalone ANN model when forecasting solar radiation has been improved, and the results obtained after injecting the genetic algorithm into the ANN to optimize its weights were satisfactory. The model can be used to forecast daily GSR in dry and other climates and may also be helpful in selecting solar energy system installations and sizes.
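For reference, the sketch below shows the standard definitions of the error metrics quoted above (MAPE, rRMSE, and the correlation-based R); the paper may use slight variants, so treat this as an illustrative convention rather than the authors' exact formulas.

```python
import numpy as np

def evaluation_metrics(y_true, y_pred):
    """Common GSR forecasting metrics under their usual definitions."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mape = 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))                 # %
    rrmse = 100.0 * np.sqrt(np.mean((y_true - y_pred) ** 2)) / y_true.mean()   # %
    r = np.corrcoef(y_true, y_pred)[0, 1]                                      # the abstract's R
    return r, mape, rrmse
```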
Funding: Research Supporting Project Number (RSPD2023R 585), King Saud University, Riyadh, Saudi Arabia.
Abstract: Side lobe level (SLL) reduction in antenna arrays significantly enhances the signal-to-interference ratio and improves the quality of service (QoS) in recent and future wireless communication systems, from 5G up to 7G. Furthermore, it improves the array gain and directivity, increasing the detection range and angular resolution of radar systems. This study proposes two highly efficient SLL reduction techniques. They are based on hybridizing either the single-convolution or the double-convolution algorithm with the genetic algorithm (GA), yielding Conv/GA and DConv/GA, respectively. The convolution process determines the element excitations while the GA optimizes the element spacing. For an M-element linear antenna array (LAA), the convolution of the excitation-coefficient vector with itself provides a new excitation vector of length N = 2M − 1. This new vector is divided into three different sets of excitations: the odd excitations, even excitations, and middle excitations, of lengths M, M − 1, and M, respectively. When the same element spacing as the original LAA is used, the odd and even excitations provide a much lower SLL than that of the LAA but with a much wider half-power beamwidth (HPBW), while the middle excitations give the same HPBW as the original LAA with a relatively higher SLL. To mitigate the increased HPBW of the odd and even excitations, the element spacing is optimized using the GA. Thereby, the synthesized arrays have the same HPBW as the original LAA with a two-fold reduction in the SLL. Furthermore, for extreme SLL reduction, DConv/GA is introduced. In this technique, the same procedure as the aforementioned Conv/GA is performed on the resulting even and odd excitation vectors. It provides a relatively wider HPBW than the original LAA with about a quad-fold reduction in the SLL.
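The self-convolution step described above is easy to reproduce numerically. The minimal sketch below takes a uniform 4-element excitation (M = 4), convolves it with itself to obtain the length N = 2M − 1 = 7 vector, and extracts the odd, even, and middle excitation sets; the exact indexing of the "middle" set is our reading of the abstract, not code from the paper.

```python
import numpy as np

M = 4
w = np.ones(M)                 # original uniform LAA excitations
c = np.convolve(w, w)          # length 2*M - 1 = 7 -> [1 2 3 4 3 2 1]

odd = c[0::2]                                              # length M     -> [1 3 3 1] (tapered, lower SLL)
even = c[1::2]                                             # length M - 1 -> [2 4 2]
middle = c[(len(c) - M) // 2:(len(c) - M) // 2 + M]        # central M entries -> [2 3 4 3]

print(c, odd, even, middle)
```

The tapered odd/even sets behave like amplitude-weighted apertures, which is why they lower the SLL at the cost of a wider HPBW until the GA re-optimizes the spacing.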
Funding: This work was supported by the National Natural Science Foundation of China (62073155, 62002137, 62106088, 62206113), the High-End Foreign Expert Recruitment Plan (G2023144007L), and the Fundamental Research Funds for the Central Universities (JUSRP221028).
Abstract: Evolutionary algorithms (EAs) have been used in high utility itemset mining (HUIM) to address the problem of discovering high utility itemsets (HUIs) in the exponential search space. EAs have good running and mining performance, but they still require huge computational resources and may miss many HUIs. Owing to the good combination of EAs and graphics processing units (GPUs), we propose a parallel genetic algorithm (GA) based on the GPU platform for mining HUIs (PHUI-GA). The evolution steps with improvements are performed on the central processing unit (CPU), and the CPU-intensive steps are sent to the GPU to be evaluated with multi-threaded processors. Experiments show that the mining performance of PHUI-GA outperforms the existing EAs. When mining 90% of HUIs, PHUI-GA is up to 188 times better than the existing EAs and up to 36 times better than the CPU parallel approach.
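To clarify what a "high utility itemset" is, the short sketch below computes the utility of an itemset over a toy transaction database: the itemset's utility is summed only over transactions that contain all of its items. This is the standard HUIM utility definition in illustrative form, not the PHUI-GA evaluation kernel (which runs on the GPU).

```python
def itemset_utility(itemset, database):
    """Total utility of an itemset: for every transaction containing all of its
    items, add the utilities of those items in that transaction."""
    total = 0
    for transaction in database:              # transaction: {item: utility}
        if all(item in transaction for item in itemset):
            total += sum(transaction[item] for item in itemset)
    return total

# Toy database (utilities already combine quantity and unit profit, hypothetical values)
db = [{"a": 5, "b": 2, "c": 1}, {"a": 3, "c": 4}, {"b": 6}]
print(itemset_utility({"a", "c"}, db))        # (5 + 1) + (3 + 4) = 13
```

An itemset is "high utility" when this total meets a user-defined minimum utility threshold; a GA individual typically encodes which items belong to the candidate itemset.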
Abstract: One of the most dangerous safety hazards in underground coal mines is roof falls during retreat mining. Roof falls may cause life-threatening and non-fatal injuries to miners and impede mining and transportation operations. As a result, a reliable roof fall prediction model is essential to tackle such challenges. The different parameters that substantially impact roof falls are ill-defined and intangible, making this an uncertain and challenging research issue. The National Institute for Occupational Safety and Health assembled a national database of roof performance from 37 coal mines to explore the factors contributing to roof falls. The data acquired for the 37 mines are limited due to several restrictions, which increased the likelihood of incompleteness. Fuzzy logic is a technique for coping with ambiguity, incompleteness, and uncertainty. Therefore, this paper presents a fuzzy inference method that employs a genetic algorithm to create fuzzy rules based on 109 records of roof fall data and pattern search to refine the membership functions of the parameters. The performance of the deployed model is evaluated using statistical measures such as the root mean square error, mean absolute error, and coefficient of determination (R²). Based on these criteria, the suggested model outperforms the existing models in precisely predicting roof fall rates using fewer fuzzy rules.
Funding: Supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. RS-2023-00218176) and the Soonchunhyang University Research Fund.
Abstract: This study proposes a hybridization of two efficient algorithms: the Multi-objective Ant Lion Optimizer (MOALO), a multi-objective enhanced version of the Ant Lion Optimizer (ALO), and the Genetic Algorithm (GA). The MOALO version has been employed to address problems containing many objectives, and an archive has been employed for retaining the non-dominated solutions. The uniqueness of the hybrid is that GA operators such as mutation and crossover are applied in the archive to update the solutions, and those solutions then go through the MOALO process. A hybrid of these algorithms is employed for the first time to solve multi-objective problems. The hybrid algorithm overcomes ALO's tendency to get caught in local optima and the greater computational effort GA requires to converge. To evaluate the hybridized algorithm's performance, a set of constrained and unconstrained test problems and engineering design problems were employed and compared with five well-known computational algorithms: MOALO, the Multi-objective Crystal Structure Algorithm (MOCryStAl), Multi-objective Particle Swarm Optimization (MOPSO), the Multi-objective Multiverse Optimization Algorithm (MOMVO), and the Multi-objective Salp Swarm Algorithm (MSSA). The outcomes of five performance metrics are statistically analyzed, and the most efficient Pareto-front comparison has been obtained. The proposed hybrid surpasses MOALO based on the results of hypervolume (HV), Spread, and Spacing, so the primary objective of developing this hybrid approach has been achieved successfully. The proposed approach demonstrates superior performance on the test functions, showcasing robust convergence and comprehensive coverage that surpass the other existing algorithms.
Funding: Provided through research grant No. 0035/2019/A1 from the Science and Technology Development Fund, Macao SAR, and the assistantship from the Faculty of Science and Technology, University of Macao.
Abstract: Surface wave inversion is a key step in the application of surface waves to soil velocity profiling. Currently, a common practice in inversion is that either the number of soil layers is assumed to be known before heuristic search algorithms are used to compute the shear wave velocity profile, or the number of soil layers is treated as an optimization variable. However, an improper selection of the number of layers may lead to an incorrect shear wave velocity profile. In this study, a hybrid deep learning and genetic algorithm learning procedure is proposed to perform surface wave inversion without the need to assume the number of soil layers. First, a deep neural network is adapted to learn from a large number of synthetic dispersion curves in order to infer the layer number. Then, the shear-wave velocity profile is determined by a genetic algorithm with the known layer number. By applying this procedure to both simulated and real-world cases, the results indicate that the proposed method is reliable and efficient for surface wave inversion.
Funding: Funded by the Shanghai Natural Science Foundation (No. 12ZR1414700).
Abstract: Magnetic field design is essential for the operation of Hall thrusters. This study focuses on utilizing a genetic algorithm to optimize the magnetic field configuration of SPT70. A 2D hybrid PIC-DSMC and channel-wall erosion model are employed to analyze the plume divergence angle and wall erosion rate, while a Faraday probe measurement and laser profilometry system are set up to verify the simulation results. The results demonstrate that the genetic algorithm contributes to reducing the divergence angle of the thruster plumes and alleviating the impact of high-energy particles on the discharge channel wall, reducing the erosion by 5.5% and 2.7%, respectively. Further analysis indicates that the change from a divergent magnetic field to a convergent magnetic field, combined with the upstream shift of the ionization region, contributes to improving the operation of the Hall thruster.
Funding: The Deanship of Scientific Research at Imam Mohammad Ibn Saud Islamic University (IMSIU) (Grant Number IMSIU-RP23030).
Abstract: Genetic algorithms (GAs) are very good metaheuristic algorithms that are suitable for solving NP-hard combinatorial optimization problems. A simple GA begins with a set of solutions represented by a population of chromosomes and then uses the idea of survival of the fittest in the selection process to select some fitter chromosomes. It uses a crossover operator to create better offspring chromosomes and thus converges the population. It also uses a mutation operator to explore the areas left unexplored by the crossover operator and thus diversifies the GA search space. The combination of crossover and mutation operators makes the GA search strong enough to reach the optimal solution, so an appropriate selection and combination of the crossover and mutation operators can lead to a very good GA for solving an optimization problem. In this paper, we study the benchmark traveling salesman problem (TSP). We developed several genetic algorithms using seven crossover operators and six mutation operators for the TSP and then compared them on some benchmark TSPLIB instances. The experimental studies show the effectiveness of the combination of a comprehensive sequential constructive crossover operator and the insertion mutation operator for the problem. The GA using the comprehensive sequential constructive crossover with insertion mutation could find average solutions whose average percentage of excess over the best-known solutions is between 0.22 and 14.94 for the experimented problem instances.
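The insertion mutation named above is simple enough to sketch directly: remove one city from the tour and reinsert it at a random position. The snippet below shows that operator together with a tour-length evaluation; the comprehensive sequential constructive crossover itself is more involved and is not reproduced here, and the distance values in the usage line are hypothetical.

```python
import random

def tour_length(tour, dist):
    """Total length of a closed TSP tour given a distance matrix."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def insertion_mutation(tour):
    """Insertion mutation: remove one city and reinsert it at a random position."""
    t = tour[:]
    i = random.randrange(len(t))
    city = t.pop(i)
    t.insert(random.randrange(len(t) + 1), city)
    return t

# Toy usage with a 4-city symmetric distance matrix (hypothetical values)
dist = [[0, 2, 9, 10], [2, 0, 6, 4], [9, 6, 0, 8], [10, 4, 8, 0]]
print(tour_length([0, 1, 3, 2], dist), insertion_mutation([0, 1, 3, 2]))
```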
Funding: In part supported by the Key Research and Development Project of Hubei Province (Nos. 2020BAB1141, 2023BAB094), the Key Project of the Science and Technology Research Program of Hubei Educational Committee (No. D20211402), the Teaching Research Project of Hubei University of Technology (No. XIAO2018001), and the Project of Xiangyang Industrial Research Institute of Hubei University of Technology (No. XYYJ2022C04).
Abstract: The job shop scheduling problem is a classical combinatorial optimization challenge frequently encountered in manufacturing systems. It involves determining the optimal execution sequences for a set of jobs on various machines to maximize production efficiency and meet multiple objectives. The Non-dominated Sorting Genetic Algorithm III (NSGA-III) is an effective approach for solving the multi-objective job shop scheduling problem. Nevertheless, it has some limitations in solving scheduling problems, including inadequate global search capability, susceptibility to premature convergence, and challenges in balancing convergence and diversity. To enhance its performance, this paper introduces a strengthened dominance relation NSGA-III algorithm based on differential evolution (NSGA-III-SD). By incorporating constrained differential evolution and simulated binary crossover genetic operators, this algorithm effectively improves NSGA-III's global search capability while mitigating premature convergence issues. Furthermore, it introduces a reinforced dominance relation to address the trade-off between convergence and diversity in NSGA-III. Additionally, effective encoding and decoding methods for discrete job shop scheduling are proposed, which improve the overall performance of the algorithm without complex computation. To validate the algorithm's effectiveness, NSGA-III-SD is extensively compared with other advanced multi-objective optimization algorithms on 20 job shop scheduling test instances. The experimental results demonstrate that NSGA-III-SD achieves better solution quality and diversity, proving its effectiveness in solving the multi-objective job shop scheduling problem.
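The abstract does not detail its encoding and decoding scheme, so as background the sketch below shows the widely used operation-based representation for job shop chromosomes: a permutation with repetition of job indices, decoded greedily into a schedule whose makespan can serve as one objective. This is illustrative only and is not claimed to be the NSGA-III-SD scheme.

```python
def decode_makespan(chromosome, jobs):
    """Decode an operation-based chromosome into a schedule and return its makespan.

    chromosome: list of job indices; each job index appears once per operation
                (e.g. [0, 1, 1, 0] -> J0 op1, J1 op1, J1 op2, J0 op2)
    jobs:       jobs[j] = list of (machine, processing_time) in technological order
    """
    next_op = [0] * len(jobs)        # next operation index for each job
    job_ready = [0] * len(jobs)      # completion time of each job's last scheduled op
    machine_ready = {}               # completion time of the last op on each machine
    for j in chromosome:
        machine, p = jobs[j][next_op[j]]
        start = max(job_ready[j], machine_ready.get(machine, 0))
        job_ready[j] = machine_ready[machine] = start + p
        next_op[j] += 1
    return max(job_ready)

# Example: two jobs on two machines (hypothetical processing times)
jobs = [[(0, 3), (1, 2)], [(1, 2), (0, 4)]]
print(decode_makespan([0, 1, 1, 0], jobs))   # makespan of this operation order -> 7
```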
Abstract: Accurate prediction of the movement trajectory of sea surface targets holds significant importance in achieving an advantageous position on the sea battlefield. Accurately predicting the trajectory of sea surface targets using AIS (Automatic Identification System) information is crucial for security defense and confrontation and is essential for effective deployment of military strategy. In response to the problem of insufficient accuracy in ship trajectory prediction, this study proposes a hybrid genetic algorithm to optimize the Long Short-Term Memory (LSTM) algorithm, yielding the HGA-LSTM algorithm for ship trajectory prediction. It converges faster and obtains better parameter solutions, thereby improving the effectiveness of ship trajectory prediction. Compared with traditional LSTM and GA-LSTM algorithms, experimental results demonstrate that this algorithm outperforms them in both single-step and multi-step prediction.
Funding: The Deanship of Scientific Research, Imam Mohammad Ibn Saud Islamic University (IMSIU), Saudi Arabia, for funding this research work through Grant No. (221412020).
Abstract: The generalized travelling salesman problem (GTSP), a generalization of the well-known travelling salesman problem (TSP), is considered in our study. Since the GTSP is NP-hard and very complex, finding exact solutions is highly expensive, so we develop genetic algorithms (GAs) to obtain heuristic solutions to the problem. Because crossover is a very important process in GAs, the crossover methods proposed for the traditional TSP can be adapted for the GTSP. The sequential constructive crossover (SCX) and three other operators are adapted for use in GAs to solve the GTSP. The effectiveness of the GA using SCX is first verified on some GTSP Library (GTSPLIB) instances and then compared against GAs using the other crossover methods. The computational results show the success of the GA using SCX for this problem. Our proposed GA using SCX and swap mutation could find average solutions whose average percentage of excess over the best-known solutions is between 0.00 and 14.07 for the investigated instances.
Funding: The National Natural Science Foundation of China (Grant Number: 81970631 to W.L.).
Abstract: Background: Diabetic nephropathy (DN) is the most common complication of type 2 diabetes mellitus and the main cause of end-stage renal disease worldwide. Diagnostic biomarkers may allow early diagnosis and treatment of DN to reduce its prevalence and delay its development. Kidney biopsy is the gold standard for diagnosing DN; however, its invasive character is its primary limitation. The machine learning approach provides a non-invasive and specific criterion for diagnosing DN, although traditional machine learning algorithms need to be improved to enhance diagnostic performance. Methods: We applied high-throughput RNA sequencing to obtain the genes related to DN tubular tissues and normal tubular tissues of mice. Then the machine learning algorithms random forest, LASSO logistic regression, and principal component analysis were used to identify key genes (CES1G, CYP4A14, NDUFA4, ABCC4, ACE). A genetic algorithm-optimized backpropagation neural network (GA-BPNN) was then used to improve the DN diagnostic model. Results: The AUC value of the GA-BPNN model in the training dataset was 0.83, and the AUC value of the model in the validation dataset was 0.81, while the AUC values of the SVM model in the training dataset and external validation dataset were 0.756 and 0.650, respectively. Thus, the GA-BPNN gave better values than the traditional SVM model. This diagnostic model may support personalized diagnosis and treatment of patients with DN. Immunohistochemical staining further confirmed that the tissue and cell expression of NADH dehydrogenase (ubiquinone) 1 alpha subcomplex, 4-like 2 (NDUFA4L2) in tubular tissue of DN mice was decreased. Conclusion: The GA-BPNN model has better accuracy than the traditional SVM model and may provide an effective tool for diagnosing DN.
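In a typical GA-BPNN, the GA encodes the network's flattened weight vector as a real-valued chromosome and searches for weights that maximize classification performance, after which backpropagation can fine-tune them. The sketch below illustrates that encoding for a tiny one-hidden-layer network over the five selected genes; the layer sizes, fitness definition, and workflow are assumptions for illustration, not the paper's reported architecture.

```python
import numpy as np

def mlp_forward(x, w, n_in=5, n_hidden=8):
    """Tiny one-hidden-layer network whose weights come from a GA chromosome.
    Chromosome length = n_in*n_hidden + n_hidden + n_hidden + 1 = 57 here."""
    i1 = n_in * n_hidden
    W1 = w[:i1].reshape(n_in, n_hidden)
    b1 = w[i1:i1 + n_hidden]
    W2 = w[i1 + n_hidden:i1 + 2 * n_hidden]
    b2 = w[-1]
    h = np.tanh(x @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))      # probability of DN

def fitness(chromosome, X, y):
    """GA fitness: classification accuracy of the decoded network on (X, y)."""
    p = mlp_forward(X, np.asarray(chromosome, dtype=float))
    return np.mean((p > 0.5) == y)
```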
Funding: Project supported by the National Natural Science Foundation of China (Grant Nos. 11974300, 11974299, and 12074150), the Natural Science Foundation of Hunan Province, China (Grant No. 2021JJ30645), the Scientific Research Fund of Hunan Provincial Education Department (Grant Nos. 20K127, 20A503, and 20B582), the Program for Changjiang Scholars and Innovative Research Team in University (Grant No. IRT13093), the Hunan Provincial Innovation Foundation for Postgraduate (Grant No. CX20220544), and the Youth Science and Technology Talent Project of Hunan Province, China (Grant No. 2022RC1197).
Abstract: Gamma-graphyne nanoribbons (γ-GYNRs) incorporating diamond-shaped segments (DSSs) with excellent thermoelectric properties are systematically investigated by combining nonequilibrium Green's functions with an adaptive genetic algorithm. Our calculations show that the adaptive genetic algorithm is efficient and accurate in identifying structures with excellent thermoelectric performance. Over multiple rounds, an average of 476 candidates (only 2.88% of all 16512 candidate structures) are calculated to obtain the structures with extremely high thermoelectric conversion efficiency. The room-temperature thermoelectric figure of merit (ZT) of the optimal γ-GYNR incorporating DSSs is 1.622, which is about 5.4 times higher than that of the pristine γ-GYNR (length 23.693 nm and width 2.660 nm). The significant improvement in the thermoelectric performance of the optimal γ-GYNR is mainly attributed to the maximum balance between the inhibition of thermal conductance (proactive effect) and the reduction of the thermal power factor (side effect). Moreover, through exploration of the main variables affecting the genetic algorithm, it is revealed that the efficiency of the genetic algorithm can be improved by optimizing the initial population gene pool, selecting a higher individual retention rate, and using a lower mutation rate. The results presented in this paper validate the effectiveness of the genetic algorithm in accelerating the exploration of γ-GYNRs with high thermoelectric conversion efficiency and could provide a new development solution for carbon-based thermoelectric materials.
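For reference, the figure of merit quoted above is conventionally defined as follows in Green's-function thermoelectric studies (the abstract itself does not spell the formula out, so this is the standard form rather than a quotation from the paper):

```latex
ZT = \frac{S^{2} G T}{\kappa_{e} + \kappa_{\mathrm{ph}}}
```

Here S is the Seebeck coefficient, G the electronic conductance, T the absolute temperature, and κ_e and κ_ph the electronic and phononic thermal conductances; S²G is the power factor, so maximizing ZT means balancing a suppressed κ_ph against the accompanying loss in power factor, exactly the trade-off the abstract describes.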
Abstract: Many search-based algorithms have been successfully applied in several software engineering activities. Genetic algorithms (GAs) are the most used by scholars in the scientific domains to solve software testing problems. They imitate the theory of natural selection and evolution. The harmony search algorithm (HSA) is one of the most recent search algorithms of the last years. It imitates the behavior of a musician finding the best harmony. Scholars have estimated the similarities and differences between genetic algorithms and the harmony search algorithm in diverse research domains. The test data generation process represents a critical task in software validation. Unfortunately, there is no work comparing the performance of genetic algorithms and the harmony search algorithm in the test data generation process. This paper studies the similarities and differences between genetic algorithms and the harmony search algorithm based on the ability and speed of finding the required test data. The current research performs an empirical comparison of the HSA and the GAs, and the significance of the results is then estimated using the t-test. The study investigates the efficiency of the harmony search algorithm and the genetic algorithms according to (1) the time performance, (2) the significance of the generated test data, and (3) the adequacy of the generated test data to satisfy a given testing criterion. The results showed that the harmony search algorithm is significantly faster than the genetic algorithms, because the t-test showed that the p-value of the time values is 0.026 < α (where α is the significance level = 0.05 at a 95% confidence level). In contrast, there is no significant difference between the two algorithms in generating adequate test data, because the t-test showed that the p-value of the fitness values is 0.25 > α.
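The significance test described above is a two-sample t-test on the per-run measurements of each algorithm. A minimal sketch of that comparison is shown below; the timing samples are hypothetical placeholders, not the study's measurements.

```python
from scipy import stats

# Hypothetical run-time samples (seconds) for the two algorithms.
hsa_times = [1.2, 1.4, 1.1, 1.3, 1.5]
ga_times = [1.9, 2.1, 1.8, 2.3, 2.0]

t_stat, p_value = stats.ttest_ind(hsa_times, ga_times, equal_var=False)
alpha = 0.05
print(p_value < alpha)   # True -> the run-time difference is significant at the 95% level
```

The same procedure applied to the fitness values of the generated test data yields the second comparison, where a p-value above α indicates no significant difference in adequacy.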
Abstract: With the widespread use of the internet, there is an increasing need to ensure the security and privacy of transmitted data. This has led to an intensified focus on the study of video steganography, a technique that hides data within a video cover to avoid detection. The effectiveness of any steganography method depends on its ability to embed data without altering the original video's quality while maintaining high efficiency. This paper proposes a new approach to video steganography that utilizes a genetic algorithm (GA) to identify the region of interest (ROI) in the cover video, i.e., the area of the video most suitable for data embedding. The secret data is encrypted using the Advanced Encryption Standard (AES), a widely accepted encryption standard, before being embedded into the cover video, utilizing up to 10% of the cover video. This process ensures the security and confidentiality of the embedded data. The performance metrics for assessing the proposed method are the peak signal-to-noise ratio (PSNR) and the encoding and decoding time. The results show that the proposed method has a high embedding capacity and efficiency, with a PSNR ranging between 64 and 75 dB, which indicates that the embedded data is almost indistinguishable from the original video. Additionally, the method can encode and decode data quickly, making it efficient for real-time applications.
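The quality metric quoted above, PSNR, has a standard definition that the short sketch below reproduces for a single frame; higher values (such as the 64-75 dB reported) mean the stego frame is closer to the original.

```python
import numpy as np

def psnr(original, stego, peak=255.0):
    """Peak signal-to-noise ratio (dB) between an original and a stego frame."""
    original = np.asarray(original, dtype=np.float64)
    stego = np.asarray(stego, dtype=np.float64)
    mse = np.mean((original - stego) ** 2)
    if mse == 0:
        return float("inf")          # identical frames
    return 10.0 * np.log10(peak ** 2 / mse)
```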
Funding: Supported by the Key R&D Project of Zhejiang Province (2018C01005), http://kjt.zj.gov.cn/.
Abstract: Numerous clothing enterprises in the market have relatively low efficiency in assembly line planning due to insufficient optimization of bottleneck stations. As a result, the production efficiency of the enterprise is not high, and the production organization does not meet expectations. Aiming at the problem of flexible process route planning in garment workshops, a multi-objective genetic algorithm (MOGA) is proposed to solve the assembly line balance optimization problem and minimize the machine adjustment path. The encoding method adopts an object-oriented path representation, and the initial population is generated by random topological sorting based on an in-degree selection mechanism. The multi-objective genetic algorithm improves the mutation and crossover operations according to the characteristics of the clothing process to avoid the generation of invalid offspring. In the iterative process, the bottleneck station is optimized by reasonable process splitting, and process allocation conforms to the strict limit of the station on the number of machines in order to improve the compilation efficiency. The effectiveness and feasibility of the multi-objective genetic algorithm are proven by the analysis of clothing cases. Compared with the manual allocation process, the compilation efficiency of MOGA is increased by more than 15%, and it completes the optimization of the minimum machine adjustment path. The results are in line with the expected optimization effect.