Abstract
The convergence of a genetic algorithm is largely determined by its core crossover operation. When the objective function is multimodal (has multiple humps), traditional genetic algorithms are easily trapped in local optima, a phenomenon known as premature convergence. In this paper, we propose a new genetic algorithm with an improved arithmetic crossover operation based on the gradient method. This crossover generates offspring along a quasi-gradient direction, i.e., the steepest descent direction of the objective function. The selection operator is also simplified: every individual in the population is given an opportunity to evolve, avoiding a complicated selection algorithm. An adaptive mutation operator and an elitist strategy are also applied. The results of Case 4 indicate that this algorithm converges to the global optimum faster and is more stable than conventional genetic algorithms.
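To make the idea of crossover along a quasi-gradient direction concrete, here is a minimal sketch in Python. It assumes a common formulation of gradient-guided arithmetic crossover for minimization (offspring are pushed from the worse parent toward, and slightly past, the better one); the function name quasi_gradient_crossover, the step parameter, and the Rastrigin test function are illustrative assumptions, not the paper's exact operator.

```python
import numpy as np

def quasi_gradient_crossover(p1, p2, f, step=1.2, rng=None):
    """Sketch of an arithmetic crossover biased along a quasi-gradient
    (descent) direction estimated from the two parents' objective values.
    Assumes a minimization problem; not the paper's exact formulation."""
    rng = rng or np.random.default_rng()
    f1, f2 = f(p1), f(p2)
    # Order parents so `better` has the lower objective value.
    better, worse = (p1, p2) if f1 <= f2 else (p2, p1)
    direction = better - worse            # approximate descent direction
    lam = rng.uniform(0.0, step)          # random step length along it
    child1 = worse + lam * direction      # offspring pushed "downhill"
    child2 = 0.5 * (p1 + p2)              # plain arithmetic-mean offspring
    return child1, child2

# Tiny usage example on a multimodal (multi-hump) test function: Rastrigin.
def rastrigin(x):
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

rng = np.random.default_rng(0)
a = rng.uniform(-5.12, 5.12, 2)
b = rng.uniform(-5.12, 5.12, 2)
c1, c2 = quasi_gradient_crossover(a, b, rastrigin, rng=rng)
```

Allowing the step length to exceed 1 lets an offspring extrapolate beyond the better parent, which is one simple way to approximate a move along the steepest descent direction without computing an explicit gradient.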