
Improved particle swarm optimization algorithm using mean information and elitist mutation (Cited by: 4)
Abstract: To address the premature convergence and low late-stage search efficiency of the basic Particle Swarm Optimization (PSO) algorithm, an improved PSO based on population mean information and elitist mutation, named MEPSO, is proposed. MEPSO introduces the average information of individual particles and of the whole swarm to strengthen global search, and adopts a Time-Varying Acceleration Coefficient (TVAC) strategy to balance local and global search. In the later stage of the iteration, an elitist learning strategy applies a Cauchy mutation to the elite (global best) particle, further improving global search ability and reducing the risk of becoming trapped in a local optimum. Comparative experiments on six typical complex benchmark functions against basic PSO (BPSO), PSO with TVAC (PSO-TVAC), PSO with a Time-Varying Inertia Weight (PSO-TVIW), and Hybrid PSO with Wavelet Mutation (HPSOWM) show that MEPSO achieves better mean values and standard deviations than the compared algorithms, with the shortest optimization time and better reliability. The results indicate that MEPSO balances local and global search well, converges quickly, and attains high convergence accuracy and search efficiency.
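Below is a minimal, illustrative Python sketch of how the three ingredients described in the abstract (mean-information attraction, TVAC, and late-stage Cauchy mutation of the elite particle) could fit together in a PSO loop. It is not the authors' implementation: the exact velocity-update form, the coefficient schedules, the weight on the mean-attraction term, the 50% switch point for the mutation, and the names (mepso, sphere) are assumptions made for illustration only.

# A minimal MEPSO-style sketch, assuming:
#  - the "mean information" enters the velocity update as an extra attraction term
#    toward the average of all personal bests (exact form is an assumption),
#  - TVAC moves c1 linearly from 2.5 to 0.5 and c2 from 0.5 to 2.5 (a common setting),
#  - in the last half of the iterations the global best is perturbed with a Cauchy
#    step and the mutant replaces it only if it improves the objective.
import numpy as np

def sphere(x):
    """Stand-in benchmark objective: f(x) = sum(x_i^2), minimum 0 at the origin."""
    return float(np.sum(x * x))

def mepso(f, dim=30, n_particles=40, n_iter=1000, lo=-100.0, hi=100.0, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, (n_particles, dim))          # positions
    v = np.zeros((n_particles, dim))                     # velocities
    pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
    g = int(np.argmin(pbest_val))
    gbest, gbest_val = pbest[g].copy(), pbest_val[g]

    for t in range(n_iter):
        frac = t / (n_iter - 1)
        # Time-varying acceleration coefficients (TVAC) and inertia weight.
        c1 = 2.5 - 2.0 * frac                            # cognitive: 2.5 -> 0.5
        c2 = 0.5 + 2.0 * frac                            # social:    0.5 -> 2.5
        w = 0.9 - 0.5 * frac                             # inertia:   0.9 -> 0.4
        mean_pbest = pbest.mean(axis=0)                  # swarm "mean information"

        r1, r2, r3 = rng.random((3, n_particles, dim))
        v = (w * v
             + c1 * r1 * (pbest - x)
             + c2 * r2 * (gbest - x)
             + 0.5 * r3 * (mean_pbest - x))              # mean-attraction term (assumed weight 0.5)
        x = np.clip(x + v, lo, hi)

        vals = np.array([f(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = int(np.argmin(pbest_val))
        if pbest_val[g] < gbest_val:
            gbest, gbest_val = pbest[g].copy(), pbest_val[g]

        # Elitist Cauchy mutation in the later stage of the search (assumed: last half).
        if frac > 0.5:
            scale = 1.0 - frac                           # shrink the mutation as iterations proceed
            mutant = np.clip(gbest + scale * rng.standard_cauchy(dim), lo, hi)
            mv = f(mutant)
            if mv < gbest_val:
                gbest, gbest_val = mutant, mv

    return gbest, gbest_val

if __name__ == "__main__":
    best, best_val = mepso(sphere)
    print(f"best value found: {best_val:.3e}")

The sphere function stands in for one of the six benchmark functions mentioned in the abstract; the heavy-tailed Cauchy step gives the elite particle occasional long jumps, which is the mechanism the abstract credits with reducing the risk of stagnation in a local optimum.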
Source: Journal of Computer Applications (《计算机应用》, CSCD, Peking University Core Journal), 2014, Issue 11, pp. 3241-3244, 3249 (5 pages).
Funding: National Natural Science Foundation of China (61174140); China Postdoctoral Science Foundation (2013M540628); Natural Science Foundation of Hunan Province (14JJ3107).
Keywords: Particle Swarm Optimization (PSO); mean search; Cauchy mutation; Time-Varying Acceleration Coefficient (TVAC); global search
