Abstract
Differential Evolution (DE) is simple, efficient, and robust. However, when solving large-scale optimization problems, its performance deteriorates rapidly as the dimensionality of the search space increases. To overcome this problem, a distributed Differential Evolution algorithm based on the MapReduce programming model is proposed. First, an improved elite learning strategy and an island model are adopted to improve convergence accuracy. Second, the distributed Differential Evolution algorithm is constructed with the MapReduce programming model and deployed on a Hadoop cluster. The proposed algorithm is evaluated on 13 benchmark functions, and the experimental results show that it achieves high solution accuracy with good speedup and scalability, making it an effective method for solving large-scale optimization problems.
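The abstract does not reproduce the distributed variant itself, but it builds on the classic serial DE/rand/1/bin baseline. As background, that baseline can be sketched as follows; this is a minimal illustration, with function names, parameter values, and the sphere test function chosen here for demonstration rather than taken from the paper:

```python
import numpy as np

def de_rand_1(fitness, bounds, pop_size=50, F=0.5, CR=0.9, generations=200, seed=0):
    """Minimal DE/rand/1/bin sketch (serial baseline, not the paper's distributed variant)."""
    rng = np.random.default_rng(seed)
    dim = len(bounds)
    low, high = np.array(bounds).T
    pop = rng.uniform(low, high, (pop_size, dim))
    fit = np.array([fitness(x) for x in pop])
    for _ in range(generations):
        for i in range(pop_size):
            # mutation: pick three distinct individuals, all different from i
            r1, r2, r3 = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
            mutant = np.clip(pop[r1] + F * (pop[r2] - pop[r3]), low, high)
            # binomial crossover, forcing at least one gene from the mutant
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            # greedy selection: keep the trial only if it is no worse
            f_trial = fitness(trial)
            if f_trial <= fit[i]:
                pop[i], fit[i] = trial, f_trial
    best = int(np.argmin(fit))
    return pop[best], float(fit[best])

# Usage: minimize the 10-dimensional sphere function on [-5, 5]^10.
best_x, best_f = de_rand_1(lambda x: float(np.sum(x ** 2)), [(-5.0, 5.0)] * 10)
```

In the paper's MapReduce formulation, the inner per-individual loop is the part distributed across mappers, with the island model partitioning the population into subpopulations that periodically exchange elites.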
Source
《小型微型计算机系统》 (Journal of Chinese Computer Systems)
CSCD
Peking University Core Journal
2016, No. 12, pp. 2695-2701 (7 pages)
Funding
National Natural Science Foundation of China (61364025)
Open Fund of the State Key Laboratory of Software Engineering, Wuhan University (SKLSE2012-09-39)
Science and Technology Project of the Jiangxi Provincial Department of Education (GJJ13729, GJJ14742)
Research Projects of Jiujiang University (2013KJ27, 2014KJYB034, 2015LGYB29)
Keywords
large scale optimization
distributed differential evolution
island model
elite learning