Abstract
Appropriate hyperparameter settings are a critical factor in the performance of a deep learning model. An effective and efficient hyperparameter optimization algorithm can improve model quality, speed up hyperparameter search and tuning, and lower the barrier to applying deep learning. A representative class of such algorithms is Bayesian optimization (BOA); surrogate-model-based global optimization algorithms of this kind are, in theory, more efficient than simple approaches such as random search and grid search. This article proposes building a deterministic surrogate model of the hyperparameter space with an extreme learning machine (ELM) and improving the stochastic response surface method, yielding SurroOpt1, a hyperparameter optimization algorithm for deep learning models. Experiments show that, on hyperparameter optimization tasks for deep convolutional networks, the proposed algorithm achieves better model optimization than two state-of-the-art algorithms, Bayesian optimization and the tree-structured Parzen estimator (TPE), given the same number of objective function evaluations.
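The abstract describes the general pattern of surrogate-model-based hyperparameter optimization: fit a cheap surrogate of the expensive objective, then spend real evaluations only where the surrogate looks promising. The following is a minimal illustrative sketch of that pattern, not the paper's SurroOpt1 algorithm: it uses a basic ELM-style regressor (random hidden layer, closed-form output weights) as the surrogate and plain random candidate sampling in place of the paper's improved stochastic response surface method; the toy `objective` stands in for an expensive model-training run.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_fit(X, y, n_hidden=50):
    """Fit a basic extreme learning machine: random, fixed hidden-layer
    weights and closed-form least-squares output weights."""
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                      # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return W, b, beta

def elm_predict(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

def objective(x):
    # Toy stand-in for an expensive training run: returns a
    # "validation loss" for the hyperparameter vector x.
    return np.sum((x - 0.3) ** 2)

# Surrogate-guided loop: evaluate a few initial points, fit the surrogate,
# then at each step propose many random candidates but evaluate the true
# objective only at the candidate the surrogate predicts to be best.
dim = 2
X = rng.uniform(0, 1, size=(8, dim))            # initial design
y = np.array([objective(x) for x in X])
for _ in range(20):
    model = elm_fit(X, y)
    cand = rng.uniform(0, 1, size=(200, dim))   # cheap candidate pool
    best = cand[np.argmin(elm_predict(model, cand))]
    X = np.vstack([X, best])                    # one expensive evaluation
    y = np.append(y, objective(best))
```

After the loop, `y.min()` is the best validation loss found within 28 true objective evaluations; the point of the scheme is that the 200-point candidate screening per step costs only surrogate predictions, never real training runs.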
Authors
Sun Yongze
Lu Zhonghua
Sun Yongze; Lu Zhonghua (Computer Network Information Center, Chinese Academy of Sciences, Beijing 100190; University of Chinese Academy of Sciences, Beijing 100049)
Source
《高技术通讯》
EI
CAS
Peking University Core Journal (北大核心)
2019, No. 12, pp. 1165-1174 (10 pages)
Chinese High Technology Letters
Funding
Supported by the National Natural Science Foundation of China (Grant No. 61873254)