
Study on Deep Learning Automatic Scheduling Optimization Based on Feature Importance

(Original title: 基于特征重要性的深度学习自动调度优化研究)
Abstract: With the rapid development of deep learning and hardware architectures, the diversity of models and hardware makes high-performance deployment of deep learning models through manual optimization increasingly challenging, so current AI compiler frameworks typically adopt automatic scheduling to carry out this process. However, the existing TVM automatic scheduling method suffers from unbalanced data sets in its cost model and excessively long scheduling time. To address these issues, an automatic scheduling optimization method based on feature importance is proposed. First, feature importance is analyzed with the xgboost algorithm; then, based on the importance coefficients, the feature dimensions are reduced and the data label values are reassigned, so as to improve the accuracy of the cost model and the efficiency of automatic scheduling. Experimental results show that the proposed optimization method shortens the automatic scheduling time of three deep learning models by 9.7%~17.2% and reduces inference time by up to 15%.
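The two data-side steps the abstract describes (dropping low-importance features, then rebalancing skewed label values) can be sketched roughly as below. This is an illustrative reconstruction, not the authors' code: the importance scores, the threshold, and the equal-frequency binning rule are all assumptions for demonstration. In the paper, the importances come from xgboost trained on TVM cost-model features.

```python
# Illustrative sketch of the two steps in the abstract:
# (1) keep only features whose importance coefficient exceeds a threshold,
# (2) reassign label values into balanced (equal-frequency) bins.
# All numbers below are made-up placeholders, not values from the paper.

def select_features(samples, importances, threshold):
    """Keep only the feature columns whose importance exceeds threshold."""
    keep = [i for i, w in enumerate(importances) if w > threshold]
    return [[row[i] for i in keep] for row in samples], keep

def reassign_labels(labels, n_bins=4):
    """Rebalance skewed labels by mapping them to equal-frequency bins."""
    order = sorted(range(len(labels)), key=lambda i: labels[i])
    binned = [0] * len(labels)
    for rank, i in enumerate(order):
        binned[i] = rank * n_bins // len(labels)  # bin index 0..n_bins-1
    return binned

# Toy data: 4 samples x 5 features, with hypothetical importance scores.
X = [[1, 2, 3, 4, 5],
     [5, 4, 3, 2, 1],
     [2, 2, 2, 2, 2],
     [9, 0, 1, 0, 3]]
importances = [0.40, 0.02, 0.35, 0.01, 0.22]

X_reduced, kept = select_features(X, importances, threshold=0.05)
y_rebalanced = reassign_labels([0.1, 0.9, 0.15, 7.0], n_bins=2)
```

The point of the second step is that a cost model trained on heavily skewed schedule-performance labels sees very few examples of the best schedules; equal-frequency reassignment is one simple way to counter that imbalance.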
Authors: YANG Heng, LIU Qinrang, FAN Wang, PEI Xue, WEI Shuai, WANG Xuan (College of Cyberspace Security, Zhengzhou University, Zhengzhou 450003, China; Institute of Information Technology, Information Engineering University, Zhengzhou 450002, China)
Published in: Computer Science (《计算机科学》), CSCD / Peking University Core Journal, 2024, No. 7, pp. 22-28 (7 pages)
Funding: National Key R&D Program of China (2022YFB4401401); Songshan Laboratory Project, administered under the Henan Province Major Science and Technology Program (221100211100-01)
Keywords: AI compiler; automatic scheduling; xgboost; feature importance; deep learning