Survey of research on application of heuristic algorithm in machine learning
(智能启发算法在机器学习中的应用研究综述)
Cited by: 12
Abstract: Aiming at the problems existing in applications of machine learning algorithms, an optimization framework for machine learning models based on intelligent heuristic algorithms was constructed. Firstly, the existing types of heuristic algorithms and their modeling processes were introduced. Then, the advantages of heuristic algorithms were illustrated through their applications in machine learning, including parameter and structure optimization of neural networks and other machine learning algorithms, feature optimization, ensemble pruning, prototype optimization, weighted voting ensembles, and kernel function learning. Finally, future directions for heuristic algorithms and their use in machine learning were discussed in light of practical needs.
Authors: 沈焱萍 (SHEN Yanping); 郑康锋 (ZHENG Kangfeng); 伍淳华 (WU Chunhua); 杨义先 (YANG Yixian). Affiliations: School of Cyberspace Security, Beijing University of Posts and Telecommunications, Beijing 100876, China; School of Information Engineering, Institute of Disaster Prevention, Langfang 065201, China.
Source: Journal on Communications (《通信学报》; indexed in EI, CSCD, Peking University core journal list), 2019, No. 12, pp. 124-137 (14 pages).
Funding: National Natural Science Foundation of China (No. 61602052); National Key R&D Program of China (No. 2017YFB0802803).
Keywords: parameter and structure optimization; feature optimization; ensemble pruning; prototype optimization; weighted voting ensemble; kernel function learning
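The abstract lists feature optimization as one application of heuristic algorithms in machine learning. As an illustration only, not taken from the paper, the following minimal Python sketch uses a simple genetic algorithm to search for a feature subset that maximizes cross-validated k-NN accuracy; the synthetic data, the choice of classifier, and all GA settings are assumptions made for demonstration.

```python
# Minimal sketch (assumed setup): genetic-algorithm feature selection for a k-NN classifier.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=20, n_informative=5,
                           n_redundant=5, random_state=0)

def fitness(mask):
    """Cross-validated accuracy of k-NN on the selected feature subset."""
    if mask.sum() == 0:
        return 0.0
    clf = KNeighborsClassifier(n_neighbors=5)
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

pop_size, n_gen, n_feat = 20, 30, X.shape[1]
pop = rng.integers(0, 2, size=(pop_size, n_feat))  # population of random binary feature masks

for gen in range(n_gen):
    scores = np.array([fitness(ind) for ind in pop])
    # Tournament selection: keep the better of two randomly chosen individuals.
    parents = np.array([pop[max(rng.integers(0, pop_size, 2), key=lambda i: scores[i])]
                        for _ in range(pop_size)])
    # One-point crossover between consecutive parent pairs.
    children = parents.copy()
    for i in range(0, pop_size - 1, 2):
        cut = rng.integers(1, n_feat)
        children[i, cut:], children[i + 1, cut:] = (parents[i + 1, cut:].copy(),
                                                    parents[i, cut:].copy())
    # Bit-flip mutation with small probability, plus elitism for the previous best mask.
    flip = rng.random(children.shape) < 0.05
    children = np.where(flip, 1 - children, children)
    children[0] = pop[scores.argmax()]
    pop = children

scores = np.array([fitness(ind) for ind in pop])
best = pop[scores.argmax()]
print("selected features:", np.flatnonzero(best), "cv accuracy:", scores.max())
```

The binary-mask encoding, tournament selection, one-point crossover, and bit-flip mutation shown here are generic building blocks; the heuristic feature-optimization methods covered by the survey vary these components and the fitness function.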