Research on Algorithm of Image Classification Model Selection for UAV Patrol
Abstract: The random forest is a classic classification algorithm that is widely used and achieves high classification accuracy. During classification, however, the performance of each individual decision tree and the pairwise diversity between decision trees are two important factors that determine the final result: when several decision trees make similar misclassifications, the final vote over the trees' outputs degrades the model's overall classification performance. To address this problem, this paper introduces the confusion matrix into the similarity measurement of decision trees. The method takes into account the number of different classes and the correct and incorrect classifications in order to identify decision trees with weak mutual similarity; decision trees with poor classification ability are then removed, and a set of strong classifiers is finally selected, completing the model selection of the random forest. Experimental results show that, on three datasets, the proposed method achieves a higher average classification accuracy than the original algorithm and is more stable.
Source: Computer Science and Application, 2020, Issue 9, pp. 1541-1548 (8 pages)
Keywords: Ensemble Classifier; Random Forest; Confusion Matrix
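
The abstract describes a tree-selection procedure for a trained random forest: evaluate each tree, measure similarity between trees via their confusion matrices, discard trees that are weak or that duplicate another tree's error pattern, and vote with the remaining trees. The following sketch illustrates one plausible reading of that procedure; it is not the authors' implementation. It uses scikit-learn and synthetic data, and the cosine similarity of off-diagonal confusion-matrix entries, the similarity threshold, and the accuracy cut-off are assumptions made purely for illustration.

# Illustrative sketch (not the paper's exact algorithm): prune a random forest by
# (1) per-tree accuracy on a validation set and (2) pairwise similarity of the
# trees' confusion matrices, keeping trees that are accurate and mutually dissimilar.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, confusion_matrix
from sklearn.model_selection import train_test_split

# Synthetic multi-class data standing in for the patrol-image feature vectors.
X, y = make_classification(n_samples=2000, n_features=20, n_informative=10,
                           n_classes=3, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
n_classes = len(forest.classes_)

# Per-tree predictions, accuracies and row-normalised confusion matrices on the
# validation split (normalising by class size accounts for the number of samples per class).
preds = [forest.classes_[tree.predict(X_val).astype(int)] for tree in forest.estimators_]
accs = np.array([accuracy_score(y_val, p) for p in preds])
cms = []
for p in preds:
    cm = confusion_matrix(y_val, p, labels=forest.classes_).astype(float)
    cms.append(cm / np.maximum(cm.sum(axis=1, keepdims=True), 1e-12))

def cm_similarity(a, b):
    # Similarity of two trees' error patterns: cosine similarity of the
    # off-diagonal (misclassification) entries. One plausible choice, not the paper's measure.
    mask = ~np.eye(n_classes, dtype=bool)
    ea, eb = a[mask], b[mask]
    denom = np.linalg.norm(ea) * np.linalg.norm(eb)
    return float(ea @ eb / denom) if denom > 0 else 0.0

# Greedy selection: visit trees from most to least accurate and keep a tree only if
# its error pattern is not too similar to any tree already kept.
SIM_THRESHOLD = 0.95                 # hypothetical threshold
MIN_ACC = accs.mean() - accs.std()   # drop clearly weak trees

selected = []
for i in np.argsort(-accs):
    if accs[i] < MIN_ACC:
        continue
    if all(cm_similarity(cms[i], cms[j]) < SIM_THRESHOLD for j in selected):
        selected.append(i)

# Majority vote over the selected sub-forest (labels here are 0..n_classes-1).
votes = np.stack([preds[i] for i in selected])
sub_pred = np.apply_along_axis(
    lambda col: np.bincount(col, minlength=n_classes).argmax(), axis=0, arr=votes)

print(f"full forest  : {accuracy_score(y_val, forest.predict(X_val)):.3f} "
      f"({len(forest.estimators_)} trees)")
print(f"pruned forest: {accuracy_score(y_val, sub_pred):.3f} ({len(selected)} trees)")

Whether the pruned ensemble actually improves accuracy depends on the data and on the chosen threshold and cut-off; the sketch only shows the overall shape of a confusion-matrix-based selection step.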
