Abstract
The naive Bayes nearest neighbor (NBNN) classification algorithm has the merits of avoiding feature quantization and using an image-to-class distance measure, but it suffers from slow running speed and low classification accuracy. To address these problems, a naive Bayes K-nearest neighbor classification algorithm is presented, in which the K nearest neighbors of each feature, searched with the fast library for approximate nearest neighbors (FLANN), are used for the classification decision and the influence of background information on classification performance is removed. To further improve the running speed and reduce the memory cost, feature selection is applied to reduce the number of features in the test image and in the training image set, and reducing both simultaneously is explored to balance the trade-off between classification accuracy and classification time. The algorithm retains the merits of the original NBNN algorithm and requires no parameter learning process. Experimental results verify the correctness and effectiveness of the algorithm.
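The image-to-class K-nearest-neighbor decision described above can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: it uses brute-force distance computation in place of FLANN, and the descriptor shapes, the `k` parameter, and the mean-of-K aggregation are assumptions for illustration.

```python
import numpy as np

def nbknn_classify(test_descs, class_descs, k=3):
    """Naive Bayes K-NN image-to-class decision (illustrative sketch).

    test_descs : (n, d) array of local descriptors from the test image.
    class_descs: dict {label: (m, d) array of that class's training descriptors}.

    For each class, every test descriptor contributes the mean squared
    distance to its k nearest training descriptors of that class; the
    image is assigned to the class with the smallest summed distance.
    """
    scores = {}
    for label, train in class_descs.items():
        # Pairwise squared Euclidean distances; brute force stands in
        # for the approximate FLANN search used in the paper.
        d2 = ((test_descs[:, None, :] - train[None, :, :]) ** 2).sum(axis=-1)
        knn = np.sort(d2, axis=1)[:, :k]        # k nearest per test descriptor
        scores[label] = knn.mean(axis=1).sum()  # image-to-class distance
    return min(scores, key=scores.get)
```

Using K neighbors instead of the single nearest neighbor, as in the original NBNN, makes the per-descriptor vote more robust to isolated background features.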
Source
Journal of Beijing University of Aeronautics and Astronautics (《北京航空航天大学学报》)
Indexed in: EI, CAS, CSCD, Peking University Core Journals
2015, Issue 2, pp. 302-310 (9 pages)
Funding
Supported by the National Natural Science Foundation of China (61172164)
Keywords
image classification
nearest neighbor
K-nearest neighbor
image-to-class distance
feature selection