
Parameter Selection Method for Kernel-Based Minimum Distance Classification (cited by: 2)

Parameter Selection Method Based on Kernel Nearest Neighbor Classification
Abstract: When a kernel-based minimum distance classifier is applied to a data set, the choice of the kernel parameter in the objective function directly affects the classifier's success rate. This paper proposes a method that uses the objective function itself to select an appropriate kernel parameter. Experimental results on typical classification data sets show that, compared with the plain kernel-based minimum distance classifier, selecting the optimal kernel parameter improves classification accuracy across various domains.
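The classifier described in the abstract assigns each point to the class whose mean is nearest in the kernel-induced feature space. A minimal sketch of that idea, assuming a Gaussian (RBF) kernel and using validation accuracy as a stand-in for the paper's objective function (the paper's actual selection criterion is not reproduced here; `select_sigma` and its grid are illustrative):

```python
import numpy as np

def rbf(X, Y, sigma):
    # Gaussian kernel matrix: K[i, j] = exp(-||x_i - y_j||^2 / (2 * sigma^2))
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def kmd_predict(Xtr, ytr, Xte, sigma):
    # Kernel minimum distance: squared feature-space distance from x to the
    # mean of class c expands, via the kernel trick, to
    #   K(x, x) - (2/n_c) * sum_i K(x, x_i) + (1/n_c^2) * sum_{i,j} K(x_i, x_j)
    classes = np.unique(ytr)
    dists = []
    for c in classes:
        Xc = Xtr[ytr == c]
        cross = rbf(Xte, Xc, sigma).mean(axis=1)    # (1/n_c) * sum_i K(x, x_i)
        within = rbf(Xc, Xc, sigma).mean()          # (1/n_c^2) * sum_{i,j} K(x_i, x_j)
        dists.append(1.0 - 2.0 * cross + within)    # K(x, x) = 1 for the RBF kernel
    return classes[np.argmin(np.vstack(dists), axis=0)]

def select_sigma(Xtr, ytr, Xval, yval, grid):
    # Hypothetical selection rule: pick the sigma with the best validation
    # accuracy (a stand-in for the objective-function criterion in the paper).
    accs = [(kmd_predict(Xtr, ytr, Xval, s) == yval).mean() for s in grid]
    return grid[int(np.argmax(accs))]
```

Because the class means live in feature space, they are never computed explicitly; only kernel evaluations against the training points of each class are needed, which is why the parameter `sigma` alone controls the classifier's behaviour.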
Source: Computer Engineering (《计算机工程》, CAS / CSCD / Peking University core journal), 2008, No. 5, pp. 188-190 (3 pages).
Funding: Shandong Province Science and Technology Key Research Program (2005GG4210002); Shandong Province Young Scientists Research Award Fund (2006BS01020).
Keywords: minimum distance classification; nearest neighbor classification; data sets; kernel function

References (6)

  • 1 Peng Jing, Heisterkamp D R. Adaptive Quasiconformal Kernel Nearest Neighbor Classification[J]. IEEE Trans. on Pattern Anal. Mach. Intell., 2004, 26(5): 656-661.
  • 2 Zhang Daoqing, Chen Songcan. Clustering Incomplete Data Using Kernel-based Fuzzy C-means Algorithm[J]. Neural Process. Lett., 2003, 18(3): 155-162.
  • 3 Wang Lei, Chan Kap Luk. Learning Kernel Parameters by Using Class-separability Measure[C]//Proc. of NIPS'02 Workshop on Kernel Machines. Whistler, Canada: [s. n.], 2002.
  • 4 Zhang Daoqiang, Chen Songcan, Zhou Zhihua. Learning the Kernel Parameters in Kernel Minimum Distance Classifier[J]. Pattern Recognition, 2006, 39(1): 133-135.
  • 5 Boser B, Guyon I, Vapnik V N. A Training Algorithm for Optimal Margin Classifiers[C]//Proc. of the 5th Annual ACM Workshop on Computational Learning Theory. New York: ACM Press, 1992: 144-152.
  • 6 Vapnik V N. The Nature of Statistical Learning Theory[M]. Berlin: Springer, 1995.

Co-cited References (16)

  • 1 Ren Jing, Li Chunping. An Improved Minimum Distance Classifier: Weighted Minimum Distance Classifier[J]. Journal of Computer Applications (《计算机应用》), 2005, 25(5): 992-994. (cited by: 30)
  • 2 Zhang Xiang, Xiao Xiaoling, Xu Guangyou. A New Method for Determining the Parameter of the Gaussian Kernel Model[J]. Computer Engineering (《计算机工程》), 2007, 33(12): 52-53. (cited by: 12)
  • 3 Wei Xiaozhang, Dou Zengfa. An Improved K-NN Algorithm Based on Information Gain[J]. Computer Engineering and Applications (《计算机工程与应用》), 2007, 43(19): 188-191. (cited by: 9)
  • 4 Hu Mingqing, Chen Yiqiang, Kwok J T Y. Building Sparse Multiple-kernel SVM Classifiers[J]. IEEE Transactions on Neural Networks, 2009, 20(5): 827-839.
  • 5 Chapelle O, Vapnik V. Choosing Multiple Parameters for Support Vector Machines[J]. Machine Learning, 2002, 46(1): 131-159.
  • 6 Ratsch G, Onoda T, Muller K R. Soft Margins for AdaBoost[J]. Machine Learning, 2001, 42(3): 287-320.
  • 7 Zhang Daoqiang, Chen Songcan, Zhou Zhihua. Learning the Kernel Parameters in Kernel Minimum Distance Classifier[J]. Pattern Recognition, 2006, 39(1): 133-135.
  • 8 Vapnik V N. The Nature of Statistical Learning Theory[M]. New York: Springer, 1996.
  • 9 Han Jiawei, Kamber Micheline. Data Mining: Concepts and Techniques[M]. Beijing: China Machine Press, 2001: 152-160.
  • 10 Jain A K, Duin R P W, Mao J. Statistical Pattern Recognition: A Review[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2000, 22(1): 4-37.

Citing Articles (2)

Secondary Citing Articles (14)
