Data classification based on density function estimated by radial basis function

Abstract: A new method was proposed for the classification of samples with a known overall distribution. A nonlinear mapping was used to project the training samples into a higher-dimensional feature space, boundary vectors were pre-extracted from the training samples by vector projection, and a multi-dimensional binary tree search was used to find the k nearest same-class neighbors of each boundary vector. The value of the class-conditional probability density function at each sample was estimated by the law of large numbers, and a new training set was formed by pairing each boundary vector with its estimated density. For each class, a radial basis function (RBF) network with that class's boundary vectors as centers was trained to approximate the class-conditional probability density, and classification followed the minimum-error-rate Bayesian decision rule. Simulation results on machine-learning data sets show that the proposed method achieves recognition rates comparable to those of support vector machines (SVM) and can quickly and effectively handle multi-class classification.
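The abstract outlines a concrete pipeline: pre-select boundary vectors by vector projection, find each boundary vector's k nearest same-class neighbors with a multi-dimensional binary tree (k-d tree), estimate the class-conditional density as p̂(x) ≈ (k/N) / V_k(x), where V_k(x) is the volume of the hypersphere reaching the k-th nearest same-class neighbor, fit one RBF network per class with the boundary vectors as centers, and classify by the minimum-error-rate Bayes rule argmax_i P(ω_i) p̂(x | ω_i). The Python sketch below strings these steps together on synthetic data. It is not the authors' implementation: the names (`boundary_vectors`, `knn_density`, `RBFDensityNet`), the simplified projection rule, the Gaussian width `sigma`, the neighbor count `k`, the retained fraction, and the toy data are all assumptions made for illustration.

```python
# Minimal sketch of the pipeline described in the abstract, not the authors'
# implementation: the data, parameter values and the simplified projection
# rule are assumptions made purely for illustration.
import numpy as np
from math import gamma, pi
from scipy.spatial import cKDTree


def unit_ball_volume(d):
    """Volume of the d-dimensional unit hypersphere."""
    return pi ** (d / 2) / gamma(d / 2 + 1)


def knn_density(X_query, X_ref, k):
    """k-NN density estimate p(x) ~ (k/N) / V_k(x) at each query point,
    using a k-d tree (the 'multi-dimensional binary tree' of the abstract)
    to find the k nearest same-class neighbors."""
    n, d = X_ref.shape
    tree = cKDTree(X_ref)
    dist, _ = tree.query(X_query, k=k + 1)   # +1: each query point is in X_ref
    r = dist[:, -1]                          # radius to the k-th true neighbor
    return (k / n) / (unit_ball_volume(d) * r ** d + 1e-12)


def boundary_vectors(X_own, X_other, frac=0.3):
    """Simplified vector-projection pre-selection: keep the fraction of the
    class's samples whose projection onto the line joining the two class
    means lies closest to the opposite class (a stand-in for the paper's
    boundary-vector rule, not its exact criterion)."""
    m_own, m_other = X_own.mean(axis=0), X_other.mean(axis=0)
    w = (m_other - m_own) / np.linalg.norm(m_other - m_own)
    proj = (X_own - m_own) @ w
    n_keep = max(1, int(frac * len(X_own)))
    return X_own[np.argsort(proj)[-n_keep:]]


class RBFDensityNet:
    """RBF network with fixed Gaussian centers; the output weights are fitted
    by least squares so that the network approximates a density function."""

    def __init__(self, centers, sigma):
        self.centers, self.sigma = centers, sigma

    def _phi(self, X):
        d2 = ((X[:, None, :] - self.centers[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * self.sigma ** 2))

    def fit(self, X, y):
        self.w, *_ = np.linalg.lstsq(self._phi(X), y, rcond=None)
        return self

    def predict(self, X):
        return self._phi(X) @ self.w


# Toy two-class problem (synthetic, purely illustrative).
rng = np.random.default_rng(0)
X0 = rng.normal([0.0, 0.0], 1.0, size=(200, 2))
X1 = rng.normal([3.0, 3.0], 1.0, size=(200, 2))
priors = {0: 0.5, 1: 0.5}

nets = {}
for label, X_own, X_other in [(0, X0, X1), (1, X1, X0)]:
    bv = boundary_vectors(X_own, X_other)    # pre-selected boundary vectors
    dens = knn_density(bv, X_own, k=10)      # density estimates at those vectors
    nets[label] = RBFDensityNet(bv, sigma=1.0).fit(bv, dens)


def classify(x):
    """Minimum-error-rate Bayes rule: argmax_i P(w_i) * p_hat(x | w_i)."""
    x = np.atleast_2d(np.asarray(x, dtype=float))
    scores = {c: priors[c] * net.predict(x)[0] for c, net in nets.items()}
    return max(scores, key=scores.get)


print(classify([0.2, -0.1]), classify([2.8, 3.1]))   # predicted labels for two test points
```

Using only the pre-selected boundary vectors as RBF centers keeps each per-class network small, which is presumably what lets the method handle multi-class problems quickly; away from the class boundary the density approximation is necessarily coarse, so this sketch should be read as a structural illustration rather than a reproduction of the reported accuracy.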
Source: Journal of Zhejiang University: Engineering Science, 2007, Issue 7, pp. 1088-1092 (5 pages). Indexed by EI, CAS, CSCD; Peking University core journal list.
Funding: Supported by the Natural Science Foundation of Zhejiang Province (Y106085).
Keywords: pattern classification; pre-extracting boundary vectors; radial basis function (RBF) network; Bayesian decision rule