
Decision Bayes Criteria for Optimal Classifier Based on Probabilistic Measures (Cited by: 1)

Abstract: This paper addresses the high-dimensional sample problem in discriminant analysis under nonparametric and supervised assumptions. Because there is an equivalence between the probabilistic dependence measure and the Bayes classification error probability, we propose an iterative algorithm that optimizes the dimension reduction for classification with a probabilistic approach in order to approach the Bayes classifier. The probabilities of the different errors encountered along the different phases of the system are estimated with a kernel estimate that is adjusted by means of its smoothing parameter. Experimental results suggest that the proposed approach performs well.
Institution: the Cristal Laboratory
Source: Journal of Electronic Science and Technology, CAS, 2014, Issue 2, pp. 216-219 (4 pages)
Keywords: Bayesian classifier, dimension reduction, kernel method, optimization, probabilistic dependence measure, smoothing parameter
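
The abstract describes the method only at a high level. As a rough illustration (not the authors' implementation), the sketch below assumes a plug-in Bayes rule built from per-class Gaussian kernel density estimates on a lower-dimensional linear projection, with candidate projections scored by their estimated misclassification rate. The function names, the fixed bandwidth `h`, and the random search over projections are hypothetical stand-ins for the paper's iterative optimization of the probabilistic dependence measure and its data-driven smoothing-parameter adjustment.

```python
# Illustrative sketch only, not the paper's algorithm: a plug-in Bayes
# classifier from per-class Gaussian kernel density estimates, used to score
# linear projections to a lower dimension. All names are hypothetical.
import numpy as np

def gaussian_kde(train, query, h):
    """Gaussian kernel density estimate of the training sample at the query points."""
    d = train.shape[1]
    # Pairwise squared distances between query and training points.
    diff = query[:, None, :] - train[None, :, :]
    sq = np.sum(diff ** 2, axis=-1)
    norm = (2.0 * np.pi * h ** 2) ** (d / 2.0)
    return np.exp(-sq / (2.0 * h ** 2)).sum(axis=1) / (len(train) * norm)

def plug_in_bayes_classify(X_train, y_train, X_query, h):
    """Assign each query point to the class maximizing prior * estimated density."""
    classes = np.unique(y_train)
    scores = []
    for c in classes:
        Xc = X_train[y_train == c]
        prior = len(Xc) / len(X_train)
        scores.append(prior * gaussian_kde(Xc, X_query, h))
    return classes[np.argmax(np.vstack(scores), axis=0)]

def error_estimate(X_train, y_train, X_val, y_val, W, h):
    """Misclassification rate of the plug-in rule after projecting onto W (d x k)."""
    pred = plug_in_bayes_classify(X_train @ W, y_train, X_val @ W, h)
    return np.mean(pred != y_val)

if __name__ == "__main__":
    # Toy usage on synthetic data: compare a few random orthonormal projections
    # to 2 dimensions by their estimated classification error and keep the best.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 8))
    y = (X[:, 0] + 0.5 * rng.normal(size=300) > 0).astype(int)
    X_tr, y_tr, X_va, y_va = X[:200], y[:200], X[200:], y[200:]
    best_err, best_W = np.inf, None
    for _ in range(10):
        W, _ = np.linalg.qr(rng.normal(size=(8, 2)))  # random d x k orthonormal basis
        err = error_estimate(X_tr, y_tr, X_va, y_va, W, h=0.5)
        if err < best_err:
            best_err, best_W = err, W
    print("best estimated error:", best_err)
```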

