Journal Article

The Concept of a New Nonlinear Correlation Information Entropy and Its Properties and Applications (cited by 12)
Abstract: Building on correlation information entropy (CIE) and H_(Pal) entropy, this paper proposes a new nonlinear correlation information entropy (NNCIE), which replaces event probabilities with eigenvalues and takes an exponential functional form with base e. Under the condition of the largest partition of finite sets, several properties of this entropy are derived and proved theoretically, and these properties satisfy the basic properties of Shannon entropy. NNCIE is a criterion for measuring the degree of correlation in multivariate, nonlinear systems. As an uncertainty measure of the correlation among multiple variables, the stronger the correlation between variables, the smaller the corresponding NNCIE value. NNCIE facilitates information fusion and provides a new method and a new perspective for research on correlation analysis theory. A comparison of application examples of NNCIE and CIE shows that NNCIE is an effective and useful measure of the uncertainty of nonlinear systems.
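The abstract describes the construction only in outline: eigenvalues of a correlation structure stand in for event probabilities, plugged into a base-e exponential entropy form. The paper's exact formula is not reproduced in this record, so the sketch below is an illustration only, combining the eigenvalue normalization typically used in CIE with the exponential form of H_(Pal) entropy (H_Pal = Σ p_i·e^(1−p_i)); the function names and the normalization by matrix dimension are assumptions, not the authors' definition.

```python
import numpy as np

def pal_entropy(p):
    # Pal's exponential entropy: H = sum_i p_i * e^(1 - p_i), base e.
    p = np.asarray(p, dtype=float)
    return float(np.sum(p * np.exp(1.0 - p)))

def eigen_exponential_entropy(X):
    # Hypothetical NNCIE-style sketch (assumed form, not the paper's exact
    # definition): eigenvalues of the sample correlation matrix replace
    # event probabilities in the exponential entropy above.
    R = np.corrcoef(X, rowvar=False)   # correlation matrix of the columns
    lam = np.linalg.eigvalsh(R)        # real eigenvalues of symmetric R
    q = lam / R.shape[0]               # normalize: eigenvalues sum to n
    q = q[q > 1e-12]                   # drop numerically zero eigenvalues
    return pal_entropy(q)
```

Consistent with the abstract's claim, strongly correlated variables concentrate the eigenvalue mass and yield a smaller entropy value than independent variables, whose correlation matrix is close to the identity.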
Affiliation: Xi'an University of Science and Technology
Source: Information and Control (《信息与控制》, CSCD, Peking University Core), 2011, No. 3, pp. 401-407, 412 (8 pages)
Funding: Shaanxi Province Science and Technology Research Project (2008K01-58); Shaanxi Provincial Education Department Natural Science Special Plan Project (07JK314)
Keywords: correlation information entropy; H_(Pal) entropy; nonlinear new correlation information entropy

References (2)

Secondary References (13)

1. John G H, Kohavi R, Pfleger K. Irrelevant features and the subset selection problem. In: Proc. of the Eleventh Intl. Conf. on Machine Learning, 1994: 121-129.
2. Kohavi R, John G H. Wrappers for feature subset selection. Artificial Intelligence, 1997, 97(1-2): 273-324.
3. Liu H, Yu L. Toward integrating feature selection algorithms for classification and clustering. IEEE Transactions on Knowledge and Data Engineering, 2005, 17(5): 491-502.
4. Yang J, Honavar V. Feature subset selection using a genetic algorithm. IEEE Intelligent Systems, 1998, 13(2): 44-49.
5. Yu L, Liu H. Efficient feature selection via analysis of relevance and redundancy. Journal of Machine Learning Research, 2004(5): 1205-1224.
6. Mitra P, Murthy C A, Pal S K. Unsupervised feature selection using feature similarity. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2002, 24(3): 301-312.
7. Shannon C E. A mathematical theory of communication. Bell Syst. Tech. J., 1948, 27: 379-423.
8. Renyi A. On the measures of entropy and information. In: Proc. 4th Berkeley Symp. Math. Statist. and Prob. Berkeley, CA: University of California Press, 1961.
9. Breiman L, Friedman J, Olshen R, et al. Classification and regression trees [EB/OL]. (1984-11-12) [2006-04-21]. http://www.statsoft.com/textbook/stcart.html.
10. Simovici D A, Jaroszewicz S. A metric approach to building decision trees based on the Goodman-Kruskal association index [EB/OL]. (2004-12-23) [2006-04-12]. http://www.cs.umb.edu/~dsim/papersps/gk.pdf.

Co-citing literature: 20

Co-cited literature: 135

Citing literature: 12

Secondary citing literature: 74
