Measurements of Discretization Schemes

Cited by: 1
Abstract: Several measurements of discretization schemes for continuous (numerical) decision tables are analyzed, including cut-point number, conditional entropy, granular entropy, class-attribute mutual information, and class-attribute interdependence redundancy. For a consistent decision table, the conditional entropy and the class-attribute mutual information are both constants, so they offer no further guidance for choosing a discretization scheme. The relationship between granular entropy and interdependence redundancy is discussed, and it is proved that granular entropy increases as cut points are added to a scheme. A hybrid discretization algorithm is proposed to generate discretization schemes for testing. The experiments show that the correlation between cut-point number and classification accuracy is roughly as strong as that between granular entropy and classification accuracy, and both depend on the specific dataset.
Source: Pattern Recognition and Artificial Intelligence (《模式识别与人工智能》, EI, CSCD, Peking University Core), 2008, No. 4, pp. 494-499 (6 pages).
Funding: Supported by the National Natural Science Foundation of China (No. 60772028) and the Natural Science Foundation of Shandong Province (No. Y2006G22).
Keywords: Granular Entropy, Discretization Scheme, Cut Point, Classification Accuracy, Rough Set
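The monotonicity result stated in the abstract (granular entropy increases as cut points are added) can be illustrated with a small sketch. It uses the common partition-entropy definition E = -Σ (|Xi|/|U|) log2(|Xi|/|U|) over the blocks Xi that a discretization scheme induces on the universe U; this formula, the sample values, and the cut points below are assumptions for illustration and may differ from the paper's exact definitions.

```python
import math
from bisect import bisect_right

def discretize(values, cuts):
    """Map each continuous value to the index of its interval under the cut points."""
    cuts = sorted(cuts)
    return [bisect_right(cuts, v) for v in values]

def partition_entropy(labels):
    """Entropy of the induced partition: -sum over blocks of (|Xi|/|U|) * log2(|Xi|/|U|)."""
    n = len(labels)
    counts = {}
    for b in labels:
        counts[b] = counts.get(b, 0) + 1
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Hypothetical attribute values and two nested discretization schemes.
values = [0.1, 0.4, 0.35, 0.8, 0.7, 0.05, 0.9, 0.55]
e1 = partition_entropy(discretize(values, [0.5]))         # one cut point
e2 = partition_entropy(discretize(values, [0.5, 0.75]))   # refined scheme

# Adding a cut point refines the partition, so entropy cannot decrease.
assert e2 >= e1
```

Refining a partition can only split existing blocks, never merge them, which is why the entropy is non-decreasing in the number of cut points.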
