Abstract
Attribute reduction is one of the important research issues in rough set theory. In the classical Pawlak rough set model, the lower and upper approximations change monotonically with the set inclusion of attributes. However, in the decision-theoretic rough set model, the lower and upper approximations may either increase or decrease as attributes are added. To address this issue, fitness functions for the decision-monotonicity criterion, the generality criterion and the cost criterion are proposed from the viewpoint of optimization, and a genetic algorithm is applied to compute reducts under the three criteria. The experimental results show that reducts based on the decision-monotonicity criterion generate more positive rules, reducts based on the generality criterion generate the most positive rules, and reducts based on the cost criterion obtain the lowest decision costs.
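The genetic-algorithm search described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's method: the toy decision table, the Pawlak-style positive-region measure, and the fitness form (positive-region coverage minus a small size penalty, a stand-in for the generality-criterion fitness) are all assumptions made for the example.

```python
import random

# Toy decision table: each row is an object; the last column is the decision.
DATA = [
    (1, 0, 1, 'yes'),
    (1, 1, 0, 'no'),
    (0, 0, 1, 'yes'),
    (0, 1, 0, 'no'),
    (1, 0, 0, 'yes'),
    (0, 1, 1, 'no'),
]
N_ATTRS = 3  # number of condition attributes


def positive_region_size(mask):
    """Count objects whose equivalence class (under the attributes
    selected by the 0/1 mask) is pure with respect to the decision."""
    classes = {}
    for row in DATA:
        key = tuple(v for v, keep in zip(row[:-1], mask) if keep)
        entry = classes.setdefault(key, [0, set()])
        entry[0] += 1            # objects in this equivalence class
        entry[1].add(row[-1])    # decisions seen in this class
    return sum(n for n, decisions in classes.values() if len(decisions) == 1)


def fitness(mask):
    # Assumed fitness: reward positive-region coverage,
    # lightly penalize the number of selected attributes.
    return positive_region_size(mask) - 0.1 * sum(mask)


def evolve(pop_size=20, generations=30, seed=0):
    """Plain GA over bitmask chromosomes: tournament selection,
    one-point crossover, bit-flip mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(N_ATTRS)] for _ in range(pop_size)]
    for _ in range(generations):
        nxt = []
        for _ in range(pop_size):
            p1 = max(rng.sample(pop, 2), key=fitness)  # binary tournament
            p2 = max(rng.sample(pop, 2), key=fitness)
            cut = rng.randrange(1, N_ATTRS)            # one-point crossover
            child = p1[:cut] + p2[cut:]
            if rng.random() < 0.1:                     # bit-flip mutation
                i = rng.randrange(N_ATTRS)
                child[i] ^= 1
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)
```

Swapping in a different fitness function (e.g. one penalizing expected decision cost for the cost criterion, or rewarding preserved positive rules for the decision-monotonicity criterion) changes which reducts the same search favors.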
Source
Journal of Nanjing Normal University (Natural Science Edition) (《南京师大学报(自然科学版)》)
Indexed in: CAS, CSCD, Peking University Core Journals (北大核心)
2015, No. 1, pp. 41-47 (7 pages)
Funding
National Natural Science Foundation of China (61100116, 61272419, 61305058)
Natural Science Foundation of Jiangsu Province (BK2011492, BK2012700, BK20130471)
Open Fund of the Key Laboratory of Intelligent Perception and Systems for High-Dimensional Information (Nanjing University of Science and Technology), Ministry of Education (30920130122005)
China Postdoctoral Science Foundation (2014M550293)
Keywords
attribute reduction
cost criterion
decision-theoretic rough set
decision-monotonicity
generality criterion