Funding: Supported by the National Natural Science Foundation of China (Nos. 61972238, 62072294).
Abstract: Decision implication is a form of decision knowledge representation that avoids generating attribute implications holding only among condition attributes or only among decision attributes. Compared with other forms of decision knowledge representation, decision implication has a stronger knowledge representation capability. Attribute granularization facilitates knowledge extraction at different attribute granularity levels and is therefore of practical significance. The decision implication canonical basis (DICB) is the most compact set of decision implications and can efficiently represent all knowledge in a decision context. To mine all decision information in a decision context under attribute granulation, this paper proposes a method for updating the DICB. To this end, the paper reduces the update of the DICB to the updates of decision premises after deleting an attribute and after adding the granulation attributes of some attributes. On this basis, the paper analyzes the changes of decision premises, examines their properties, designs an algorithm for incrementally generating the DICB, and verifies its effectiveness through experiments. In practice, with this updating algorithm, users can obtain all decision knowledge in a decision context after attribute granularization.
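To make the notions above concrete, the following is a minimal sketch (not the paper's algorithm) of a decision context and a check of whether a decision implication holds in it; the objects g1-g3 and attributes c1-c3, d1-d2 are illustrative assumptions, not data from the paper.

```python
# A decision context maps each object to its condition attributes and its
# decision attributes. A decision implication A -> B holds if every object
# possessing all condition attributes in A also possesses all decision
# attributes in B.

decision_context = {
    # object: (condition attributes, decision attributes)
    "g1": ({"c1", "c2"}, {"d1"}),
    "g2": ({"c1"},       {"d1", "d2"}),
    "g3": ({"c2", "c3"}, {"d2"}),
}

def holds(premise: set, conclusion: set) -> bool:
    """True if the decision implication premise -> conclusion holds."""
    return all(
        conclusion <= dec
        for cond, dec in decision_context.values()
        if premise <= cond
    )

print(holds({"c1"}, {"d1"}))   # True: g1 and g2 both have d1
print(holds({"c2"}, {"d2"}))   # False: g1 has c2 but lacks d2
```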
Abstract: This paper introduces decision contexts and decision implications into Formal Concept Analysis (FCA). Since extracting decision implications directly from a decision context is time-consuming, we present an inference rule to reduce the number of decision implications. Moreover, based on the inference rule, we introduce the notion of α-maximal decision implication and prove that the set of all α-maximal decision implications is α-complete and α-non-redundant. Finally, we propose a method to generate this set.
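As an illustration of how an inference rule can cut down a set of decision implications, the sketch below assumes a standard entailment rule for single decision implications (A -> B entails E -> F whenever A ⊆ E and F ⊆ B, since the weaker implication adds no information); this is an assumed rule for illustration and does not reproduce the paper's α-maximal construction.

```python
# Prune decision implications that are entailed by a stronger implication
# in the same list, under the assumed rule: (A, B) entails (E, F) if
# A ⊆ E and F ⊆ B.

from typing import FrozenSet, List, Tuple

Implication = Tuple[FrozenSet[str], FrozenSet[str]]

def entails(stronger: Implication, weaker: Implication) -> bool:
    """stronger = (A, B) entails weaker = (E, F) if A ⊆ E and F ⊆ B."""
    (a, b), (e, f) = stronger, weaker
    return a <= e and f <= b

def prune(implications: List[Implication]) -> List[Implication]:
    """Keep only implications not entailed by some other implication."""
    kept = []
    for i, imp in enumerate(implications):
        if not any(entails(other, imp)
                   for j, other in enumerate(implications) if j != i):
            kept.append(imp)
    return kept

rules = [
    (frozenset({"c1"}), frozenset({"d1", "d2"})),
    (frozenset({"c1", "c2"}), frozenset({"d1"})),  # entailed by the first
]
print(prune(rules))  # only the first, stronger implication remains
```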