Funding: Supported by the National Natural Science Foundation of China under Grant No. 60873077/F020107.
Abstract: In many machine learning applications, data are not free: there is a test cost for each data item. For economic reasons, some existing works try to minimize the test cost while preserving a particular property of a given decision system. In this paper, we point out that in some applications the test cost one can afford is limited, so one has to sacrifice the respective properties to keep the test cost under a budget. To formalize this issue, we define the test cost constraint attribute reduction problem, in which the optimization objective is to minimize the conditional information entropy. This problem is an essential generalization of both the test-cost-sensitive attribute reduction problem and the 0-1 knapsack problem, and is therefore more challenging. We propose a heuristic algorithm based on information gain and test costs to deal with the new problem. The algorithm is tested on four UCI (University of California, Irvine) datasets with various test cost settings. Experimental results indicate an appropriate setting of the only user-specified parameter λ.
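The abstract does not give the heuristic itself, so the following is only a minimal sketch of the kind of budget-constrained greedy reduction it describes: attributes are added one at a time, scored by information gain weighted by the test cost raised to the user-specified exponent λ, and an attribute is skipped once adding it would exceed the budget. The scoring function, the data layout, and all names (budgeted_reduct, conditional_entropy, lam) are illustrative assumptions, not the authors' algorithm.

import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def conditional_entropy(rows, labels, attrs):
    """H(D | B): label entropy within each equivalence class induced by attrs."""
    groups = {}
    for row, y in zip(rows, labels):
        groups.setdefault(tuple(row[a] for a in attrs), []).append(y)
    n = len(labels)
    return sum(len(g) / n * entropy(g) for g in groups.values())

def budgeted_reduct(rows, labels, costs, budget, lam=-1.0):
    """Greedy attribute selection under a test-cost budget.

    Each candidate attribute is scored by its information gain weighted by
    its (assumed positive) test cost raised to lam; lam <= 0 penalizes
    expensive tests.
    """
    selected, spent = [], 0.0
    remaining = set(range(len(rows[0])))
    while remaining:
        best, best_score = None, 0.0
        for a in remaining:
            if spent + costs[a] > budget:
                continue  # this test is not affordable under the budget
            gain = (conditional_entropy(rows, labels, selected)
                    - conditional_entropy(rows, labels, selected + [a]))
            score = gain * (costs[a] ** lam)
            if score > best_score:
                best, best_score = a, score
        if best is None:  # nothing affordable or no further entropy reduction
            break
        selected.append(best)
        spent += costs[best]
        remaining.remove(best)
    return selected, spent

With λ = 0 the score reduces to plain information gain, while more negative values steer the search toward cheaper tests, which is consistent with λ being the only user-specified parameter.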
Abstract: We describe a new active-set, cutting-plane Constraint Optimal Selection Technique (COST) for solving general linear programming problems. We present strategies to bound the initial problem and to add multiple constraints simultaneously. We give an interpretation of the new COST's selection rule, which considers both the depth of constraints and their angles from the objective function. We provide computational comparisons of the COST with existing linear programming algorithms, including other COSTs in the literature, on some large-scale problems. Finally, we discuss conclusions and future research.
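The abstract describes the selection rule only qualitatively (constraint depth combined with the angle from the objective), so the sketch below shows one plausible way such a rule could be scored at a trial point of a cutting-plane loop for maximizing c^T x subject to Ax <= b. The weighting of the two terms, the tolerance, and every name here are illustrative assumptions rather than the paper's actual COST rule.

import numpy as np

def select_constraints(A, b, c, x, k=10, w_depth=1.0, w_angle=1.0):
    """Rank violated constraints of {x : A x <= b} at the current point x.

    Each violated row is scored by a weighted sum of
      - depth: normalized violation (a_i . x - b_i) / ||a_i||
      - angle: cosine between the constraint normal a_i and the objective
        gradient c (constraints "facing" the objective are favored).
    Returns the indices of the k highest-scoring violated constraints,
    to be added to the active set.
    """
    slack = A @ x - b                       # > 0 means the row is violated
    norms = np.linalg.norm(A, axis=1)
    violated = np.where(slack > 1e-9)[0]
    if violated.size == 0:
        return violated                      # current point is feasible
    depth = slack[violated] / norms[violated]
    angle = (A[violated] @ c) / (norms[violated] * np.linalg.norm(c))
    score = w_depth * depth + w_angle * angle
    order = np.argsort(-score)
    return violated[order[:k]]

In an outer active-set loop, one would solve the relaxation over the currently selected rows, call this routine at the relaxed optimum, append the returned rows to the active set, and repeat until no constraints are violated.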