Funding: Project supported by the National Natural Science Foundation of China (Nos. 60973059 and 81171407) and the Program for New Century Excellent Talents in University, China (No. NCET-10-0044)
Abstract: We propose a novel discriminative learning approach for Bayesian pattern classification, called 'constrained maximum margin (CMM)'. We define the margin between two classes as the difference between the minimum decision value for positive samples and the maximum decision value for negative samples. The learning problem is to maximize the margin under the constraint that each training pattern is classified correctly. This nonlinear programming problem is solved using the sequential unconstrained minimization technique. We applied the proposed CMM approach to learn Bayesian classifiers based on Gaussian mixture models, and conducted experiments on 10 UCI datasets. The performance of our approach was compared with those of the expectation-maximization algorithm, the support vector machine, and other state-of-the-art approaches. The experimental results demonstrated the effectiveness of our approach.
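The margin definition and the penalty-based sequential unconstrained minimization can be sketched in a few lines. The sketch below is illustrative only: it uses a normalized linear decision function on 1-D toy data and a crude finite-difference descent, whereas the paper optimizes the parameters of Gaussian-mixture-based Bayesian classifiers; all names (`margin`, `violation`, `sumt_stage`) and the toy data are assumptions, not from the paper.

```python
import math

def margin(f, pos, neg):
    # CMM margin: minimum decision value over positive samples minus the
    # maximum decision value over negative samples.
    return min(f(x) for x in pos) - max(f(x) for x in neg)

def violation(f, pos, neg):
    # Squared violation of the correctness constraints: f(x) > 0 for
    # positives, f(x) < 0 for negatives (zero when all are classified correctly).
    v = sum(max(0.0, -f(x)) ** 2 for x in pos)
    v += sum(max(0.0, f(x)) ** 2 for x in neg)
    return v

def sumt_stage(theta, pos, neg, r, lr=0.01, iters=200):
    # One unconstrained stage of SUMT: minimize -margin + r * violation by
    # finite-difference gradient descent on the parameter vector theta.
    def obj(t):
        f = lambda x: (t[0] * x + t[1]) / math.hypot(t[0], t[1])
        return -margin(f, pos, neg) + r * violation(f, pos, neg)
    eps = 1e-4
    for _ in range(iters):
        grad = []
        for i in range(len(theta)):
            t2 = list(theta)
            t2[i] += eps
            grad.append((obj(t2) - obj(theta)) / eps)
        theta = [t - lr * g for t, g in zip(theta, grad)]
    return theta

# Toy 1-D data; 0.5 is a "hard" negative that forces the boundary to move.
pos, neg = [2.0, 3.0, 4.0], [-1.0, -2.0, 0.5]

# SUMT proper: solve a sequence of unconstrained problems with a growing
# penalty weight r, warm-starting each stage from the previous solution.
theta = [1.0, 0.0]
for r in (1.0, 10.0, 100.0):
    theta = sumt_stage(theta, pos, neg, r)
```

As `r` grows, the penalty drives the constraint violations toward zero, so the iterates approach a decision function that classifies every training pattern correctly while the (normalized) margin is maximized.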