
Research on a New Software Defect Prediction Model (cited by: 1)

New prediction model for software defect-proneness
Abstract: The selection of software metrics is an important research topic in software defect-proneness prediction. This paper uses mutual information (MI) to measure the relevance between pairs of metrics and between each metric and the target class, and proposes a minimum-information-loss criterion. That criterion serves as the basis for removing redundant metrics in a Filter model, which is then combined with an existing Wrapper model to select the final set of metrics that help classification. The approach both reduces the high computational cost of the Wrapper model and compensates for the relatively weak predictive power of a pure Filter model. Experiments show clear improvements in recall and F-measure, demonstrating that the method is effective and practical for software defect prediction.
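The two-stage pipeline the abstract describes (an MI-based Filter stage that prunes redundant metrics under a minimum-information-loss rule, followed by a Wrapper stage that searches for the metric subset most helpful to classification) can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the redundancy threshold, the tie-breaking rule, and the joint-MI scoring function are all placeholders chosen for the sketch.

```python
import math
from collections import Counter
from itertools import combinations

def mutual_information(x, y):
    """Empirical mutual information (in nats) between two discrete sequences."""
    n = len(x)
    px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
    return sum((c / n) * math.log((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())

def filter_stage(X, y, redundancy_threshold=0.5):
    """Filter stage: for each highly redundant metric pair (large metric-metric
    MI), drop the metric with the lower metric-label MI, so the information
    lost about the class label is minimal (a minimum-information-loss rule)."""
    relevance = {i: mutual_information(xi, y) for i, xi in enumerate(X)}
    removed = set()
    for i, j in combinations(range(len(X)), 2):
        if i in removed or j in removed:
            continue
        if mutual_information(X[i], X[j]) >= redundancy_threshold:
            removed.add(i if relevance[i] < relevance[j] else j)
    return [i for i in range(len(X)) if i not in removed]

def wrapper_stage(X, y, candidates, score):
    """Greedy forward wrapper: repeatedly add the candidate metric that most
    improves `score` on the currently selected subset."""
    selected, best = [], -1.0
    improved = True
    while improved:
        improved = False
        for c in candidates:
            if c in selected:
                continue
            s = score([X[i] for i in selected + [c]], y)
            if s > best + 1e-12:        # require a strict improvement
                best, chosen, improved = s, c, True
        if improved:
            selected.append(chosen)
    return selected

# Toy data: metric 1 duplicates metric 0; metric 2 is noise; labels follow metric 0.
X = [[0, 0, 1, 1, 0, 1, 0, 1],   # metric 0: informative
     [0, 0, 1, 1, 0, 1, 0, 1],   # metric 1: exact duplicate of metric 0
     [0, 1, 0, 1, 0, 1, 0, 1]]   # metric 2: unrelated
y = [0, 0, 1, 1, 0, 1, 0, 1]

survivors = filter_stage(X, y)   # the duplicate metric is pruned
chosen = wrapper_stage(X, y, survivors,
                       score=lambda feats, t: mutual_information(list(zip(*feats)), t))
print(survivors, chosen)         # prints: [0, 2] [0]
```

In the paper's actual Wrapper model the score would be the cross-validated performance of a trained classifier on the candidate subset; the joint-MI scorer here is only a cheap stand-in that keeps the sketch self-contained.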
Authors: Han Lu (韩璐), Jing Xiaoyuan (荆晓远)
Source: Journal of Nanjing University of Posts and Telecommunications (Natural Science Edition), a Peking University Core journal, 2015, No. 1, pp. 95-101 (7 pages)
Funding: Jiangsu Province Postgraduate Training Innovation Project (CXLX11_0418); Jiangsu Province Natural Science Foundation Youth Project (BK20140888)
Keywords: software metrics selection; Filter; Wrapper; software defect prediction; minimum information loss

相关作者

内容加载中请稍等...

相关机构

内容加载中请稍等...

相关主题

内容加载中请稍等...

浏览历史

内容加载中请稍等...
;
使用帮助 返回顶部