
Cost-Sensitive Feature Selection Based on Non-Negative Matrix Factorization
Abstract: Data dimension reduction lowers the complexity and cost of analyzing multi-dimensional data, and feature selection is a common dimension-reduction method. Traditional feature selection algorithms focus mainly on classification performance and neglect the test cost incurred during the selection process. To address this, a new cost-sensitive feature selection method based on non-negative matrix factorization, the NmfCt algorithm, is proposed. The objective function constructed by NmfCt simultaneously minimizes the reconstruction error and the test cost, so that the dimension-reduction preprocessing not only preserves good classification accuracy but also keeps the test cost low.
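The abstract states only that the NmfCt objective jointly keeps the reconstruction error and the test cost small; its exact form does not appear in this record. As a hedged sketch only, a cost-sensitive NMF objective of this kind could look like the following, where the data matrix X, the factors W and H, the per-feature test cost c_j, and the trade-off parameter \lambda are illustrative assumptions rather than the paper's own notation:

    % Illustrative sketch, not the paper's formulation: NMF reconstruction error
    % plus a test-cost-weighted group penalty on the rows of W (one row per feature).
    \min_{W \ge 0,\; H \ge 0} \; \lVert X - W H \rVert_{F}^{2} \;+\; \lambda \sum_{j=1}^{d} c_j \, \lVert W_{j\cdot} \rVert_{2}

Under such a formulation, features whose rows of W are driven toward zero would be dropped, so the selected subset trades reconstruction quality against the accumulated test cost.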
Authors: 周步芳 (ZHOU Bu-fang) and 祝峰 (ZHU William), Lab of Granular Computing, Minnan Normal University, Zhangzhou, Fujian 363000, China
Source: Journal of Yantai University (Natural Science and Engineering Edition), CAS, 2017, No. 4, pp. 341-347 (7 pages)
Funding: National Natural Science Foundation of China (61379049, 61379089)
Keywords: machine learning; cost-sensitive; feature selection; non-negative matrix factorization
