
A PSO-Based Attribute Selection Method (Cited by: 2)
Abstract: To reduce the influence of instances on attribute selection, this paper proposes a PSO-based attribute selection method. The method uses the PSO algorithm to find the optimal entropy value of the instance set and obtain the corresponding attribute thresholds; the thresholds determine the priority of each attribute, and attributes are then selected in order of priority. In the experiment, the performance of the proposed algorithm is verified by determining the priority of concept attributes in an ontology. The experimental results show that the method reduces the dependency on instances and improves accuracy, while the computational cost is also reduced.
Source: Computer Engineering & Science (CSCD; Peking University Core Journal), 2011, No. 6: 150-153 (4 pages).
Funding: Research Development Program of the Shandong Provincial Department of Education (J09LG29); Key Research Project of Liaocheng University (X0810015).
Keywords: particle swarm optimization (PSO); attribute priority; information gain; ontology
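
The abstract describes the method only at a high level: PSO searches for an optimal entropy value over the instances, the resulting attribute thresholds fix a priority order, and attributes are selected by priority. The sketch below is a minimal Python illustration of one plausible reading of that pipeline, not the paper's implementation: attributes are ranked by information gain, and a global-best particle swarm searches for a gain threshold whose induced attribute subset minimizes an entropy-based fitness. The fitness definition, all function names, and the PSO parameters (swarm size, inertia weight w, acceleration constants c1 and c2) are assumptions made for illustration.

```python
import numpy as np

def entropy(labels):
    """Shannon entropy of a discrete label vector."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def information_gain(feature, labels):
    """Information gain of one discrete attribute with respect to the class."""
    total = entropy(labels)
    conditional = 0.0
    for value in np.unique(feature):
        mask = feature == value
        conditional += mask.mean() * entropy(labels[mask])
    return total - conditional

def pso_threshold(gains, X, n_particles=20, n_iter=50, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Search for a gain threshold with a basic global-best PSO.
    Fitness (an assumption, not the paper's definition): entropy of the
    instances when they are described only by the selected attributes."""
    rng = np.random.default_rng(seed)
    lo, hi = float(gains.min()), float(gains.max())

    def fitness(theta):
        selected = gains >= theta
        if not selected.any():
            return np.inf
        rows = np.array([hash(tuple(r)) for r in X[:, selected]])
        return entropy(rows)

    pos = rng.uniform(lo, hi, n_particles)        # particle positions = candidate thresholds
    vel = np.zeros(n_particles)
    pbest = pos.copy()
    pbest_f = np.array([fitness(p) for p in pos])
    gbest = pbest[pbest_f.argmin()]
    for _ in range(n_iter):
        r1, r2 = rng.random(n_particles), rng.random(n_particles)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        f = np.array([fitness(p) for p in pos])
        better = f < pbest_f
        pbest[better], pbest_f[better] = pos[better], f[better]
        gbest = pbest[pbest_f.argmin()]
    return gbest

def select_attributes(X, y):
    """Rank attributes by information gain (higher gain = higher priority)
    and keep those whose gain reaches the PSO-derived threshold."""
    gains = np.array([information_gain(X[:, j], y) for j in range(X.shape[1])])
    theta = pso_threshold(gains, X)
    order = np.argsort(-gains)
    return [int(j) for j in order if gains[j] >= theta]
```

With a discrete feature matrix X and class labels y, select_attributes(X, y) would return attribute indices in descending priority; continuous attributes would first need to be discretized, which this sketch does not handle.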
Related Literature

References (6)

Secondary References (23)

  • 1. 赵国涛, 何钦铭. An Ontology-Based Heterogeneous Text Classification System[J]. Computer Engineering, 2004, 30(21): 123-125. (Cited by 4)
  • 2. 陈彬, 洪家荣, 王亚东. The Optimal Feature Subset Selection Problem[J]. Chinese Journal of Computers, 1997, 20(2): 133-138. (Cited by 96)
  • 3. 曹泽文, 钱杰, 张维明, 邓苏. A Comprehensive Method for Concept Similarity Calculation[J]. Computer Science, 2007, 34(3): 174-175. (Cited by 35)
  • 4. Witten I H, Frank E. Data Mining: Practical Machine Learning Tools and Techniques with Java Implementations[M]. Beijing: China Machine Press, 2003: 141-147.
  • 5. Vapnik V. Statistical Learning Theory[M]. New York: John Wiley and Sons, 1998: 254-263.
  • 6. Guyon I, Weston J, Barnhill S, et al. Gene Selection for Cancer Classification Using Support Vector Machines[J]. Machine Learning, 2002, 46(1-3): 389-422.
  • 7. Weston J, Mukherjee S, Chapelle O, et al. Feature Selection for SVMs[A]. Advances in Neural Information Processing Systems, Vol. 13[M]. Cambridge, Massachusetts: MIT Press, 2000: 668-674.
  • 8. Grandvalet Y, Canu S. Adaptive Scaling for Feature Selection in SVMs[A]. Advances in Neural Information Processing Systems, Vol. 15[M]. Cambridge, Massachusetts: MIT Press, 2003: 553-560.
  • 9. Friedman N, Geiger D, Goldszmidt M. Bayesian Network Classifiers[J]. Machine Learning, 1997, 29(2-3): 131-163.
  • 10. Tan P-N, Steinbach M, Kumar V. Introduction to Data Mining[M]. New York: Posts & Telecom Press, 2006: 279-301.

Co-cited Literature (156)

Co-citing Literature (19)

  • 1. Witten I H, Frank E. Data Mining: Practical Machine Learning Tools and Techniques[M]. Beijing: China Machine Press, 2006.
  • 2. Han J, Kamber M, Pei J. Data Mining: Concepts and Techniques[M]. Translated by 范明, 孟小峰. Beijing: China Machine Press, 2012.
  • 3. Kira K, Rendell L A. A Practical Approach to Feature Selection[C]//Machine Learning: Proceedings of the Ninth International Conference. San Francisco: Morgan Kaufmann, 1992: 250-256.
  • 4. Modrzejewski M. Feature Selection Using Rough Sets Theory[C]//European Conference on Machine Learning. Berlin: Springer-Verlag, 1993: 213-226.
  • 5. Liu H, Setiono R. A Probabilistic Approach to Feature Selection: A Filter Solution[C]//Proceedings of the International Conference on Machine Learning. San Francisco: Morgan Kaufmann, 1996: 419-424.
  • 6. Hall M A. Correlation-Based Feature Selection for Machine Learning[D]. Hamilton: The University of Waikato, 1999.
  • 7. Hall M A. Correlation-Based Feature Selection for Discrete and Numeric Class Machine Learning[C]//Proceedings of the 17th International Conference on Machine Learning. San Francisco: Morgan Kaufmann, 2000: 359-366.
  • 8. UCI Machine Learning Repository[EB/OL]. [2013-04-11]. http://archive.ics.uci.edu/ml/datasts.html/.
  • 9. Yang Guang, Lin Zhong-Yi, Chang Yu-Xin, et al. Comparative Analysis on Feature Selection Based Bayesian Text Classification[C]//International Conference on Computer Science and Network Technology (ICCSNT), 2012.
  • 10. Abraham R, Simha J B, Iyengar S S. Medical Data Mining with a New Algorithm for Feature Selection and Naïve Bayesian Classifier[C]//International Conference on Information Technology, 2007.

Citing Literature (2)

Secondary Citing Literature (8)
