Zero-order TSK fuzzy classifier based on LLM for large-scale data sets (Cited by: 2)
Abstract: To overcome the poor generalization performance, weak interpretability and low learning efficiency of traditional classifiers, a zero-order TSK fuzzy classifier (TSK-FC) is proposed for the classification of medium-scale data sets. To make TSK-FC suitable for large-scale data sets, an incremental version, TSK-IFC, is developed, in which the incremental fuzzy (c + p)-means clustering algorithm IFCM(c + p) is used to train the antecedent parameters of the fuzzy rules, while fast consequent-parameter learning is achieved through an appropriate matrix-computation trick for the least learning machine (LLM). The proposed classifiers, TSK-FC and TSK-IFC, are experimentally compared with the conventional fuzzy classifier FCPM-IRLS and the RBF neural network; the results show that the proposed classifiers maintain good performance on data sets of different scales, with TSK-IFC standing out on large-scale data sets in particular.
Authors: 李滔, 王士同
Source: Control and Decision (《控制与决策》), 2017, No. 1, pp. 21-30 (10 pages); indexed in EI, CSCD, and the Peking University Core Journals list
Funding: National Natural Science Foundation of China (61170122, 61272210); Natural Science Foundation of Jiangsu Province (BK20130155)
Keywords: TSK-FC; TSK-IFC; least learning machine; TSK fuzzy classifier; large-scale data sets
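
The abstract outlines two technical ingredients: rule antecedents obtained from fuzzy clustering, and zero-order (constant) rule consequents solved quickly in closed form. The Python sketch below illustrates that general structure only; it uses plain batch fuzzy c-means and a ridge least-squares solve as stand-ins, since the paper's IFCM(c + p) incremental clustering and the least-learning-machine derivation are not reproduced here, and every function name, the Gaussian width sigma and the regularisation term lam are illustrative assumptions.

# Minimal sketch of a zero-order TSK fuzzy classifier (illustrative, not the
# paper's algorithm): Gaussian antecedents centred on fuzzy c-means centres
# stand in for IFCM(c + p); ridge least squares on the normalised firing
# strengths stands in for the least-learning-machine consequent solution.
import numpy as np

def fcm_centers(X, c, m=2.0, iters=100, seed=0):
    """Plain batch fuzzy c-means; returns c cluster centres."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        Um = U ** m
        V = (Um.T @ X) / Um.sum(axis=0)[:, None]           # centre update
        d = np.linalg.norm(X[:, None, :] - V[None], axis=2) + 1e-12
        U = 1.0 / (d ** (2.0 / (m - 1.0)))
        U /= U.sum(axis=1, keepdims=True)                   # membership update
    return V

def firing_strengths(X, V, sigma):
    """Normalised firing strengths of Gaussian rule antecedents."""
    d2 = ((X[:, None, :] - V[None]) ** 2).sum(axis=2)
    F = np.exp(-d2 / (2.0 * sigma ** 2))
    return F / (F.sum(axis=1, keepdims=True) + 1e-12)

def fit_consequents(F, y, n_classes, lam=1e-3):
    """Zero-order consequents via ridge least squares on one-hot targets."""
    Y = np.eye(n_classes)[y]
    A = F.T @ F + lam * np.eye(F.shape[1])
    return np.linalg.solve(A, F.T @ Y)                      # shape: (rules, classes)

def predict(X, V, sigma, W):
    return np.argmax(firing_strengths(X, V, sigma) @ W, axis=1)

# Toy usage on two synthetic Gaussian blobs.
X = np.vstack([np.random.randn(50, 2), np.random.randn(50, 2) + 3.0])
y = np.array([0] * 50 + [1] * 50)
V = fcm_centers(X, c=4)
W = fit_consequents(firing_strengths(X, V, sigma=1.0), y, n_classes=2)
print("training accuracy:", (predict(X, V, 1.0, W) == y).mean())

Normalising the firing strengths before the linear solve turns consequent learning into an ordinary least-squares problem, which is the kind of structure the matrix-computation trick mentioned in the abstract exploits; the incremental clustering step is what would let the antecedents be updated chunk by chunk on large-scale data.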
