Journal Articles
1 article found
Length-Changeable Incremental Extreme Learning Machine (Cited by: 2)
Authors: You-Xi Wu, Dong Liu, He Jiang. Journal of Computer Science & Technology (SCIE, EI, CSCD), 2017, Issue 3, pp. 630-643, 14 pages.
Abstract: Extreme learning machine (ELM) is a learning algorithm for generalized single-hidden-layer feed-forward networks (SLFNs). To obtain a suitable network architecture, the incremental extreme learning machine (I-ELM) constructs an SLFN by adding hidden nodes one by one. Although various I-ELM-class algorithms have been proposed to improve the convergence rate or to minimize the training error, they either leave the construction scheme of I-ELM unchanged or run an over-fitting risk. Making the testing error converge quickly and stably therefore remains an important issue. In this paper, we propose a new incremental ELM, referred to as the length-changeable incremental extreme learning machine (LCI-ELM). It allows more than one hidden node to be added to the network at a time, and the existing network is treated as a whole when the output weights are tuned. The output weights of newly added hidden nodes are determined with a partial error-minimizing method. We prove that an SLFN constructed using LCI-ELM has universal approximation capability on a compact input set as well as on a finite training set. Experimental results demonstrate that LCI-ELM achieves a higher convergence rate and a lower over-fitting risk than several competitive I-ELM-class algorithms.
Keywords: single-hidden-layer feed-forward network (SLFN), incremental extreme learning machine (I-ELM), random hidden node, convergence rate, universal approximation
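As a rough illustration of the incremental construction described in the abstract, the sketch below grows an SLFN by adding random hidden nodes in groups and fitting the output weights of each new group to the current training residual by least squares (a partial error-minimizing step). This is a minimal sketch, assuming a sigmoid activation and NumPy; the names `lci_elm_sketch` and `group_size` are illustrative assumptions, and the residual-fitting step does not reproduce the paper's exact output-weight tuning, which treats the existing network as a whole.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lci_elm_sketch(X, T, max_nodes=100, group_size=5, seed=0):
    """Incremental SLFN construction in the spirit of LCI-ELM (illustrative sketch).

    Hidden nodes are added group_size at a time; the output weights of each
    new group are fitted to the current training residual by least squares.
    """
    rng = np.random.default_rng(seed)
    n_samples, n_features = X.shape
    T = np.asarray(T, dtype=float).reshape(n_samples, -1)
    groups = []                          # [(W, b, beta), ...] for each added group
    residual = T.copy()                  # E_k = T - f_k(X), starting from E_0 = T
    n_nodes = 0
    while n_nodes < max_nodes:
        k = min(group_size, max_nodes - n_nodes)
        W = rng.standard_normal((n_features, k))   # random input weights of the new group
        b = rng.standard_normal(k)                 # random biases of the new group
        G = sigmoid(X @ W + b)                     # hidden-layer outputs of the new group
        beta, *_ = np.linalg.lstsq(G, residual, rcond=None)  # fit the residual
        residual = residual - G @ beta             # update the training residual
        groups.append((W, b, beta))
        n_nodes += k
    return groups, residual

def predict(groups, X):
    """Sum the contributions of all hidden-node groups."""
    Y = 0.0
    for W, b, beta in groups:
        Y = Y + sigmoid(X @ W + b) @ beta
    return Y
```

For example, `groups, residual = lci_elm_sketch(X, y, max_nodes=60, group_size=5)` followed by `predict(groups, X_new)` builds and evaluates the network; setting `group_size=1` reduces the sketch to I-ELM-style one-node-at-a-time growth.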