Objective: To investigate the regulatory effect of the interferon (IFN)-stimulated gene Schlafen (SLFN) on hepatitis B virus (HBV) replication. Methods: HepG2 hepatoma cells were first treated with different doses of IFN-α for 48 h, or with the same dose for different durations, and the transcript levels of SLFN family genes were measured by quantitative real-time PCR (q-PCR). Differences in SLFN family gene expression between HBV-infected hepatocellular carcinoma tissues and adjacent non-tumor tissues were then analyzed in the TCGA database using the t-test. Changes in viral replication were further examined by siRNA knockdown and overexpression, combined with q-PCR, Western blot, and Southern blot. Results: Within the SLFN gene family, only SLFN5 was induced by IFN-α, and the induction was dose-dependent. TCGA analysis showed that, among the SLFN family, SLFN5 and SLFN11 were differentially expressed between HBV-infected hepatocellular carcinoma tissues and adjacent tissues. In HepG2 cells, siRNA knockdown of SLFN5 did not noticeably change HBV replication, whereas in Huh7 cells, knockdown of SLFN11 increased HBV replication 1.79-fold without an obvious change in the viral HBc protein; overexpression of SLFN11 in HepG2 cells reduced HBV replicative intermediates by 34.67%. Conclusion: In HepG2 hepatoma cells, SLFN5 is induced by IFN-α but does not regulate HBV replication, whereas SLFN11 expression is associated with HBV replication and can inhibit HBV core-particle DNA replication.
Funding: This work was partially supported by the National Natural Science Foundation of China under Grant Nos. 61673159 and 61370144, and the Natural Science Foundation of Hebei Province of China under Grant No. F2016202145.
Abstract: Extreme learning machine (ELM) is a learning algorithm for generalized single-hidden-layer feed-forward networks (SLFNs). To obtain a suitable network architecture, Incremental Extreme Learning Machine (I-ELM) is a variant of ELM that constructs an SLFN by adding hidden nodes one by one. Although various I-ELM-class algorithms have been proposed to improve the convergence rate or to minimize the training error, they either leave I-ELM's one-node-at-a-time construction unchanged or run a risk of over-fitting. Making the testing error converge quickly and stably therefore becomes an important issue. In this paper, we propose a new incremental ELM, referred to as Length-Changeable Incremental Extreme Learning Machine (LCI-ELM). It allows more than one hidden node to be added to the network at a time, and the existing network is regarded as a whole when the output weights are tuned. The output weights of the newly added hidden nodes are determined using a partial error-minimizing method. We prove that an SLFN constructed using LCI-ELM has universal approximation capability on any compact input set as well as on a finite training set. Experimental results demonstrate that LCI-ELM achieves a higher convergence rate and a lower over-fitting risk than several competitive I-ELM-class algorithms.
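To make the construction concrete, the following is a minimal, hypothetical NumPy sketch of an LCI-ELM-style incremental build: in each step a small group of randomly generated hidden nodes is appended, and the group's output weights are fitted to the current residual error by least squares, used here as a simplified stand-in for the paper's partial error-minimizing rule. Unlike the full LCI-ELM, this sketch does not re-tune the output weights of previously added nodes; the group size, stopping criterion, and names such as lci_elm_sketch and predict are illustrative assumptions rather than the authors' implementation.

```python
# Minimal, illustrative sketch of an LCI-ELM-style construction (not the
# authors' reference implementation): hidden nodes are added in groups, and
# each new group's output weights are fitted to the current residual error.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lci_elm_sketch(X, T, group_size=5, max_nodes=100, tol=1e-3, seed=None):
    """Grow an SLFN by adding `group_size` random hidden nodes per step.

    X : (N, d) training inputs; T : (N, m) training targets.
    The new group's output weights are chosen by least squares on the
    current residual (a simplified partial error-minimizing step);
    previously added nodes are left untouched in this sketch.
    """
    rng = np.random.default_rng(seed)
    N, d = X.shape
    W, b, beta = [], [], []          # hidden weights, biases, output weights
    E = T.copy()                     # current residual error
    n_nodes = 0
    while n_nodes < max_nodes and np.linalg.norm(E) > tol:
        # ELM-style random hidden nodes for the new group.
        Wg = rng.uniform(-1.0, 1.0, size=(d, group_size))
        bg = rng.uniform(-1.0, 1.0, size=group_size)
        Hg = sigmoid(X @ Wg + bg)    # (N, group_size) hidden-layer outputs
        # Output weights of the new group: least-squares fit to the residual.
        beta_g, *_ = np.linalg.lstsq(Hg, E, rcond=None)
        E = E - Hg @ beta_g          # update the residual error
        W.append(Wg); b.append(bg); beta.append(beta_g)
        n_nodes += group_size
    return np.hstack(W), np.hstack(b), np.vstack(beta)

def predict(X, W, b, beta):
    """Evaluate the constructed SLFN on new inputs."""
    return sigmoid(X @ W + b) @ beta
```

Setting group_size to 1 reduces this sketch to the one-node-at-a-time growth of I-ELM; allowing larger or varying group sizes per step mirrors the "length-changeable" idea, though the paper's full method additionally treats the existing network as a whole when tuning output weights.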