Abstract
In this paper, information entropy is introduced as a penalty term and added to the backpropagation cost function. After training, more organized hidden-unit activation patterns are obtained, in which only a few hidden units respond to each input sample. Applying the pruning algorithm proposed in this paper then reduces the scale of the network and improves its generalization performance and computational efficiency. The ECG classification experiments reported in the paper confirm these results.
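The abstract does not give the exact form of the entropy penalty; a minimal sketch of how such a term is typically combined with the backpropagation cost, assuming normalized hidden-unit activations $p_{ij}$ and a weighting coefficient $\lambda$ (notational assumptions, not taken from the paper), is:

$$
E = E_{\mathrm{BP}} + \lambda H, \qquad
H = -\sum_{i}\sum_{j} p_{ij}\log p_{ij}, \qquad
p_{ij} = \frac{o_{ij}}{\sum_{k} o_{ik}},
$$

where $E_{\mathrm{BP}}$ is the usual sum-of-squared-error cost and $o_{ij}$ is the activation of hidden unit $j$ for input sample $i$. Minimizing $H$ concentrates each sample's response on a few hidden units, which is what makes the subsequent pruning of weakly responding units possible.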
Source
Acta Electronica Sinica (《电子学报》), 1997, No. 10, pp. 44-47 (4 pages)
Indexed in: EI, CAS, CSCD, Peking University Core Journals (北大核心)
Keywords
Information entropy
Neural network
Pruning
Generalization performance
ECG classification