Abstract
In natural language processing, neural network structures are designed manually, which often leaves a large amount of redundancy in complex architectures. To remove this redundancy, model compression methods such as pruning are commonly used, but because they cut the model directly according to criteria unrelated to the training process, they often cause performance loss. This paper therefore explores a method for automatically learning the neuron connections of a neural network: connections are dynamically grown and deleted during training, which allows the connection structure to be adjusted on the fly and yields a more compact and efficient network. Applying automatic growth and elimination to a neural language model reduces the network size by 49% while preserving the original performance.
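The record describes the method only at a high level, so the sketch below is a hypothetical illustration rather than the authors' implementation: it assumes magnitude-based deletion and random regrowth of connections (in the spirit of sparse evolutionary training), and the function name update_connections, the binary-mask convention, and the drop_fraction parameter are all illustrative.

    import torch

    def update_connections(weight, mask, drop_fraction=0.1):
        # weight: dense weight matrix of one layer; mask: same-shape binary
        # tensor where 1 marks an active connection (the forward pass is
        # assumed to compute with weight * mask).
        active = mask.bool()
        n_change = max(1, int(drop_fraction * int(active.sum())))

        # Deletion: drop the n_change active connections with the smallest
        # absolute weight (inactive entries are filled with +inf so they
        # are never selected).
        magnitude = weight.abs().masked_fill(~active, float("inf"))
        drop_idx = torch.topk(magnitude.flatten(), n_change, largest=False).indices
        new_mask = mask.clone().flatten()
        new_mask[drop_idx] = 0

        # Growth: re-activate the same number of previously inactive
        # connections, chosen at random; the caller would normally
        # re-initialize the grown weights (e.g., to zero).
        inactive_idx = (mask.flatten() == 0).nonzero(as_tuple=True)[0]
        grow_idx = inactive_idx[torch.randperm(len(inactive_idx))[:n_change]]
        new_mask[grow_idx] = 1
        return new_mask.view_as(mask)

Calling such a routine every few hundred training steps on each weight matrix keeps the number of active connections constant while letting the topology change during training, which is one plausible reading of the dynamic growth-and-deletion idea described in the abstract.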
Authors
JIANG Yufan, LI Bei, LIN Ye, LI Yinqiao, XIAO Tong, ZHU Jingbo (Natural Language Processing Laboratory, School of Computer Science and Engineering, Northeastern University, Shenyang 110819, China)
Source
《厦门大学学报(自然科学版)》 (Journal of Xiamen University: Natural Science)
Indexed in: CAS, CSCD, Peking University Core Journals (北大核心)
2019, No. 2, pp. 225-230 (6 pages)
Funding
National Natural Science Foundation of China (61432013, 61732005, 61876035)
Fundamental Research Funds for the Central Universities (N161604007)
Program for Liaoning Innovative Talents in Universities (LR20170606)
Keywords
language model
neuron connection
pruning