Abstract
The recurrent neural network language model (RNNLM) solves the data-sparseness and curse-of-dimensionality problems of traditional N-gram models. However, the original RNNLM still lacks the ability to model long-distance dependencies, owing to the vanishing gradient problem. In this paper, an improved RNNLM based on contextual word vectors is proposed. To improve the model structure, a feature layer is added to the input layer. During training, contextual word vectors are fed into the model through this feature layer, reinforcing the network's ability to learn long-distance constraints. Experimental results show that the proposed method effectively improves the performance of the RNNLM.
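The abstract describes adding a feature layer, fed with contextual word vectors, to the input layer of an RNNLM. The sketch below illustrates one plausible realization of such an architecture, loosely following the common context-dependent RNNLM layout in which the feature vector connects to both the hidden and the output layer. All class and parameter names, the dimensions, and the exact update rule are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(z - z.max())
    return e / e.sum()

class FeatureAugmentedRNNLM:
    """Minimal sketch (hypothetical) of an Elman-style RNN language model
    whose input layer is augmented with a feature vector f(t), e.g. a
    contextual word vector, as described in the abstract."""

    def __init__(self, vocab_size, hidden_size, feature_size, seed=0):
        rng = np.random.default_rng(seed)
        s = 0.1  # small random initialization
        self.U = rng.normal(0, s, (hidden_size, vocab_size))   # current word -> hidden
        self.W = rng.normal(0, s, (hidden_size, hidden_size))  # previous hidden -> hidden
        self.F = rng.normal(0, s, (hidden_size, feature_size)) # feature layer -> hidden
        self.V = rng.normal(0, s, (vocab_size, hidden_size))   # hidden -> output
        self.G = rng.normal(0, s, (vocab_size, feature_size))  # feature layer -> output

    def step(self, word_id, feature, h_prev):
        """One time step: consume word w(t) and feature f(t), return the
        next-word distribution y(t) and the new hidden state s(t)."""
        # s(t) = sigmoid(U·w(t) + W·s(t-1) + F·f(t)); w(t) is one-hot,
        # so U·w(t) is just column word_id of U.
        pre = self.U[:, word_id] + self.W @ h_prev + self.F @ feature
        h = 1.0 / (1.0 + np.exp(-pre))
        # y(t) = softmax(V·s(t) + G·f(t)): probability over the next word.
        y = softmax(self.V @ h + self.G @ feature)
        return y, h
```

A usage example under the same assumptions: initialize the hidden state to zeros, then call `step` once per word, passing the contextual word vector for that position as `feature`.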
Source
Pattern Recognition and Artificial Intelligence (《模式识别与人工智能》)
Indexed in: EI, CSCD, Peking University Core Journals (北大核心)
2015, No. 4, pp. 299-305 (7 pages)
Funding
Supported by the National High-Tech R&D Program of China (863 Program, No. 2012AA011603) and the National Natural Science Foundation of China (No. 61175017)
Keywords
Speech Recognition
Language Model
Recurrent Neural Network
Word Vector