
Query Intention Recognition Model Based on Character-Level Recurrent Network (基于字符级循环网络的查询意图识别模型)

Cited by: 4
Abstract: Query intention recognition methods based mainly on feature templates require tedious manual feature engineering and have difficulty capturing the semantic information of text. To address this problem, this paper proposes a new query intention recognition model based on a character-level recurrent network. To extract deep semantic features of sentences effectively and reduce the limitation of long-distance information dependence, a Long Short-Term Memory (LSTM) network is used as the linear transformation layer of the neural network, and an additional backward LSTM layer is added to extract future-context features of each character. Raw Chinese characters are used directly as model input, avoiding the error propagation caused by inaccurate word segmentation, and distributed vector representations of characters improve the capture of sentence-level semantic features. Experimental results show that the method achieves an overall accuracy of 90.7%, an improvement over the feature-template method, and improves the classification performance of user query intention.
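The pipeline described in the abstract (character embeddings fed to forward and backward recurrent layers, whose final states are concatenated for classification) can be illustrated with a minimal sketch. This is not the paper's implementation: for brevity it uses a plain tanh recurrent cell in place of a full LSTM, a toy character vocabulary, random untrained weights, and hypothetical sizes, but the bidirectional character-level structure is the same.

```python
import math
import random

random.seed(0)

EMB, HID, CLASSES = 8, 16, 3          # hypothetical dimensions

def rand_mat(rows, cols):
    return [[random.uniform(-0.1, 0.1) for _ in range(cols)] for _ in range(rows)]

def matvec(m, v):
    return [sum(w * x for w, x in zip(row, v)) for row in m]

# Toy character vocabulary; the real model would cover the full character set.
vocab = {ch: i for i, ch in enumerate("天气怎么样北京播放音乐")}
E = rand_mat(len(vocab), EMB)          # distributed character embeddings
W_in = rand_mat(HID, EMB)              # input -> hidden weights
W_hh = rand_mat(HID, HID)              # hidden -> hidden (recurrent) weights
W_out = rand_mat(CLASSES, 2 * HID)     # concatenated bi-states -> intent classes

def run_rnn(embs):
    """One directional pass with a simple tanh cell (stand-in for LSTM)."""
    h = [0.0] * HID
    for x in embs:
        h = [math.tanh(a + b) for a, b in zip(matvec(W_in, x), matvec(W_hh, h))]
    return h

def classify(query):
    # Characters go in directly -- no word segmentation step.
    embs = [E[vocab[ch]] for ch in query if ch in vocab]
    h_fwd = run_rnn(embs)              # past-context features
    h_bwd = run_rnn(embs[::-1])        # future-context features (backward layer)
    logits = matvec(W_out, h_fwd + h_bwd)
    mx = max(logits)
    exps = [math.exp(z - mx) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]       # softmax over intent classes

probs = classify("北京天气怎么样")
```

With trained weights, `probs` would give a distribution over intent classes for the query; here the weights are random, so only the shape of the computation is meaningful.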
Source: Computer Engineering (《计算机工程》), CAS / CSCD / Peking University Core Journal, 2017, No. 3, pp. 181-186.
Funding: National Natural Science Foundation of China (61202100).
Keywords: query intention; character level; Recurrent Neural Network (RNN); memory network; word vector
