Journal articles: 2 results found
1. A Hybrid Method of Extractive Text Summarization Based on Deep Learning and Graph Ranking Algorithms (Cited by: 1)
Authors: SHI Hui, WANG Tiexin. Transactions of Nanjing University of Aeronautics and Astronautics (EI, CSCD), 2022, No. S01, pp. 158-165 (8 pages)
In the era of Big Data, we face an inevitable and challenging problem of information overload. To alleviate this problem, it is important to use effective automatic text summarization techniques to obtain key information quickly and efficiently from huge amounts of text. In this paper, we propose a hybrid method of extractive text summarization based on deep learning and graph ranking algorithms (ETSDG). In this method, a pre-trained deep learning model is designed to yield useful sentence embeddings. Given the association between sentences in raw documents, a traditional LexRank algorithm with fine-tuning is adopted in ETSDG. To improve the performance of the extractive text summarization method, we further integrate the traditional LexRank algorithm with deep learning. Testing results on the DUC2004 data set show that ETSDG achieves better ROUGE scores than certain benchmark methods.
Keywords: extractive text summarization; deep learning; sentence embeddings; LexRank
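The abstract above describes a pipeline that combines pre-trained sentence embeddings with a LexRank-style graph ranking step. The following is a minimal sketch of that general idea, not the authors' exact ETSDG implementation: the encoder name ("all-MiniLM-L6-v2"), the similarity threshold, and the damping factor are all assumptions made for illustration.

```python
# Minimal sketch of an embedding-based LexRank summarizer (illustrative only;
# not the ETSDG method itself). Requires: numpy, sentence-transformers.
import numpy as np
from sentence_transformers import SentenceTransformer


def lexrank_summary(sentences, top_k=3, threshold=0.1, damping=0.85, iters=100):
    # Assumed pre-trained encoder; ETSDG's actual model is not specified here.
    model = SentenceTransformer("all-MiniLM-L6-v2")
    emb = model.encode(sentences)                               # (n, d) embeddings
    emb = emb / np.linalg.norm(emb, axis=1, keepdims=True)
    sim = emb @ emb.T                                           # cosine similarity

    # Build the graph: keep edges above the threshold, drop self-loops,
    # then row-normalize into a transition matrix.
    adj = np.where(sim >= threshold, sim, 0.0)
    np.fill_diagonal(adj, 0.0)
    row_sums = adj.sum(axis=1, keepdims=True)
    row_sums[row_sums == 0] = 1.0
    trans = adj / row_sums

    # PageRank-style power iteration yields sentence centrality scores.
    n = len(sentences)
    scores = np.full(n, 1.0 / n)
    for _ in range(iters):
        scores = (1 - damping) / n + damping * (trans.T @ scores)

    # Return the top-k central sentences in their original document order.
    top = sorted(np.argsort(scores)[::-1][:top_k])
    return [sentences[i] for i in top]
```

In this sketch the deep learning component only supplies the sentence representations; the selection itself is the classical LexRank centrality ranking, which matches the hybrid division of labor the abstract describes.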
2. Memory-Burden Minimization Mechanisms in Language Use (语言使用中的记忆负担最小化机制) (Cited by: 4)
Author: FENG Zhiwei (冯志伟). 《日语学习与研究》 (Journal of Japanese Language Study and Research, CSSCI), 2022, No. 1, pp. 1-19 (19 pages)
Memory-burden minimization is a universal principle in language use. Linguists long ago observed that the memory load involved in language activity is limited, and on that basis proposed the theory of the economy principle of language. Drawing on relevant findings and literature in language research, this paper expounds this memory-burden minimization mechanism in the phonetic, lexical, and grammatical use of human language, as well as in neural network applications in natural language processing.
Keywords: memory-burden minimization; source-filter model; Zipf's law; law of diminishing growth of new words; sentence depth; long-tail distribution
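Among the regularities this article cites as evidence of memory-burden minimization is Zipf's law: word frequency falls off roughly in inverse proportion to frequency rank, producing the long-tail distribution named in the keywords. The snippet below is a toy illustration of that rank-frequency relation on an invented corpus; the corpus and the expectation that freq * rank stays roughly constant are illustrative assumptions, not data from the article.

```python
# Toy illustration of Zipf's rank-frequency relation (invented corpus).
from collections import Counter

corpus = "the cat sat on the mat and the dog sat on the log the end".split()
freqs = Counter(corpus).most_common()  # sorted by frequency, descending

for rank, (word, freq) in enumerate(freqs, start=1):
    # Under an ideal Zipf distribution, freq * rank is roughly constant,
    # so a few high-rank words dominate and the rest form a long tail.
    print(f"rank {rank}: {word!r} freq={freq} freq*rank={freq * rank}")
```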