Novel multi‐domain attention for abstractive summarisation
Authors: Chunxia Qu, Ling Lu, Aijuan Wang, Wu Yang, Yinong Chen. CAAI Transactions on Intelligence Technology (SCIE, EI), 2023, No. 3, pp. 796-806 (11 pages).
Existing abstractive text summarisation models consider only the word-sequence correlations between the source document and the reference summary, and because of this narrow perspective the generated summaries often fail to cover the subject of the source document. To address these shortcomings, a multi-domain attention pointer (MDA-Pointer) abstractive summarisation model is proposed in this work. First, the model uses bidirectional long short-term memory (Bi-LSTM) to encode the word and sentence sequences of the source document separately, obtaining semantic representations at the word and sentence levels. A multi-domain attention mechanism is then established between these representations and the summary words, so the model generates each summary word conditioned on both words and sentences. Next, words are taken from the vocabulary or copied from the source word sequence through a pointer network to form the summary, and a coverage mechanism is introduced at both the word and sentence levels to reduce redundancy in the summary content. Finally, experiments are conducted on the CNN/Daily Mail dataset: the ROUGE scores of the model improve both without and with the coverage mechanism, verifying the validity of the proposed model.
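The architecture outlined in the abstract (two-level Bi-LSTM encoding, attention over both levels, and a pointer network with coverage) can be made concrete with a minimal PyTorch sketch. Everything below is an illustrative assumption rather than the paper's exact formulation: the layer sizes, the mean-pooling of word states into sentence inputs, the bilinear attention scoring, and the concatenation of the two context vectors are choices made here for brevity, and the names (MDAPointerSketch, sent_bounds, etc.) are hypothetical.

```python
# Minimal sketch of the multi-domain attention pointer idea: a word-level
# and a sentence-level BiLSTM encode the source, one decoder step attends
# over both "domains", and a pointer-generator switch mixes vocabulary
# generation with copying from the source. Assumed details, not the
# paper's exact equations.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MDAPointerSketch(nn.Module):
    def __init__(self, vocab_size=10000, emb=128, hid=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb)
        # Word-level and sentence-level BiLSTM encoders.
        self.word_enc = nn.LSTM(emb, hid, bidirectional=True, batch_first=True)
        self.sent_enc = nn.LSTM(2 * hid, hid, bidirectional=True, batch_first=True)
        self.dec_cell = nn.LSTMCell(emb, 2 * hid)
        # Bilinear attention scores, one projection per domain.
        self.attn_w = nn.Linear(2 * hid, 2 * hid, bias=False)
        self.attn_s = nn.Linear(2 * hid, 2 * hid, bias=False)
        # Vocabulary distribution from [decoder state; word ctx; sentence ctx].
        self.out = nn.Linear(6 * hid, vocab_size)
        # Pointer switch p_gen from the same features plus the input embedding.
        self.p_gen = nn.Linear(6 * hid + emb, 1)

    def forward(self, src, sent_bounds, y_prev, state):
        # src: (B, T) word ids; sent_bounds: list of (start, end) word spans.
        h_w, _ = self.word_enc(self.embed(src))              # (B, T, 2H)
        # Mean-pool each sentence span into one sentence input (an assumption).
        sents = torch.stack([h_w[:, s:e].mean(1) for s, e in sent_bounds], dim=1)
        h_s, _ = self.sent_enc(sents)                        # (B, S, 2H)

        h, c = self.dec_cell(self.embed(y_prev), state)      # one decoder step
        # Attention distribution over each domain.
        a_w = F.softmax(torch.bmm(self.attn_w(h_w), h.unsqueeze(2)).squeeze(2), dim=1)
        a_s = F.softmax(torch.bmm(self.attn_s(h_s), h.unsqueeze(2)).squeeze(2), dim=1)
        ctx_w = torch.bmm(a_w.unsqueeze(1), h_w).squeeze(1)  # word context
        ctx_s = torch.bmm(a_s.unsqueeze(1), h_s).squeeze(1)  # sentence context

        feats = torch.cat([h, ctx_w, ctx_s], dim=1)
        p_vocab = F.softmax(self.out(feats), dim=1)
        # Pointer-generator mix: copy source words with probability (1 - p_gen).
        p_gen = torch.sigmoid(self.p_gen(torch.cat([feats, self.embed(y_prev)], 1)))
        p_final = (p_gen * p_vocab).scatter_add(1, src, (1 - p_gen) * a_w)
        # Both attentions are returned so coverage penalties can use them.
        return p_final, (h, c), a_w, a_s


# Toy usage: one decoder step over a two-sentence source.
model = MDAPointerSketch()
src = torch.randint(0, 10000, (2, 12))               # batch of 2, 12 words
bounds = [(0, 6), (6, 12)]                           # two sentence spans
y_prev = torch.randint(0, 10000, (2,))
state = (torch.zeros(2, 512), torch.zeros(2, 512))
p, state, a_w, a_s = model(src, bounds, y_prev, state)
print(p.shape)                                       # torch.Size([2, 10000])
```

The two attention distributions are returned because the abstract applies the coverage mechanism at both the word and sentence levels; accumulating a_w and a_s across decoding steps and penalising repeated attention would reproduce that idea.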
Keywords: abstractive summarisation; attention mechanism; Bi-LSTM; coverage mechanism; pointer network