
Novel multi‐domain attention for abstractive summarisation

Abstract: Existing abstractive text summarisation models consider only word-sequence correlations between the source document and the reference summary, and because of this narrow perspective the generated summaries often fail to cover the main topics of the source document. To address these shortcomings, a multi-domain attention pointer (MDA-Pointer) abstractive summarisation model is proposed in this work. First, the model uses bidirectional long short-term memory to separately encode the word and sentence sequences of the source document, obtaining semantic representations at both the word and sentence levels. A multi-domain attention mechanism is then established between these semantic representations and the summary words, so that the model generates each summary word conditioned on both word-level and sentence-level information. Next, a pointer network selects each output word either from the vocabulary or from the original word sequence, and a coverage mechanism is introduced at both the word and sentence levels to reduce redundancy in the summary content. Finally, experiments on the CNN/Daily Mail dataset show improved ROUGE scores for the model both without and with the coverage mechanism, verifying the effectiveness of the proposed approach.
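To make the mechanism above concrete, the following is a minimal PyTorch-style sketch of one decoder step that attends over both word-level and sentence-level encoder states and mixes a generation distribution with a copy distribution through a pointer switch. All class names, dimensions, and the exact way the two contexts are combined are illustrative assumptions rather than the paper's reference implementation; out-of-vocabulary handling and the coverage terms are omitted for brevity.

```python
# Hypothetical sketch of one MDA-Pointer-style decoder step; the real model
# in the paper may combine the word- and sentence-level contexts differently.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DomainAttention(nn.Module):
    """Additive attention over one 'domain' of encoder states
    (word-level or sentence-level, e.g. produced by a BiLSTM)."""
    def __init__(self, enc_dim, dec_dim, attn_dim):
        super().__init__()
        self.W_enc = nn.Linear(enc_dim, attn_dim, bias=False)
        self.W_dec = nn.Linear(dec_dim, attn_dim, bias=False)
        self.v = nn.Linear(attn_dim, 1, bias=False)

    def forward(self, enc_states, dec_state):
        # enc_states: (batch, seq_len, enc_dim); dec_state: (batch, dec_dim)
        scores = self.v(torch.tanh(
            self.W_enc(enc_states) + self.W_dec(dec_state).unsqueeze(1)
        )).squeeze(-1)                        # (batch, seq_len)
        attn = F.softmax(scores, dim=-1)      # attention distribution
        context = torch.bmm(attn.unsqueeze(1), enc_states).squeeze(1)
        return context, attn

class MDAPointerStep(nn.Module):
    """One decoder step: attend over both domains, then mix a vocabulary
    distribution with a copy distribution via a soft pointer switch."""
    def __init__(self, enc_dim, dec_dim, vocab_size):
        super().__init__()
        self.word_attn = DomainAttention(enc_dim, dec_dim, dec_dim)
        self.sent_attn = DomainAttention(enc_dim, dec_dim, dec_dim)
        self.out = nn.Linear(dec_dim + 2 * enc_dim, vocab_size)
        self.p_gen = nn.Linear(dec_dim + 2 * enc_dim, 1)

    def forward(self, word_states, sent_states, dec_state, src_ids):
        word_ctx, word_attn = self.word_attn(word_states, dec_state)
        sent_ctx, _ = self.sent_attn(sent_states, dec_state)
        features = torch.cat([dec_state, word_ctx, sent_ctx], dim=-1)
        p_vocab = F.softmax(self.out(features), dim=-1)   # generate
        p = torch.sigmoid(self.p_gen(features))           # pointer switch
        # Copy part: scatter word-level attention onto source token ids
        # (assumes source ids lie inside the fixed vocabulary).
        copy_dist = torch.zeros_like(p_vocab).scatter_add(1, src_ids, word_attn)
        return p * p_vocab + (1 - p) * copy_dist

if __name__ == "__main__":
    B, Lw, Ls, E, D, V = 2, 12, 4, 16, 16, 50
    step = MDAPointerStep(E, D, V)
    dist = step(torch.randn(B, Lw, E), torch.randn(B, Ls, E),
                torch.randn(B, D), torch.randint(0, V, (B, Lw)))
    print(dist.shape, dist.sum(dim=-1))  # (2, 50); each row sums to ~1.0
```

Under the two-level coverage mechanism the abstract describes, a running sum of past attention distributions would additionally feed into each domain's score function, with a coverage loss penalising repeated attention to already-covered words and sentences; that is what reduces redundancy in the output.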
Source: CAAI Transactions on Intelligence Technology (SCIE, EI), 2023, No. 3, pp. 796-806 (11 pages).
Funding: Supported by the National Social Science Foundation of China (2017CG29), the Science and Technology Research Project of Chongqing Municipal Education Commission (2019CJ50), and the Natural Science Foundation of Chongqing (2017CC29).
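As a practical aside on the evaluation reported in the abstract: ROUGE-1/2/L scores of the kind used on CNN/Daily Mail can be computed with the open-source rouge-score package. The snippet below is a generic usage example, not the paper's evaluation script, and the summary strings are invented.

```python
# Generic ROUGE computation with the `rouge-score` package
# (pip install rouge-score); not the authors' evaluation code.
from rouge_score import rouge_scorer

scorer = rouge_scorer.RougeScorer(["rouge1", "rouge2", "rougeL"],
                                  use_stemmer=True)
reference = "the model covers the main topics of the document"
generated = "the model covers the document's main topics"
scores = scorer.score(reference, generated)
for name, s in scores.items():
    print(f"{name}: P={s.precision:.3f} R={s.recall:.3f} F1={s.fmeasure:.3f}")
```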