Abstract
Text summarization refers to the process of condensing the content of a text, extracting its main points, and forming a summary. Existing text summarization models usually treat content selection and summary generation separately; although this can effectively improve the performance of sentence compression and fusion, part of the text information is lost during extraction, which reduces accuracy. Based on a pre-trained model and a document-level sentence encoder with a Transformer structure, a segmented summarization model combining content extraction and summary generation is proposed. The BERT model performs self-supervised learning on a large corpus to obtain word representations rich in semantic information. On top of the Transformer structure, a fully connected classifier assigns each sentence one of three labels, extracting the set of source sentences corresponding to each summary sentence. A Pointer-Generator (PG) network then compresses each extracted sentence set into a single summary sentence, shortening the lengths of the output and input sequences. Experimental results show that, compared with generating the full summary directly, the proposed model improves the average F1 of ROUGE-1, ROUGE-2, and ROUGE-L on generated sentences by 1.69 percentage points, effectively improving the accuracy of the generated sentences.
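The abstract describes a two-stage pipeline: a BERT-based, document-level Transformer sentence encoder with a three-way fully connected classifier for content extraction, followed by a pointer-generator network that compresses each extracted sentence set into a single summary sentence. The snippet below is a minimal sketch of the extraction stage only, written in plain PyTorch; the hidden size, number of layers, attention heads, and label semantics are illustrative assumptions and are not taken from the paper.

```python
import torch
import torch.nn as nn

class SentenceExtractor(nn.Module):
    """Sketch of the extraction stage: a document-level Transformer encoder
    over sentence vectors, followed by a fully connected 3-way classifier.
    Hyperparameters are illustrative assumptions, not the paper's values."""

    def __init__(self, hidden_size: int = 768, num_layers: int = 2, num_labels: int = 3):
        super().__init__()
        # Document-level encoder: lets each sentence representation attend
        # to the other sentences of the same document.
        layer = nn.TransformerEncoderLayer(d_model=hidden_size, nhead=8, batch_first=True)
        self.doc_encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        # Fully connected classifier assigning one of three labels to each sentence.
        self.classifier = nn.Linear(hidden_size, num_labels)

    def forward(self, sent_vecs: torch.Tensor) -> torch.Tensor:
        # sent_vecs: (batch, num_sentences, hidden_size), e.g. per-sentence
        # [CLS] vectors produced by a pre-trained BERT encoder.
        doc_aware = self.doc_encoder(sent_vecs)
        return self.classifier(doc_aware)  # (batch, num_sentences, num_labels) logits


# Usage: one document with 10 sentences, each already encoded into a 768-d vector.
sent_vecs = torch.randn(1, 10, 768)
logits = SentenceExtractor()(sent_vecs)
labels = logits.argmax(dim=-1)  # per-sentence label in {0, 1, 2}
```

The per-sentence labels would then group source sentences into the sets that the pointer-generator network compresses into single summary sentences; that second stage is not shown here.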
Authors
WANG Gang (王刚), SUN Yuanyuan (孙媛媛), CHEN Yanguang (陈彦光), LIN Hongfei (林鸿飞)
School of Computer Science and Technology, Dalian University of Technology, Dalian, Liaoning 116024, China
Source
Computer Engineering (《计算机工程》), 2022, No. 6, pp. 288-294 (7 pages)
Indexed in: CAS, CSCD, Peking University Core Journals (北大核心)
Funding
National Key Research and Development Program of China (2018YFC0830604).